20. Thank You. Colin Symonds – IMCS Ltd. Phone 021 HELP-AM (021 4 3 5 7 - 2 6). Acknowledgements: SWDC; Dean Taylor (Opus); NAMS; Ross Waugh (Waugh Infrastructure Management Ltd)
Editor's notes
Helping asset managers is my passion – see the phone number! Acknowledgements: our client, for commissioning the work this is based on; Dean Taylor (Opus), for introducing us to the concept; Kathy Dever-Todd (NAMS), for the PQS concept and consultation suggestions; Ross Waugh (Waugh Infrastructure Management Ltd), for my involvement.
Other refs: NAMS PQS guideline; NZS/WASSA for toilets; DWSNZ for water supply.
Apologise for image quality
Background: low community interest in attending council-organised levels-of-service (LoS) meetings in 2005. Crucial strategy in 2008: "we will go to them". Obviously some groups were not representative of the community with respect to their area of interest; explain how minority-group inputs were normalised.
The bars indicate the percentage of total respondents who rated each service component high, medium, low, or not rated. Staff expertise and location were highly important, along with range of books, off-street parking and disabled access.
Benefit for councillors: they see what ratepayers were prepared to pay for. Yes: toilets at a heavily used soccer park. No: more playground equipment. Now that we have a good indication of service-level preferences, we can set about designing a star rating system that is meaningful to ratepayers.
You could apply weights to each of the service-level components. I decided to keep the scoring system simple: it works without weighting. Once used, weighting could not be changed again – continuity is needed across future AMPs. Consider who will be doing the site surveys, and keep the definitions of the service components simple as well.
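The unweighted scheme described above can be sketched roughly as follows. This is a minimal illustration only: the component names and the 1–5 score scale are assumptions for the example, not the actual survey definitions.

```python
# Minimal sketch of an unweighted component scoring scheme.
# Component names and the 1-5 scale are illustrative assumptions.

def facility_score(component_scores):
    """Average the per-component scores without weighting.

    component_scores: dict mapping component name -> score (e.g. 1-5).
    """
    if not component_scores:
        raise ValueError("at least one scored component is required")
    return sum(component_scores.values()) / len(component_scores)

# Hypothetical surveyed facility:
toilet_block = {
    "availability": 4,   # e.g. opening hours
    "reliability": 3,    # e.g. fault response
    "accessibility": 5,  # e.g. disabled access
}
print(facility_score(toilet_block))  # 4.0
```

Because the average is unweighted, every component definition carries equal influence, which is one reason to keep those definitions simple for the surveyors.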
Asset provision or operation. Components are often categorised as 'availability', 'reliability', 'accessibility', etc.
Initial calibration was based on experience and verbal descriptions: very low, low, average, etc.
Further calibration was based on the draft star rating: without a significant gap between the rating numbers, the result is too sensitive to individual service components.
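The sensitivity point can be illustrated with a simple band mapping from a quality score to a star rating. The thresholds below are hypothetical, chosen only to show that wide gaps between bands stop a small change in one component from flipping the overall rating.

```python
# Sketch: mapping an overall quality score (1-5 scale) to a star rating
# via score bands. The band thresholds here are illustrative assumptions.

def star_rating(score, bands=(1.5, 2.5, 3.5, 4.5)):
    """Return 1-5 stars; each threshold crossed earns an extra star."""
    stars = 1
    for threshold in bands:
        if score >= threshold:
            stars += 1
    return stars

# With well-separated bands, nudging one component of many barely
# moves the average score, so the star rating stays stable:
print(star_rating(3.9))  # 4
print(star_rating(3.7))  # 4 - still four stars despite the lower score
```

Narrower bands would make the displayed stars jump with every minor component change, which is exactly what the calibration sets out to avoid.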
The total score for a building can be thought of as a 'property quality score' (PQS; see the NAMS Property manual).
Final calibration ensures that expenditure makes an appropriate difference to the rating. If funds are limited, improving operational components might be achieved by re-balancing resources: e.g. open shorter hours but also on Saturday; e.g. cleaned the same number of times but rescheduled to suit peak times; e.g. a quicker fault-response target time, because performance shows it is already being achieved at the current cost.
Refer to the pyramid of service measures – the detailed PQS system supports the star rating.
Again: the star rating is an objective, asset-management-based replacement for a very subjective satisfaction measure. This does not imply that managers can ignore customer-satisfaction results; they reflect the mood of the community, but not necessarily the level of service provided.
Consultation is continuous (reporting is part of it). Both the concept and the future reporting are in terms the community can understand.