
Contract management – new approaches


Presentation Transcript


  1. Contract management – new approaches: Measuring Contractor Performance. Acknowledgements to Stephen DeBoise, Portsmouth City Council (www.portsmouth.gov.uk), and Nick Capon, Centre for Enterprise Research and Innovation (www.ceri.ac.uk)

  2. Scope • Apologies for absence of Richard Tonge • Recent research at Portsmouth City Council • Current methods they use • Strengths, weaknesses • Proposed improvements

  3. Definitions? • Before an order is placed: ‘Contractor Evaluation’, via a Pre-Qualification Questionnaire (PQQ) • After an order is placed: ‘Contractor Appraisal’, or ‘Vendor Rating’

  4. Benefits of measurement? • Joint understanding of customer needs • Motivating improvement • Cost of control minimised • Reduced waste and complaints • Benchmarking of contractor performance • Demonstrating control of the vendor base to other stakeholders. [Slide diagram: value flowing between your organisation, the customer and the contractor, with ‘align aims and priorities’ on each link]

  5. Challenges – excuses? • Workload – labour and data intensive • Subjectivity – is there evidence? do rewards/penalties cause bias? do the measures recorded depend on how they are explained? • Motivation – measures can create argument rather than benefit • Historical nature – slow • Investment needed to mechanise data collection and communication

  6. Theory – QTCC • If the contractor is not critical to continued success, measure four dimensions: Quality, Time, Cost, Communication

  7. Theory – the 7 ‘C’s • If the contractor is critical, measure: Competency, Cost, Control, Consistency, Cash resources, Commitment, Capacity • “Measure for Measure”, Supply Management, 1 February 2001, p. 39

  8. What to measure? • Or the same QTCC in greater detail: • Cost: prices, target costs, non-performance costs, savings achieved per year • Quality: end-customer complaints and feedback, SPC capability analysis and SPC trend reporting, SLA achievement, documented service design improvements • Time: on-time delivery, source reliability, staff turnover, compliance with procedures, financial stability, total workload for us as a % of the contractor’s total turnover (<30%) • Communication: relationships, understanding of needs and values, communication delays. Purchasing Principles and Management, Baily and Farmer, 2005
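  The workload figure above is the one hard threshold on the slide: the organisation’s spend with a contractor should stay below 30% of that contractor’s total turnover. A minimal sketch of that dependency check, with all names and figures illustrative rather than taken from Portsmouth’s system:

```python
def dependency_ratio(our_annual_spend: float, contractor_turnover: float) -> float:
    """Our workload as a percentage of the contractor's total annual turnover."""
    return 100 * our_annual_spend / contractor_turnover

# Illustrative example: £450k of work with a contractor turning over £2m a year.
ratio = dependency_ratio(450_000, 2_000_000)
print(f"{ratio:.1f}% of turnover - {'OK' if ratio < 30 else 'over the 30% guideline'}")
```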

  9. What do we measure now? • Outputs – achievement of specification • Process – ‘Would we work with you again?’ • Outcomes – survey of clients • Organisation concerns: consistency; measurement is difficult; ethics, innovation • Contractor concerns: trends and review; low survey response; must be able to compare

  10. What would we like to measure? • The contractor, of the organisation: partnership, shared values; outcomes for service users; communication, trust; understanding of needs; value for money; a sustainable company • The organisation, of the contractor: outputs, compliance with specification; sustainability, continuity; value for money; innovation; competition; partnership, shared values; skills, best practice

  11. What would help? What can the organisation do to help the contractor, and what can the contractor do to help the organisation? • Soft market testing to stimulate new suppliers • Willingness to change measures to suit • Time to plan • Information sharing, also between organisation departments • Reduce complexity • Transparency • Stable, agreed expectations • Feedback, clarity • Relevant at each stage: need identification, ITT, PQQ, tender, SLA, contract review

  12. Conclusion – Measure of Overall Satisfaction: the three component scores below are summed (9 Excellent, >6 Good, >0 Improvement required, <0 Failing to perform) • Customer Perception (Outcomes): satisfaction survey >90% / >75% / >60% / >0, plus low complaints – score 3 / 2 / 0 / -1 • Contract requirements (Outputs, service levels defined in specs): Exceeds / Meets / Mostly / None – score 3 / 2 / 0 / -1 • Process Expectations (subjective, ‘least good at, best at…’): Exceeds / Meets / Mostly / None – score 3 / 2 / 0 / -1 • Evidence required: contractor data, plus a periodic independent check
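  As a reading of that scoring scheme, here is a minimal sketch assuming the three component scores (each 3, 2, 0 or -1) are simply summed and the total mapped to the bands on the slide; the function names, the exact banding of the survey percentages and the example inputs are illustrative assumptions, not part of the original deck.

```python
# Sketch of the Overall Satisfaction measure, assuming the three component
# scores (each 3, 2, 0 or -1) are summed and mapped to the slide's bands.

VALID_SCORES = {3, 2, 0, -1}

def outcome_score(survey_pct: float, complaints_low: bool) -> int:
    """Customer Perception (Outcomes): satisfaction survey plus complaint level."""
    if survey_pct > 90 and complaints_low:
        return 3
    if survey_pct > 75:
        return 2
    if survey_pct > 60:
        return 0
    return -1

def judgement_score(rating: str) -> int:
    """Outputs / Process: Exceeds, Meets, Mostly, None -> 3, 2, 0, -1."""
    return {"Exceeds": 3, "Meets": 2, "Mostly": 0, "None": -1}[rating]

def overall_band(outcomes: int, outputs: int, process: int) -> str:
    """Map the summed score to the bands on the Conclusion slide."""
    assert {outcomes, outputs, process} <= VALID_SCORES
    total = outcomes + outputs + process
    if total == 9:
        return "Excellent"
    if total > 6:
        return "Good"
    if total > 0:
        return "Improvement required"
    return "Failing to perform"

# Example: strong survey, specification met, process only mostly met.
print(overall_band(outcome_score(92, complaints_low=True),
                   judgement_score("Meets"),
                   judgement_score("Mostly")))  # -> Improvement required (total 5)
```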

  13. How to measure? • Check goal alignment – what does the contractor think your priorities are? • Remove your role and allow the end customer to communicate directly with the contractor where possible – for example website feedback from customers (a measure plus a quote), as on a travel agent’s website of hotels • Self-assessment by the contractor of trends – encourages involvement and reduces workload; SPC and trends are more important than KPIs • Independent assessor for depth, or where the customer is not web literate – for example Help the Aged assessing care homes • 360-degree feedback

  14. Constraints • Resources in the organisation: to create three appropriate measures for each contract, and a simple, transparent database to update results • Training for contractors • A template for contractors to provide information • Sustaining the approach

  15. Pilot testing – Outcomes • Volume of valid complaints, compared with a target agreed with the contractor • Asking customers: ‘How likely are you to raise/recommend us to a friend (0–100%)?’ and, if required, ‘What extra should we have done to get a 100% score?’ • Method: user surveys, comparison with benchmarks
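  A small sketch of how these two pilot outcome measures might be collated each month, assuming a list of 0–100% responses to the recommendation question and a count of valid complaints against the agreed target; the function and field names are illustrative assumptions.

```python
from statistics import mean

def outcome_summary(recommend_scores: list[float],
                    valid_complaints: int,
                    agreed_complaint_target: int) -> dict:
    """Collate the pilot outcome measures for one contractor and one period.

    recommend_scores: answers to 'How likely are you to raise/recommend us
    to a friend (0-100%)?' from user surveys.
    agreed_complaint_target: complaint volume agreed with the contractor.
    """
    return {
        "mean_recommend_pct": round(mean(recommend_scores), 1),
        "pct_giving_100": round(100 * sum(s == 100 for s in recommend_scores)
                                / len(recommend_scores), 1),
        "valid_complaints": valid_complaints,
        "within_complaint_target": valid_complaints <= agreed_complaint_target,
    }

# Illustrative month: eight survey responses, three valid complaints, target of five.
print(outcome_summary([100, 90, 80, 100, 70, 95, 100, 60], 3, 5))
```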

  16. Pilot testing - Outputs • Meet timescale: • Non-achievement, missed/late deliveries • Rectification response time • Quality: • Capability, maintaining adequate resources and skills • Work completed in sufficient detail • Safety, environment, discrimination • Price: • Variations to contract pricing • Number of cost saving improvements

  17. Pilot testing – Process • Problem resolution, including 360° feedback • Communication response • Invoice accuracy • Technical innovation • Ongoing financial stability • Cultural ethos/values same as ours • Subjective assessment, with evidence

  18. Illustrative current practice – Outcomes (bar chart: % who measure each outcome; categories: complaints (reactive, volume) and satisfaction (proactive, %)) • Methods for complaints: unsolicited praise/criticism by letter, newspaper or telephone; real-time update of a shared web database; minuted monthly if serious • Method for satisfaction: proactive survey

  19. Illustrative current practice – Outputs (bar chart: % who measure each output; categories: meet timescale; rectification response time; non-achievement, missed deliveries; maintaining adequate resources and skills (quality); work completed in sufficient detail; number of cost-saving improvements; safety, environment, discrimination; financial overspend, changes/variations to contract price) • Methods: self-assessment by the contractor; some use SPC to monitor trends

  20. Illustrative current practice – Process (bar chart: % who measure each process item; categories: problem resolution, including 360° feedback; communication response; invoice accuracy; technical innovation; ongoing financial stability; cultural ethos same as ours) • Methods: verbal dialogue; monthly minuted discussion

  21. How reported? • Monthly report, face to face discussion • Geographical analysis to direct improvement action • SPC charts to highlight significant issues
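  Since SPC charts appear both in the reporting here and in the pilot measures, below is a minimal sketch of one common way to build them: an individuals (Shewhart) chart with limits at the mean ± 3 sigma, where sigma is estimated from the average moving range. The chart type, the data and the names used are assumptions for illustration, not the council’s actual implementation.

```python
from statistics import mean

def individuals_chart_limits(values: list[float]) -> tuple[float, float, float]:
    """Lower limit, centre line and upper limit for an individuals (I) chart.

    Sigma is estimated from the average moving range (MR-bar / 1.128),
    the usual Shewhart approach for monthly counts or scores.
    """
    centre = mean(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma_hat = mean(moving_ranges) / 1.128
    return centre - 3 * sigma_hat, centre, centre + 3 * sigma_hat

def out_of_control(values: list[float]) -> list[int]:
    """Indices of periods that fall outside the 3-sigma limits."""
    lcl, _, ucl = individuals_chart_limits(values)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

# Illustrative data: monthly valid-complaint counts for one contractor.
complaints = [4, 5, 4, 5, 4, 5, 4, 20, 5, 4]
print(individuals_chart_limits(complaints))
print(out_of_control(complaints))  # -> [7], the month with the spike to 20
```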

  22. Action resulting? • Financial sharing of improvements/ penalties for failure • Focus for improvement action • Planned transparent sharing of vendor rating results (IT system) with rest of organisation, who might buy from same contractor

  23. Questions?
