“So it’s not about ranking and spanking but more…. measuring and pleasuring?”

Posted 11th August 2016

 

This is a guest blog from John McLean. John has a diverse background in the public and third sectors, working on projects including policy development, quality management and impact measurement. He shares his views on monitoring quality and how to make good results a motivator rather than a tool of discipline. John currently heads up Quality and Compliance at PSS (UK), a health and social care organisation based in Liverpool and an organisational member of Social Value UK.

 

Now you might read this headline and think ‘well, that’s one of two things: either this article has slipped through the net of the communications team, or someone really likes writing shocking headlines’. Actually, it’s neither: this blog is an ‘on-the-ground’, ‘to-the-point’, ‘say-what-you-mean’ summary of our philosophy on quality measurement. Last month, when we were discussing the thoughts that follow, this is how a member of our team summarised it – and you know what? I couldn’t have put it better myself.

Friendly competition. It’s an interesting term: some would say it’s healthy, some would say it’s dangerous, and others would say it’s a contradiction in terms. Recently, at a Social Value UK meeting, I was asked how the vast range of services that PSS delivers work together to improve their quality ratings and impact reporting, and whether there is an element of friendly competition. It was a fair question, and it allowed me to reflect a little on my experiences of competition in the public sector compared to the third sector.

My local government career was predominantly in adult social care, where, after years of finding efficiency savings, councils had to look for creative ways to demonstrate they were achieving outcomes for their local populations with reduced resources. In the North West, the leads for each authority came together to discuss these outcomes and to create a shared framework for showing they were achieving them.

However, it became apparent (to some of us!) that the outcomes – and even the outcomes framework set out by the Government – were nothing more than a hotchpotch of bean-counting, process-focused and output-based indicators, which authorities then used to ‘benchmark’ themselves against their nearest and dearest. Some liked this process because they were good at the numbers game and it reassured them; some were just relieved to be middle of the pack; but it was a far less enjoyable experience for those at the bottom of the pile, who were either too honest or behind the curve.

The ‘benchmarking’ was originally intended to single out those doing exceptionally well, so that they could share their wisdom region-wide and help the authorities finding it a bit more of a struggle. However, when you have 23 local authorities – all fairly similar in function and, when it came to bids and inspections, in competition with each other – you found that those at the top of their game were less inclined to share. As the title of this blog suggests, the entire process was merely an exercise in ‘ranking and spanking’, and nobody wanted to be overly forthcoming with their tricks of the trade lest they end up on the wrong end of the paddle.

Fast forward to my current role at PSS, where there are over 25 different services – including Shared Lives, Health Trainers, Psychological therapy, Parent and Baby wellness services and Day Support – all with various commissioners and covering a wide geography.

 

Our approach at PSS has never been to foster bitter competition, but rather to demonstrate positive impact, share good practice and provide toolkits which encapsulate this learning, so that services can be the best they can be.

 

As part of my job, we use our Quality Assurance Framework to help each service build up its own Quality Profile, which includes commissioner and inspectorate ratings, our own internal quality review results, annual survey results and, of course, each service’s individual Impact Report.

So, to answer the original question – is there friendly competition between the services? Absolutely there is competition and, more importantly, it is entirely friendly. All the services want to demonstrate high quality; and whilst we all share the values of PSS and are bought in to where we are going as an organisation, the services themselves are distinct enough to appreciate that they can learn from best practice in some ways, but will need to innovate and make that difference themselves in others.

The first few Social Impact Reports have already been published, and rather than inspiring envy, they have left us inundated with eager requests from service managers all saying ‘we want one – us next!’. When it comes to quality and impact reporting, rather than focus on hierarchical ‘ranking and spanking’, we are more about ‘measuring and pleasuring’ – and even those who have already published their reports know that they will continue to learn from other services as they demonstrate their impact going forward.

In this way, our services will continue to grow and learn from each other, and help further develop the positive and proactive culture that PSS endeavours to foster.