Social Outcomes Contracts (or Social Impact Bonds – SIBs) have been a big part of ATQ’s work since we formed almost exactly 10 years ago, in August 2012. We are proud to have supported many organisations to set up and implement contracts, and in recent years have become heavily involved in research and evaluation of such contracts through work such as this for the Department for Culture, Media and Sport.
Last week saw the publication of what might be the most high-profile piece of research we have ever done, into the value created by Social Outcomes Contracts (SOCs) in the UK since they first started more than a decade ago. Our report was part of broader work by Big Society Capital (BSC) to reflect on and celebrate the growth of SOCs over the last ten years.
The statistic that has gained most attention from our report, and was mentioned at Prime Minister’s Questions last week, is that every pound spent on SOCs in outcome payments has generated more than £10 of value. In other words, and in the technical language of cost benefit analysis, the Benefit Cost Ratio was 10.2. Even if you accept that some of this value would have happened anyway (as it probably would in some contracts, but by no means all), we are still looking at a return to government and society of many times the initial outlay.
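For readers less familiar with the term, a Benefit Cost Ratio is simply the value of the outcomes achieved divided by what was paid for them. The short sketch below shows the arithmetic; the monetary figures are invented purely for illustration and are not the totals from our report – only the headline ratio of 10.2 is taken from the text above.

```python
# Illustrative only: how a Benefit Cost Ratio (BCR) is derived.
# The monetary figures below are hypothetical, not the totals from the report;
# only the headline ratio of 10.2 is real.

outcome_payments = 100_000_000      # hypothetical total outcome payments (£)
value_generated = 1_020_000_000     # hypothetical value of validated outcomes (£)

bcr = value_generated / outcome_payments
print(f"Benefit Cost Ratio: {bcr:.1f}")   # 10.2 -> every £1 spent generates just over £10 of value
```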
As one of many organisations that have been grappling for many years with SOCs and SIBs, and with whether they offer value for money, we hope this report will provoke further debate about whether and when such contracts are a useful tool in the commissioner’s armoury. Indeed it looks like that debate has already started, and we are happy to be part of it. For now, I would offer three observations on this piece of work.
First, our report is, and always was intended to be, entirely factual. It takes data on the outcomes that have been achieved by SOCs – as measured and validated by those contracts – and attempts to put a value on them based on the improvements they make to people’s lives. This is not an exact science, so we were deliberately cautious (see below); but we are making no judgement on the efficacy of SOCs, nor are we comparing the performance of projects with each other or with other types of contract. We have recently made a major contribution to another research report – the third update on the evaluation of the Commissioning Better Outcomes Fund – which does explore the strengths and weaknesses of SOCs in some depth, but this report does not do that.
Second, I am slightly puzzled by suggestions in some quarters that our value estimates are too big, and therefore somehow less credible. Puzzled because we were deliberately conservative in our assumptions, as we explain in some detail in our report. This is our usual practice when undertaking cost benefit analysis because we know that inflating ‘savings’ – either consciously or unconsciously – is self-defeating. Our clients sometimes challenge our caution, and ask us to make more optimistic assumptions, but we tend to resist. In this case we were even more cautious than usual: consistently using low estimates of unit costs saved when larger, well-evidenced estimates were available; assuming no sustainment of outcomes such as periods of employment; and leaving out of our analysis altogether many outcomes which potentially have value. The truth is that whenever we have done this type of exercise, the estimates we produce tend to be large because the costs of adverse outcomes – children in care, young people who are long-term NEET (not in education, employment or training), older people needing hospitalisation for conditions that can be managed better at home – are themselves large, and much greater than many realise.
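To show what ‘deliberately conservative’ means in practice, here is a minimal sketch of the unit-cost point. Both unit costs and the outcome count are assumed figures for demonstration only, not values used in our analysis; the point is simply that choosing the lower of two evidence-based estimates holds the headline value down.

```python
# Illustrative only: the effect of conservative unit-cost assumptions.
# All figures below are hypothetical; they are not values used in the report.

outcomes_achieved = 500            # hypothetical count of validated outcomes
low_unit_cost_saved = 20_000       # £ per outcome, low end of the evidence (assumed)
high_unit_cost_saved = 50_000      # £ per outcome, high end of the evidence (assumed)

conservative_value = outcomes_achieved * low_unit_cost_saved
optimistic_value = outcomes_achieved * high_unit_cost_saved

print(f"Value using low estimates:  £{conservative_value:,}")   # £10,000,000
print(f"Value using high estimates: £{optimistic_value:,}")     # £25,000,000
# Using the lower figure throughout means the headline ratio is a floor, not a ceiling.
```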
Finally, both the work we have done and the wider analysis by BSC seem to me to confirm that, whatever their other benefits and defects, these types of contract are delivering a pretty good ‘bang for buck’ and have leveraged a lot of value for not much spending. This is not just because of the benefit cost ratios outlined above, but also due to two other factors. First, according to BSC’s figures these results have been achieved with the injection of around £71m in social investment. This is much less than many expected, and there has been criticism in some quarters that investment in SOCs has fallen well short of projections. But a ‘glass half full’ view would be that this is a good thing. The amount of working capital needed to oil the wheels of these contracts (mainly to make the payment-by-results mechanism work for social sector organisations) has been much less than the total contract values, and this makes them pretty efficient.
The second point is that government and other bodies (notably the National Lottery Community Fund) have funded a high proportion of the outcome payments made through SOCs, leading to criticism that such contracts only exist because of such funding. This is likely true (very few contracts have been implemented without some form of subsidy from one or other so-called Outcomes Fund), but it does look like a reasonable investment – and a very good one when one considers the particular cases of the Commissioning Better Outcomes Fund and the Life Chances Fund, which typically cover 20-30% of total outcome payments, with the rest coming from local commissioners. So the effective leverage of these contracts is maybe 3-4 times what it would be if they were wholly funded by government or the Community Fund.
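A short illustrative calculation makes the leverage point concrete. The 25% share used below is an assumed midpoint of the 20-30% range quoted above; the figures are for demonstration only and are not taken from the report.

```python
# Illustrative only: leverage when a central outcomes fund covers part of outcome payments.
# The 25% share is an assumed midpoint of the 20-30% range cited above.

fund_share = 0.25                           # proportion of outcome payments covered centrally
central_funding = 1.0                       # £1 from the outcomes fund
total_outcome_payments = central_funding / fund_share          # £4.00 of outcome payments
local_funding = total_outcome_payments - central_funding       # £3.00 from local commissioners

print(f"Each £1 of central funding supports £{total_outcome_payments:.2f} of outcome payments")
# Combined with a BCR of around 10, each £1 of central funding would be associated with
# roughly £40 of value - an illustrative figure, not one reported in the study.
```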
And since many commissioners and providers have said that they would not have put up their own money without this pump-priming funding, what’s not to like?