Thursday, 7 March 2013

What Works Centre for Local Economic Growth

I wrote yesterday about the problem with the Portas Pilots, arguing that the way the scheme was implemented will make it almost impossible to figure out whether the projects actually have an impact. This is just one example of a more general failure: government is often very bad at generating convincing evaluation evidence and (arguably) even worse at using that evidence to inform policy making. Could a NICE for social policy help fix this? It appears we may be about to find out, as the government has announced the establishment of four 'What Works centres' covering crime, ageing, early intervention and local economic growth.

The main task of these independent centres will be to undertake a systematic review of existing evidence to "produce a sound, accurate, clear and actionable synthesis of the global evidence base which:
  • ranks interventions on the basis of effectiveness and cost effectiveness
  • shows applicability
  • shows the relative cost of interventions
  • shows the strength of evidence on the agreed scale."
That is clearly going to be a massive challenge, as our recent work for the National Audit Office reveals. In that report (to be published soon) we assessed the quality of a selection of government evaluation reports. Overall, we concluded that the reports we looked at in the areas of business support and spatial policy (those that would be covered by the What Works Centre) were arguably not fit for purpose. We would be very wary of drawing any conclusions about the cost effectiveness of government interventions from these reports.

That said, there are some government evaluations and academic studies out there that would provide more convincing evidence. Presumably the What Works Centre would rank these as high quality and the others as (very) low quality, and that would help local policy makers, right?

The problem, of course, is that the less careful the study, the greater the tendency to find big positive impacts of the policy. In contrast, more careful studies tend to have a hard time detecting much impact. Indeed, for the NAO study it appears that the correlation between these two attributes (the quality of a study and the size of its claimed impact) was depressingly negative. When you take this kind of evidence to local government, I am very worried that there will be a tendency to ignore the quality ranking and focus on the impact ranking. After all, evaluation is hard, and even well-trained central government analysts, who should know better, are willing to claim that robust evaluation of spatial policies (i.e. evaluation that would score highly on the Maryland scale) is not possible. Those concerns multiply once you get to politicians (local or national), particularly if the convincing evidence is overwhelmingly negative on policies that the government seems determined to introduce. Just look at Enterprise Zones.

Does all of this mean we shouldn't even try? Of course not. Indeed, it's good to see the government and the ESRC investing money in these new centres. And, it should go without saying, in many policy areas Britain is already streets ahead of other countries in the role that evidence plays in policy formation. But embedding the use of evidence in the development of local economic growth policy is certainly going to be a challenging task for whoever picks up the baton ...