Evaluation will set you free
Ben Hartridge | Researcher | @benhartridge
There is a clear need for social care services to become more innovative and adaptable in order to tackle complex challenges in dynamic, changing environments. However, we would argue that traditional approaches to monitoring and evaluation have the potential to stifle innovation.
Our researcher, Ben Hartridge, draws on his experience with Crisis, the national homelessness charity, to argue that monitoring and evaluation can and should be designed to support innovation in public services and systems.
Traditional evaluation: holding back innovation
At the Dartington Service Design Lab (the Lab), we advocate for the need to monitor and evaluate public services and systems. We need to know that services are being delivered as intended, and we should have evidence that they are having a positive effect. However, a traditional approach to monitoring and evaluation places constraints on a service that may limit its capacity to innovate.
In traditional approaches to monitoring and evaluating tightly defined services or activities, the service’s activities and objectives are ideally known (or refined) before the evaluation begins. This allows the evaluation to be designed so that a judgment can be made about how effective the service is, or is likely to be, in achieving its objectives.
But this approach to evaluation doesn’t fit a service that is flexible or adaptable to its ever-changing context. The activities of an adaptable service are purposefully fluid. While its objectives are broadly defined – and can be represented in a theory of change – there may be flexibility in the approach to allow responsiveness to dynamic environments. If the service is made to fit into an overly restrictive evaluation framework, its capacity to innovate and respond nimbly will be constrained.
The Lab’s approach: evaluation to support innovation
Since 2017, the Lab has been supporting Crisis to trial innovative practice in Doncaster, South Yorkshire, as part of the Hothouses for Innovation initiative. The Lab helped Crisis, together with their local partners and service users, define a set of challenges around:
The intensity and impact of their service;
Sustainability of outcomes for their beneficiaries;
Effective local partnerships.
In response, Crisis embedded two of their coaches within the Complex Lives team in Doncaster: a new and emerging multi-agency alliance of services for people living complex and chaotic lives.
Because Crisis were taking a flexible and personalised approach, neither the activities the coaches would undertake nor the outcomes they would work towards were fixed in advance. Instead of forcing Crisis to define their activities and outcomes to fit a traditional evaluative approach, the Lab designed a developmental evaluation framework that supported the charity’s adaptable approach to implementing the innovation. There were four tenets to our approach:
Co-produced
Evidence-informed
Proportionate
Fast-paced
Our evaluation design was co-produced with the Crisis team in Doncaster. We used a mixture of qualitative and quantitative methods to capture evidence about the service in a way that was proportionate to Crisis’ approach to implementation.
You can read more about the Lab’s approach to co-production and evaluation in this blog by my colleague, Kate Tobin.
Because the activities and outcomes to be measured were not known in advance, the framework required only minimal data collection. Semi-structured interviews with key stakeholders involved in delivering the innovation allowed the Lab to retain a focus on the extent to which the innovation addressed the initial challenge, while responding to and capturing learning from the developing reality of the work being delivered.
Over the six-month testing period, we produced monthly report-outs for Crisis. These provided cycle points at which decisions could be made to adapt and innovate based on evidence of what was happening in real time.
The central principle of our approach here was that monitoring and evaluation should foster, not hinder, innovative and flexible service delivery. Our work with Crisis Skylight South Yorkshire gave them the space to innovate, but also the confidence that what they were learning and doing was being captured. At the Lab, we believe that well-designed evaluation can and should support innovation in public services.
Stay tuned for a fuller report from the Lab on Rapid Cycle Design and Testing – coming soon.
______________________
Further reading
Dartington Service Design Lab (forthcoming), Hothouses for Innovation: South Yorkshire Final Report
Dartington Service Design Lab (2019), Co-production in testing – the art of the possible?
Dartington Service Design Lab (2017), Crisis Skylight South Yorkshire: Challenge Brief
Michael Quinn Patton (2010), Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use