To help London’s children, one mentoring service is improving itself first

 

Deon Simpson | Service Design Specialist


The Dartington Service Design Lab (‘the Lab’) has partnered with Chance UK to support the redesign, testing and improvement of their child mentoring service. Using the Lab’s method, Rapid Cycle Design and Testing, we’re considering what it takes to create a mentoring service that children and parents want, that mentors can deliver, and that supports children’s social and emotional development.

 

Blending science, user experience and practice 

Using scientific evidence to inform service design is important. It increases the chances that services include the ‘active ingredients’ that can support positive outcomes for the people using them. But the evidence of what these ingredients are isn’t always available to service providers, and even when it is, it still needs to be tailored to fit the context of the specific service and the preferences of its users. New knowledge gained from designing and delivering a service, which is itself a type of evidence, needs to be fed back into the design in a timely way to maintain the service’s quality and longevity.

 
 
[Diagram: the stages of Rapid Cycle Design and Testing]

Rapid Cycle Design and Testing is a method developed by the Lab that combines the best available evidence of ‘what works’ with users’ own take on what’s right for them and providers’ first-hand experience of what can be implemented in practice. It guides services through several fast-paced cycles of designing, delivering, learning and refining, with each cycle divided into five key stages: Assess, (Re)design, Implement and Observe, Analyse and Learn, and Pause and Decide.

Rapid Cycle Design and Testing draws on implementation science, user-centred research and systems thinking to produce services that are evidence-informed, co-produced, doable, and sustainable within different settings.

The My Future programme

Chance UK is a London-based charity that has supported children with emotional and behavioural difficulties since 1995. Their long-standing mentoring service pairs each participating child with an adult volunteer from the community, with the aim of building a relationship that fosters the child’s positive social and emotional development. They take great care in recruiting volunteers whose backgrounds fit the service’s goals, and in pairing children and mentors based on similar backgrounds and interests – conditions which the scientific evidence suggests are required for mentoring to work well [1].

After years of delivering the service, and seeing variability in its effectiveness, Chance UK is on a mission to improve it through Rapid Cycle Design and Testing. They partnered with the Lab in late 2018 to design, deliver and test the new My Future programme. Over nine months, My Future aims to establish positive relationships between children and volunteers in Camden and Southwark, and to learn how these relationships support the building of children’s self-esteem and self-regulation through a range of flexible one-to-one and group activities.

 

Beginning at the end

In October 2018, the Lab facilitated a series of workshops and focus groups with Chance UK staff, trustees and mentors. The goal was to collaboratively explore Chance UK’s existing data and frontline experience to build a shared understanding of the service’s strengths and weaknesses. 

The group reaffirmed their commitment to helping the children they serve reduce their emotional and behavioural difficulties, and worked backwards to consider whether the current design targeted these difficulties as intended. A review of the scientific evidence helped to align mentoring activities with these needs. Mentors also suggested how their training and supervision could be improved, and shared invaluable insights about the reality of working with young children on complex issues like self-awareness and self-esteem.

This first Assess stage resulted in a new Theory of Change articulating a causal pathway from both new and existing activities, through intermediate skills in self-regulation and self-esteem, to the end-of-service outcome of reduced emotional and behavioural difficulties.

 

Adaptive, user-centred and communicated

The work of redesigning existing activities and designing innovations took about four months. Redesigns include a more structured curriculum for one-to-one mentoring sessions, with a clearer plan for delivery over nine months and suggestions for adapting activities to suit children’s ages and needs when necessary.

Innovations include a new facilitator-led group component, in which children work together across five group sessions over five months. These sessions include engaging activities that give children the opportunity to enhance their social skills by interacting with peers. Taking a user-centred approach, we piloted the group sessions on two occasions with mentors and children from another of Chance UK’s mentoring services. Their feedback helped to refine the group activities and delivery model.

To prepare mentors to deliver My Future, a new training programme was developed and then refined after each offering based on feedback from trainees. All of these changes and innovations are spelled out in the new My Future Mentor Manual, in facilitation notes for group sessions, and in the materials that mentors use with the children.

  

Testing and taking early notes

Continuous, real-time data monitoring is a fundamental part of Rapid Cycle Design and Testing. It supports timely decision-making and redesigns, and earlier fixes for unintended consequences. We achieve it by first agreeing with partners on the kinds of questions that should be asked about the Theory of Change in order to test it. We then agree to collect only the types and amount of data needed to answer these questions with a ‘good enough’ level of confidence. 

Data collection forms, analysis plans and reporting templates are all co-designed early to make continuous monitoring and decision-making feasible. To foster a new culture and capacity for continuous monitoring, data collection and recording are led by partners from the start, while data analysis and reporting are handed over to them gradually from the Lab as their confidence grows.

Since April 2019, the Lab has been supporting Chance UK to track the implementation of My Future using a mixture of quantitative and qualitative data collected and recorded by mentors. In this Implement and Observe stage, we’re learning early that not all mentors interpret parts of the data collection forms in the same way, and that the data system has some malfunctions. Both threatened data accuracy and quality, so we responded immediately with more guidance and technical support.

Another valuable lesson is that the way the data looks once recorded in the data system makes it very useful for individual mentors’ decision-making (great!), but it requires extensive reformatting before it can be used to track and answer questions about the Theory of Change (some changes to the system’s output are needed!).

 

Pragmatic hopes for learning

As we approach the first Analyse and Learn stage, a key question we’re keen to answer is ‘what are the most reliable ways to measure successful delivery of My Future, both in the short-term and by the end of the programme?’ This is just one of several questions we’re including in our success criteria for the programme. As we define these criteria, we’re striving to strike a balance between evidence-informed expectations, experience-based explanations, and what’s most meaningful and valuable to the service and its users. 

This is so that when we come together around the data to Pause and Decide between cycles, we will be making decisions – whether to keep, change, or discard – that are in the best interest of Chance UK’s children, parents and staff.

My Future will continue until May 2020. We’ll publish a full report on the first six months of Rapid Cycle Design and Testing later this year. Watch this space!

 

Deon Simpson leads the use of Rapid Cycle Design and Testing in the My Future programme.

Reference

[1] DuBois, D. L., Portillo, N., Rhodes, J. E., Silverthorn, N., & Valentine, J. C. (2011). How effective are mentoring programs for youth? A systematic assessment of the evidence. Psychological Science in the Public Interest, 12(2), 57–91.

 