Looking back on the Nesta Future Ready Fund: lessons learned on impact monitoring

In Spring 2019, Franklin Scholars was awarded a grant under the Nesta Future Ready Fund, which we used not only to widen our reach to more schools across England, but also to invest in Franklin Scholars’ impact monitoring, evaluation, and reporting capacities. Here, we reflect on the lessons learned over the last 18 months.

Retrospective studies can provide a wealth of information

In June 2020, as part of our grant, we published an extensive retrospective impact study of the Beacon Programme, which is available for download here. This study drew on data collected over seven years from thousands of students and hundreds of parents, as well as academic progress data from 11,260 students submitted to us by partner schools. The analyses presented in the report helped us clarify which parts of our programme had the greatest impact on young people, and have led to the development of a free online diagnostic tool for teachers wishing to run peer mentoring programmes in their schools, as well as a suite of free digital resources available for download. The report also helped us convey to partner schools that we are serious about developing high-impact programmes, and willing to take a critical view of our interventions to see how we can improve.

Randomised Controlled Trials (RCTs) are doable - even for small organisations

Our biggest achievement under the Future Ready Fund was that we developed and initiated a Randomised Controlled Trial (RCT) of the Beacon Programme across seven partner schools. The RCT was designed to assess the impact of the Beacon Programme on mentors and mentees, using validated tests of self-efficacy (including the Self-Efficacy Questionnaire for Children, SEQ-C) and academic tests benchmarked to the literacy and numeracy curriculum. Almost 500 students were enrolled in the study, which ran from September 2019 to March 2020, when it was interrupted by COVID-19.
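
For readers curious about what ‘randomised’ meant in practice, the sketch below illustrates one common way of assigning students to intervention and control groups within each school (stratified random assignment). It is a minimal illustration with made-up student records and field names, written under the assumption of a simple within-school split; it is not a description of our exact procedure.

```python
# Illustrative sketch only: stratified random assignment of students to
# intervention and control groups, so every school contributes a balanced
# share to both arms. Student records and field names are hypothetical.

import random
from collections import defaultdict

def assign_groups(students, seed=2019):
    """students: list of dicts with 'student_id' and 'school' keys (hypothetical)."""
    rng = random.Random(seed)          # fixed seed so the assignment is reproducible
    by_school = defaultdict(list)
    for s in students:
        by_school[s["school"]].append(s)

    assignments = {}
    for school, cohort in by_school.items():
        rng.shuffle(cohort)            # randomise order within each school
        half = len(cohort) // 2
        for i, s in enumerate(cohort):
            assignments[s["student_id"]] = "intervention" if i < half else "control"
    return assignments

# Example with made-up students spread across two (hypothetical) schools
students = [{"student_id": f"S{i:03d}", "school": f"School {1 + i % 2}"} for i in range(20)]
print(assign_groups(students))
```

Splitting within each school, rather than across the whole cohort, helps ensure that differences between schools do not end up concentrated in one arm of the trial.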

Because of COVID-19, the RCT was not completed. Still, we learned valuable lessons about how RCTs are structured and run, and we share some of them here. First, regarding the different tools used to measure social and emotional, as well as academic, progress under the programme:

  1. The Self-Efficacy Questionnaire for Children seemed very intuitive for the year groups we targeted (Years 7 and 10). Completion rates were relatively high, with 78% of students answering every question on the survey. In addition, we felt that the wording of the questions was appropriate for young people who may be experiencing social isolation or other complex life situations, including those who might speak English as a second language.

  2. Students who completed their self-efficacy surveys at school were more likely to have parents who completed similar surveys sent to them by email and text (parents were asked to complete the Self-Efficacy Parent Report Scale using a text-based or online survey tool). This may be because parents asked their children about the research before completing their own surveys.

  3. Student and parental assessments of self-efficacy were correlated. Students who rated their own self-efficacy higher tended to have parents who rated it higher too (and vice versa); a minimal sketch of this kind of check appears after this list. This is interesting because we surveyed parents and teachers precisely so that we could ‘triangulate’ student self-assessments. Had the RCT finished, we believe the parental data would have been very useful in ‘validating’ the data collected from students.

  4. A dip in self-reported self-efficacy took place 5 to 8 weeks into the school year. Both students on Franklin Scholars and their peers reported dips in self-efficacy 5 to 8 weeks into the programme. It is not entirely clear why this happens, but we have noticed the same pattern in other social and emotional skills testing we have run in the past. It is for this reason that we advocate testing social and emotional skills at three time points: the start of the year, a point slightly into the year or at the mid-point, and the end of the year.

  5. Teachers/programme leaders almost never completed surveys. Like parents, teachers and programme leaders were asked to complete the Self-Efficacy Teacher Report Scale for all students on the programme. However, they were often too busy to do so, and sometimes did not feel that they knew the students well enough to complete the surveys accurately.

  6. The academic tests we created for the RCT seemed appropriate for measuring changes in literacy and numeracy for Year 7 and Year 10 students. These tests, developed for us by curriculum experts, produced reasonably well-distributed results: the average student scored around 50%, and roughly equal proportions of students scored above and below that mark (see the sketch after this list). At the start of the school year, no difference in test scores was detected between students on the Franklin Scholars programme and their peers. Had the trial run to completion, we would presumably have been able to detect at the end of the year whether students on the programme differed from their peers in any way.
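
For readers who want to see what these checks look like in practice, the sketch below illustrates the kinds of simple calculations behind points 3 and 6: a correlation between paired student and parent self-efficacy scores, and a check that academic test scores cluster around the 50% mark. The numbers are invented for illustration (the real data came from the SEQ-C, the parent report scale, and our curriculum-based tests), and the snippet assumes Python 3.10+ for statistics.correlation.

```python
# Illustrative sketch only: the simple checks behind points 3 and 6 above.
# All scores below are made up for demonstration purposes.

from statistics import correlation, mean

# Point 3: hypothetical paired self-efficacy totals, one entry per student
# with a matching parent response.
student_scores = [62, 71, 55, 80, 67, 49, 73, 58, 66, 77]
parent_scores  = [60, 69, 58, 78, 70, 52, 75, 61, 63, 74]

# Pearson correlation: do student and parent ratings move together?
r = correlation(student_scores, parent_scores)
print(f"Student-parent correlation: r = {r:.2f}")

# Point 6: are academic test scores (percentages) centred around 50%,
# with similar numbers of students above and below that mark?
test_scores = [34, 48, 52, 61, 45, 55, 50, 39, 66, 49, 58, 44]
above = sum(score > 50 for score in test_scores)
below = sum(score < 50 for score in test_scores)
print(f"Mean score: {mean(test_scores):.1f}% | above 50%: {above} | below 50%: {below}")
```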

Based on the lessons above, we feel that the self-efficacy questionnaires for children and parents were well suited to our programme. We also think the academic tests were appropriate for our study purposes. This is promising, as it means we can re-use these tools in the future, and also share them with partner schools wishing to measure the impact of their own peer mentoring interventions.

From a more operational and logistical perspective, we also learned a few important lessons in running an RCT:

  1. Consent procedures: We had complicated, multi-tiered consent procedures because we had intended to produce a peer-reviewed publication based on this RCT. Not every organisation wanting to run an RCT needs this level of consent; however, we highly recommend that similar procedures be followed as a risk-management strategy. Our consent procedures included: 1) a school partnership agreement, signed by our CEO and the school, detailing the data to be collected (in England, schools retain the legal responsibility to ensure that data collected on their grounds is handled ethically); 2) parental opt-in consent for all students (using electronic forms, which have a higher return rate than paper forms), although one school notified all parents of the study and ran an ‘opt out’ strategy instead of ‘opt in’; and 3) students writing their full name (first and last) on every test given to them, to indicate that they consented to the data being collected.

  2. Cost of an RCT: An RCT costs little to nothing in terms of physical resources. There were few non-staff costs aside from small fees for hiring curriculum experts to develop the numeracy and literacy tests, and printing and shipping the test materials (we printed all test materials for all schools so as to limit the workload on the schools). However, the RCT was time-intensive and required around 100 days of staff time to liaise with schools, prepare and send the test materials, grade the tests, and enter the results into an electronic database. Had the RCT been completed and the results published, it would have taken roughly 250 days of staff time in total. In other words, the RCT itself was simple to run; it just took a lot of time.

  3. Technical expertise: Most RCTs cost a lot because organisations ‘buy in’ technical expertise. In this instance, we were fortunate to have a former researcher on staff who could analyse the data and structure the RCT data collection in a way that lent itself to analysis. In addition, we benefited from two staff members who had been running Franklin Scholars’ M&E for years, and who deeply understood how our partner schools would respond to the RCT. What we still needed was a third party to ‘sign off’ on our methods and to advise on best practice for measuring social and emotional skills. We also needed guidance on setting realistic expectations for running an RCT in an education context. For this, the University of Sussex (the Nesta Future Ready Fund innovation partner) was invaluable. For example, they pointed us to the SPECTRUM resources (from which we picked the three self-efficacy questionnaires that we used) and inspired us to think outside the box (it was their suggestion that we also survey parents and teachers, for example). Unfortunately, in the UK there is currently no central entity that provides limited or pro-bono advice to small organisations trying to run their own RCTs. As such, without the Nesta Future Ready Fund, it is not clear where we would have gone for assistance.

Looking forward, the tools developed and selected for this RCT will continue to be used in Franklin Scholars’ impact measurement. In addition, even though the RCT could not run to completion due to COVID-19, the process of thinking through our impact, and how we could measure it against a control group, definitely improved our understanding of our intervention and helped us think through which students would benefit most from Franklin Scholars.

For more information on the Nesta Future Ready Fund, please click here. For more information on the Franklin Scholars seven-year impact report, which includes the retrospective impact report, please click here.