Evaluation results show strong gains - especially for girls

Isatu’s story

Schools are supposed to be a safe space for learning, but for Isatu Kabba, 14, that’s not how it felt at her old primary school. “There was no proper monitoring of who comes in and out of the school,” she recalls. “There was a lot of noise, a lot of bad language, bad behaviour from some of the boys, and less concentration.”

That’s changed since she enrolled for secondary school at one of Rising’s schools in Sierra Leone. She says she feels safer in the school environment, and more motivated because of the encouragement and quality teaching she gets from school staff. In last year’s public exams, she received one of the highest marks of any Rising student.

As a first-generation learner - neither her mother, a petty trader, nor her father, currently out of work, ever had the opportunity to go to school - Isatu is passionate about her education. She has continued on to the senior secondary level, focusing on the science stream. When she finishes, she wants to study medicine at university and dreams of becoming a doctor.

The experience of girls like Isatu, and just how different it is from the experience of girls in other schools, is one of the highlights of the comprehensive final report of a three-year impact evaluation of Rising’s work in Sierra Leone. The study, by Dr David Johnson and Jenny Hsieh of Oxford University, finds that girls in Rising schools make faster progress than the boys, and 2-4 times the progress of girls in comparison schools.

As ever, if you want to jump straight to the study, it’s here. We’ve also put together this factsheet with the headline findings. For more of the background to the study and our reactions to it, read on.

Background to the study

I’ve written about the background to the study previously (here), but to briefly recap:

  • The study assessed the learning gains of Rising junior secondary students in reading and maths from January 2016 to May/June 2018. 

  • These learning gains were benchmarked against the progress of comparison groups of students attending comparable private schools and government schools in the same neighbourhoods. 

  • The incoming ability levels of these students at baseline were approximately similar.

  • This period of 2.5 calendar years actually represents 3 academic years: the September 2015-July 2016 academic year was truncated as a result of the Ebola epidemic. 

  • The study used innovative computer-adaptive testing software to estimate the progress made by individual students more precisely than is usually possible with a paper-and-pencil test (a rough sketch of the idea follows this list).

  • In addition to cognitive measures, the study also explored progress in non-cognitive domains.
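
For readers unfamiliar with computer-adaptive testing, the core idea is that each question is chosen based on how the student has answered so far, so the test quickly homes in on items near the student’s actual level and produces a more precise ability estimate from fewer questions. The sketch below is a deliberately simplified illustration of that loop (a one-parameter logistic model with a toy update rule); it is not the software used in the study, and the item difficulties and step size are invented for the example.

```python
import math
import random

def p_correct(theta, difficulty):
    """Probability of a correct answer under a simple one-parameter logistic (Rasch-style) model."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def adaptive_test(true_theta, item_bank, n_items=10, step=0.6, seed=0):
    """Toy adaptive test: repeatedly pick the unused item closest to the current
    ability estimate, simulate a response, and nudge the estimate towards the evidence."""
    rng = random.Random(seed)
    theta_hat = 0.0                      # start from an average ability estimate
    remaining = list(item_bank)
    for _ in range(n_items):
        # choose the unused item whose difficulty is closest to the current estimate
        item = min(remaining, key=lambda b: abs(b - theta_hat))
        remaining.remove(item)
        correct = rng.random() < p_correct(true_theta, item)
        # move the estimate up after a correct answer, down after an incorrect one,
        # in proportion to how surprising the response was
        theta_hat += step * (int(correct) - p_correct(theta_hat, item))
    return theta_hat

if __name__ == "__main__":
    bank = [d / 2 for d in range(-6, 7)]  # invented item difficulties from -3 to +3
    print(round(adaptive_test(true_theta=1.2, item_bank=bank), 2))
```

Because the resulting estimates sit on a common scale from one round of testing to the next, an individual student’s progress can be tracked more precisely than with a fixed paper test scored as a raw percentage correct.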

Figure 1. How the evaluation defines impact

There are some limitations to the study:

  • It is a quasi-experimental not a fully experimental research design. Although the specific students sampled in each school were randomly selected, the schools themselves were purposively sampled. This creates the risk of sample selection bias, where the outcomes achieved by different schools are caused in part by differences in the backgrounds of the students themselves. 

  • This is dealt with in part by checking that students’ prior achievement was approximately similar, and then focusing on progress from that baseline, i.e. comparing learning gains (differences-in-differences) rather than learning levels (a toy example of this gains-versus-levels comparison follows this list). But there are differences in the socio-economic status (SES) of students in the study, particularly between students at private schools (both Rising and other private schools) and their peers in government schools. The Oxford team plans to do more work to explore the impact of these differences.

  • The research team also experienced significant sample attrition (that is, students surveyed at baseline not presenting for follow-up assessments) across all three school types. Again, this creates a risk of outcomes being shaped by differences in the composition of students presenting for assessments rather than because of anything to do with the schools themselves.

  • The team have dealt with this mainly by reporting average scores for three sub-groups of students:

    a. All students presenting on that particular test (irrespective of whether or not they presented on any other test)

    b. All students who presented at the baseline, each end of year test, and the endline 

    c. All students who presented at the baseline and endline

  • Group (a) is not useful for making comparisons over time as the students taking a test in one period may not be the same students taking that test in the next period. Group (b) is in some ways the most interesting as it allows us to see the shape of their learning trajectories over time. But the sample that took all 4 of these assessments is very small. Group (c) is larger, allows for a fair comparison of progress over time and is therefore the best group to focus on. While the pattern is broadly the same whether you look at Group (b) or Group (c), there are important differences in the results, which is why it is helpful for the team to present both.
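
To make the gains-versus-levels distinction in the list above concrete, here is a toy difference-in-differences calculation. The scores are made up purely for illustration and are not figures from the report; the point is that the comparison is between each group’s progress from its own baseline, not between final score levels.

```python
# Toy difference-in-differences with illustrative (made-up) scale scores.
baseline = {"rising": 400, "comparison": 405}
endline  = {"rising": 520, "comparison": 470}

# Each group's gain is measured from its own starting point...
gain = {group: endline[group] - baseline[group] for group in baseline}
# ...and the "impact" is the difference between those gains.
did = gain["rising"] - gain["comparison"]

print(gain)  # {'rising': 120, 'comparison': 65}
print(did)   # 55 extra scale points of progress for the rising group
```

This is also why the approximately similar baselines matter: the closer the starting points, the more plausible it is that differences in gains reflect the schools rather than the intake.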

The baseline reports, and Year 1 and Year 2 midline reports, are all on our website if you want to check them out.

Headline findings

The key findings largely echo what the two midline reports found: Rising Academy Network (RAN) students make significantly larger gains than their peers in other schools, and these gains are more equitably distributed.

Finding 1. RAN students make significantly more progress than students in either comparison group in reading. 

  • RAN students make 48% more progress in reading than the private comparison students. That is, RAN students learn as much in 1 year of schooling as their peers learn in 1.5-1.75 years (the short sketch after this finding shows how these conversions work). For those who care about such things, this is an effect size of 0.41.

  • RAN students make 160% more progress in reading than the government comparison students. That is, RAN students learn as much in 1 year of schooling as their peers learn in 2.5 years. This is an effect size of 0.77.
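
A quick note on the arithmetic behind these “years of schooling” equivalences (my own illustrative restatement, not the report’s calculation): if RAN students make X% more progress in a year than a comparison group, then one RAN year corresponds to roughly 1 + X/100 comparison years. The sketch below applies that to the rounded percentages quoted here and in Finding 2, so the outputs won’t match the report’s figures exactly - those are derived from the underlying gain scores.

```python
def years_equivalent(pct_more_progress):
    """If RAN students gain X% more per year, one RAN year is roughly (1 + X/100) comparison years."""
    return 1 + pct_more_progress / 100

# Rounded percentages as quoted in Findings 1 and 2.
for subject, comparison, pct in [
    ("reading", "private",    48),
    ("reading", "government", 160),
    ("maths",   "private",    120),
    ("maths",   "government", 133),
]:
    print(f"{subject} vs {comparison}: 1 RAN year is about {years_equivalent(pct):.1f} comparison years")
```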

Finding 2. RAN students make significantly more progress than students in either comparison group in maths. 

  • RAN students make 120% more progress in maths than the private comparison students. That is, RAN students learn as much in 1 year of schooling as their peers learn in 2.2 years. This is an effect size of 0.42.

  • RAN students make 133% more progress in maths than the government comparison students. That is, RAN students learn as much in 1 year of schooling as their peers learn in 2.3 years. This is an effect size of 0.46.

Finding 3. Girls do much better in RAN schools than in comparison schools.

  • In all school types, girls start behind boys at baseline. In government and private schools, they fall further behind every year - in fact, girls’ progress is less than half of the progress of boys. By contrast, in RAN schools girls actually progress faster than boys, and much faster (2-4x) than girls in either comparison group.

  • Indeed, a lot of the difference in overall learning gains between Rising and non-Rising schools seems to be driven by the differential performance of girls.

Figure 2. Reading gains by gender

Figure 3. Maths gains by gender

Finding 4. RAN schools also do a better job of improving the performance of students with lower prior achievement. 

  • The Oxford team set thresholds to divide the sample into four ability bands (a minimal sketch of this band-progression calculation follows this list).

  • In reading 74% of RAN students who started in the bottom two ability bands at baseline had progressed to a higher band by the endline, compared to 48% and 47% in private and government comparison schools respectively. 

  • In maths, 65% of RAN students who started in the bottom two ability bands at baseline had progressed to a higher band by the endline, compared to 34% and 39% in private and government comparison schools respectively.
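
For the curious, the band-progression figures above are computed along these lines. This is a minimal sketch with invented scores and thresholds - the actual cut-offs and data are the Oxford team’s, and are not reproduced here.

```python
# Hypothetical thresholds dividing the scale into four ability bands (1 = lowest).
THRESHOLDS = [350, 450, 550]   # purely illustrative cut-offs, not the report's

def band(score):
    """Return the ability band (1-4) for a scale score."""
    return 1 + sum(score > t for t in THRESHOLDS)

# Hypothetical students: (school_type, baseline_score, endline_score).
students = [
    ("rising",     310, 480),
    ("rising",     355, 420),
    ("private",    300, 390),
    ("government", 340, 360),
]

# Of students starting in the bottom two bands, what share reach a higher band by endline?
started_low = [s for s in students if band(s[1]) <= 2]
moved_up = [s for s in started_low if band(s[2]) > band(s[1])]
print(f"{len(moved_up)}/{len(started_low)} of low-starting students moved up at least one band")
```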

Reactions and reflections

First and foremost we’re grateful to the Oxford team for their work on this project over the last few years, and to the staff and students of the comparison schools for taking part.

We’re particularly delighted by the progress of our girls. Globally, the conversation about girls’ education can, at times, fall into one of two traps. The first is to get stuck at the why. As Echidna Giving argues, “it’s time to move from asking ‘Why serve girls’ to asking ‘How?’” The second is to talk about educating girls as if they are a different species. Yet as Dave Evans and Fei Yuan have recently argued, “one of the best ways to help girls overcome the learning crisis may be to improve the quality of school for all children.” A lot of the most effective interventions for improving outcomes for girls, they find, don’t specifically target girls. This study perhaps lends some weight to that view. Get the basics right for all students, and girls will flourish.

I’m very proud of the hard work of our teachers, school leaders and central team in achieving these results. I heard a talk from an impact evaluation expert the other day who made the argument that most organisations try to do evaluations too quickly, before they have worked out all the kinks and settled on a stable model. I confess we’ve probably been a little guilty of that, launching this evaluation when we were less than 2 years old and when our schools had only been up and running for 6 months. But it also says a lot about our team that they welcomed this kind of rigorous scrutiny of their work when there was so much we were still figuring out. 

That relates to a third point. To be honest, the very idea of a stable model isn’t one that particularly resonates with us. Yes, I think we do have a more defined model today than we did three years ago. But whether it’s our curriculum or our school oversight systems or our approaches to teacher coaching, we’re always looking for ways to innovate and improve. 

The results themselves show why this is so important. A mantra at Rising is that “however well we do, we always strive to do better.” Compared to other schools, the learning gains here are impressive. It’s particularly striking that they are achieved by enabling more disadvantaged groups of learners - girls, and those with lower prior achievement - to progress much faster than is the case in other schools. 

But in absolute terms, our students still lag behind where they should be. Their learning trajectories are much steeper than their peers in other schools, but there’s so much more we need to do to keep bending them upwards even further.

So, to repeat something else we like to say round here, “our first draft is never our final draft.” Here’s to the next draft.