The Next Draft



"Our first draft is never our final draft."

It was a principle we embraced in the very earliest days of Rising, and embodies two things that have come to be fundamental to how we work. First, an insistence on setting ourselves high standards, even when - especially when - these standards exceed what others expect of us. Second, a recognition that quality rarely comes into the world fully formed: it is earned, through a relentless focus on trying to be better today than we were yesterday.

Since we began developing Rising in early 2014, we’ve written a pretty good first draft. 

In Sierra Leone, we’ve grown from 1 school to 10 in a little over 3 years and built an academic model that delivers 2 to 3 times the learning gains of comparable schools, according to independent research by Oxford University. Last year our first cohort of students to take public exams achieved a pass rate of 99%.

It’s a model parents love. According to a study by the Acumen Fund, 90% of our parents said their children’s quality of education has very much improved since coming to Rising; 89% said their own quality of life has very much improved. Our Net Promoter Score is a category-leading 81. Even among parents who ultimately chose not to send their child to Rising, 87% said Rising was the best school in their area. 

Since 2016, we’ve successfully adapted that model to work in a very different context in Liberia: in rural not urban settings, in primary not secondary, and working with existing government schools and teachers rather than our own staff. Today we manage 29 schools across seven counties, and data from the randomised controlled trial we are part of suggests we are having a real impact.

On the back of this progress in our own schools, we’re increasingly being approached by other organisations interested in partnering with us to bring aspects of our model into their own work.

And we’ve done all this while navigating not just the Ebola Crisis, which forced all schools in Sierra Leone and Liberia to remain closed for nearly a year, but the enduring economic after-shocks in the region.

This has only been possible by investing in great people. Every day, Rising’s HQ teams work tirelessly to manage and support our teachers and school leaders to do great work.

I’m proud of what we’ve achieved, and the impact we’ve had on the lives of our students.

Students like Mary. Mary was in the initial cohort at our very first Rising Academy school in Sierra Leone. Mary’s mum is a market trader and a single parent, and when she found out Mary would be able to attend Rising Academy Regent she cried with happiness. She knew how much a quality education matters, and she knew her daughter had not been getting it before. At Mary’s primary school, her teachers would shout at her or beat her if she got answers wrong. Most of the time she didn’t even try. She was 12 years old when she joined us, but her reading was barely better than a student in the first grade of primary school. Thankfully, Mary has thrived at Rising. She’s become more confident, more bold in public speaking, her reading and maths has improved, and when she sat the BECE exam last year she passed it with flying colours.

Mary’s story is the story I want every Rising student to be able to tell. But the Rising Mary joined had 80 students. One of the reasons I know so much about Mary is that I knew almost all of those students by name. I knew many of their parents too. Today we have more than 8,500 students, and I definitely no longer know them all by name. And whether we have 80 students, or 8,000, or 80,000, I want them all to receive that same life-transforming quality of education that Mary has had. 

To do that, we need to learn to approach our work differently.

If you’re interested in helping us figure out how, we are hiring for two fantastic senior roles in our team. 

First, we’re looking for a Chief Operating Officer - the first time we’ll have had someone in this role. Their job will be to help us build the durable, scalable systems and routines that will empower every member of the Rising team, whatever their role - from teaching and learning to inventory and supply chains, from finance and school fees to recruiting and induction - to do fantastic work at scale. That will mean being able to interrogate the detail of how we currently do things, going back to first principles to think about how we could achieve the same things in a different way, and skilfully facilitating the tough conversations that need to happen across functions and disciplines to build a truly scalable approach.

The person I’m looking for will ideally have helped lead another organisation through this graduation from start-up to growth stage. They need to have very strong financial skills since, bluntly, the type of work we do and the places where we do it mean that the margin between success and failure is thin and requires tremendous financial discipline. But the reason it’s a COO role not a CFO role is that this challenge of system-building is one I see right across our work, from finance to operations to data to people. 

We are also looking for a Managing Director for Sierra Leone. Sierra Leone is where it all started for us, and what our team has achieved there in such a short space of time is truly remarkable. At another time or in another place, it might be enough to coast on that initial success. But Sierra Leone remains one of the poorest countries in the world, and the headwinds in the post-Ebola environment are significant. The potential is there to grow our footprint and scale our impact, and we are as well positioned as we could possibly hope to be. But seizing that opportunity will require creativity, ingenuity and tenacity, and that needs to start with the Managing Director. They’ll have a great team behind them, but they’ll need to be a leader who can get the best out of people, who knows how to bring different functions together behind a coherent strategy, and who is not coming in expecting there to be easy answers. Above all, they'll need to be a champion of Rising’s values and principles; someone who can make sure that as we grow we don’t lose sight of who we are.

And that brings me back to Mary. It’s too easy when you start talking about scaling up to remember the why but forget the who; for the students themselves to become faceless, nameless. We’re scaling up not because we’ve forgotten about the individual stories of girls like Mary. We’re doing it because we know there are thousands more just like her. I want their story to be a Rising story too.

If you share that ambition, I hope you’ll apply.

The first draft is done. Come help us write the next draft.

ICYMI: Displaced podcast

In case you missed it, last month I was a guest of Ravi Gurumurthy and Grant Gordon on the Displaced podcast from the IRC and VoxMedia. We talked about Rising’s origins, our experience during the Ebola epidemic, the great public vs private debate, PPPs and education reform. It got pretty geeky, but it was a great conversation and I’m really grateful to Ravi and Grant for having me on.

Listen to the whole thing here.

Why do parents choose Rising?

New study finds high levels of parent satisfaction with Rising schools and strong belief in their quality.

The end of the school year is approaching, so our attention is turning to next year and how we recruit our next cohort of Rising students.

As parents consider the school options available to them for next year, what will go through their mind? What motivates some parents to choose Rising schools, while others consider our schools but ultimately go elsewhere?

Those were the questions we asked the Lean Data team at Acumen Fund to investigate for us. Acumen, a leading global impact investor, has spent the last four years developing an innovative, low cost, fast cycle approach to gathering customer insights in social impact organisations. Initially designed to help Acumen and its investees gather actionable data about the reach, impact and value of their activities, this “Lean Data” approach has subsequently been taken up by a growing number of other organisations.

As a methodology, Lean Data isn’t revolutionary, but it’s not meant to be. To me, its value lies in three things:

  • Providing standard question sets and modules that greatly simplify the process of questionnaire design
  • Challenging organisations to think about the merits of timely, good enough evidence over more rigorous but slower evidence
  • Helping organisations benchmark themselves against their peers, something that becomes more valuable as more social impact organisations adopt these techniques and question sets

Back to this study. To help us understand how Rising is perceived by parents and what we might need to do to reinforce or challenge those perceptions, the Lean Data team conducted phone interviews with a sample of current Rising parents (‘Choosers’) and a sample of parents who were interested enough in Rising to want more information from one of our outreach team, but ultimately didn’t enroll their child (‘Non-choosers’).

As with many Lean Data engagements, the sample sizes are small because the focus was on speed - from inception to full results was less than 4 weeks - so from a strictly statistical point of view the results are suggestive not definitive. Then again, the goal here wasn’t to get an objective measure of our quality. We have our independent evaluations for that. What we wanted to know was how the progress we’re seeing in those evaluations is informing parent perceptions of our schools, if at all.

So what did they find? Here are a few of the highlights:

1. Parent satisfaction with our schools is very high. Our Net Promoter Score (NPS), a common measure of customer satisfaction derived by asking parents how likely they are to recommend us to friends or family, yielded a score of 81 out of 100. The average for the 100+ social impact organisations around the world which the Lean Data team have worked with so far is 40, and anything above 50 is considered very good. (By way of contrast, less than 10% of parents who had considered Rising but ultimately gone with a different school said they were very likely to recommend that school to others.)
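For readers unfamiliar with the mechanics, NPS is easy to compute: respondents rate how likely they are to recommend you on a 0-10 scale, with 9-10 counted as promoters and 0-6 as detractors. A minimal sketch (the response counts below are illustrative, not Rising's actual survey data):

```python
def net_promoter_score(ratings):
    """NPS from 0-10 'how likely are you to recommend us?' ratings.

    Promoters score 9-10, detractors 0-6 (7-8 are passives and count
    neither way). NPS = % promoters - % detractors, so it can range
    from -100 to +100.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Illustrative only: 85 promoters, 11 passives, 4 detractors -> NPS of 81
ratings = [9] * 85 + [7] * 11 + [5] * 4
print(net_promoter_score(ratings))  # 81
```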

2. Parents were overwhelmingly positive about the impact Rising has had on their lives and the lives of their children. 89% said their own quality of life had “very much improved” because of Rising. 90% said the quality of education their children were receiving had “very much improved” because of Rising.

3. When asked to describe how this impact had presented, parents spoke not just about seeing improvements in their children’s academic performance (particularly in English) but in their attitudes to school overall. 20% mentioned seeing improvements in children’s desire to learn. “they love to read at home, not like before”, said one parent. Others mentioned improvements in children’s confidence and academic self-esteem: “It makes me a proud parent that my daughter can now stand boldly in front of people”, as one put it. That’s encouraging and tallies with some of what we’ve heard from students in Oxford University’s impact evaluation of our work.

4. Rising’s brand is strongly associated with quality. The top two factors cited by parents who had chosen Rising were education quality (mentioned by 43%) and teacher quality (mentioned by 29%). Even among parents who ultimately chose not to send their child to us, 87% said Rising was better quality than the alternatives in their area.

There were a lot of other insights that we’ll be exploring further and factoring into our outreach for next year. For example, it seems like word-of-mouth referrals were a particularly important channel for hearing about Rising and something that we need to make better use of. We also learned that for many "non-choosers" the reason for not choosing Rising was being reluctant to switch from a school their child was already attending. This suggests we need to do a better job of building long-term relationships ahead of key decision points. And while parents are impressed with what’s happening academically in our schools they want us to go further on the extra-curricular side. All useful insights.

But beyond the specifics, the study was a reminder of how important it is that we keep seeking this kind of feedback from parents, not least because it’s so energising to hear them say in their own words what it is that they value about Rising. Here were three of my favourite quotes:

  • “Rising Academies is here for we the low income earners.”
  • “Seeing him bossing his cousins at home in maths makes me proud.”
  • “The one main reason why I will recommend Rising Academy is their dedication to see that our kids get the best education as possible.”

With that dedication in mind, we look forward to making the next school year our best yet.

Latest evaluation results from Sierra Leone

[Chart: headline results from the Oxford study, 2016-17]

Rising students in Sierra Leone continue to progress in reading and maths at 2 to 3 times the rate of their peers in other schools, but given their starting point they need to be progressing even faster.

That's the headline finding from the latest annual report from our independent evaluators at Oxford University.

For a summary of what the report says and my reflections on it, read on. To skip the commentary and dive straight into the report itself, click here.


A team at Oxford University led by Dr David Johnson has been conducting a three-year impact evaluation of Rising's schools in Sierra Leone. The baseline report completed in early 2016 is here; the first annual report (completed in September 2016) is here. This is the team's second annual report covering the 2016-17 academic year.

The study's full set of outcome measures includes computer-adaptive tests of reading and maths, a measure of writing, and a measure of non-cognitive traits or 'learning dispositions'. Some of these measures will only be followed up at endline, however; the annual reports focus on the reading and maths measures.

The progress of Rising students on these measures is benchmarked against random samples of students from two types of schools: government schools and other private schools.

The first annual report found encouraging evidence that Rising students were making rapid learning gains compared to their peers in other schools. We were curious to see whether that trend would continue.

Key findings

Broadly speaking, the report finds that it has. To quote the conclusion:

“RAN schools seem to make better cumulative gains, learn at a faster rate, and weaker students make stronger transitions from poor performance to good performance bands when compared to matched student samples in private and government schools. It is also the case that learning outcomes are more equitable in RAN schools.”

Let's unpack each of those claims a bit.

First, Rising students are making more rapid progress than their peers in other schools. In reading, the cumulative gains represent about twice as much progress as students in comparison schools; in maths, about 2.4 times as much as students in other private schools and more than 3 times as much as students in government schools.

Second, these gains are broadly shared between boys and girls. Girls at Rising schools are progressing much faster than their male counterparts in comparison schools. In reading, Rising’s boys do continue to progress slightly faster than Rising’s girls, but in maths girls are closing the gap.

Third, the distribution of learning gains between stronger and weaker learners is also more equitable in Rising schools. At baseline, about 75% of students in all three school types were in the lowest performance band for reading. At Rising, that proportion had nearly halved, to 40%. But in the other private and government schools, the percentage of students transitioning out of the lowest performance band was much lower: 56% of private school students and 59% of government students were still in the lowest performance band.

All of these findings suggest that, regardless of which subject or which subsample of students you look at, the learning trajectories of Rising students are now significantly steeper than those of their peers in other schools. The World Bank has been promoting the idea of "equivalent years of schooling" as an intuitive way to understand programme effectiveness. On this basis, each year in a Rising school is, according to these data, worth an extra 1 to 2 years in another private or government school.
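The arithmetic behind that conversion is simple. Here is a minimal sketch, under the simplifying assumption that comparison-school students gain at a constant annual rate (the function name and inputs are illustrative, not from the report):

```python
def extra_equivalent_years(rising_gain, comparison_gain):
    """Extra 'equivalent years of schooling' per Rising year, assuming
    comparison-school learning gains accrue at a constant annual rate."""
    return rising_gain / comparison_gain - 1

# If Rising students gain 2x what comparison students gain in a year
# (as in reading), each Rising year is worth ~1 extra year elsewhere;
# at 3x (the upper end of the maths estimates), ~2 extra years.
print(extra_equivalent_years(2.0, 1.0))  # 1.0
print(extra_equivalent_years(3.0, 1.0))  # 2.0
```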

The challenge the study still points to, however, is whether we are bending these learning trajectories sharply enough, particularly in maths. The baseline showed that students are enrolling into secondary school with reading and maths skills 5 grades below their expected level - closer to a second grader. This report shows they are making significant progress, but they are still behind where they should be.

What the report doesn't or can't say (yet)

Some of the report's limitations are by design. It doesn't revisit the non-cognitive measures, so we'll have to wait until the final report to see what our impact has been there. Those measures are important because there's a common critique that prioritising the achievement of excellence in foundational skills must somehow come at the expense of a more holistic approach to students' development, including their socio-emotional learning. I think that reasoning is flawed - my hypothesis is that the schools that are good at socio-emotional learning are probably the schools that are good at other types of learning too - but we'll have to wait to see what the data say before we settle that argument.

Other limitations were not by design. In particular, the study team experienced a significant amount of sample attrition. This seems to be about student absence on testing days - the perils of trying to do follow-ups during Sierra Leone's rainy season - rather than student drop-out. Both would be challenges, but the latter would be more worrying and harder to fix in the final year of the study.

The concern is whether differential attrition rates change the composition of the sample of students being assessed. In particular, in government schools attrition seems to have been concentrated in the lowest achieving students. This makes it problematic to measure progress by comparing the average scores of all students who took a particular test: if the sample who showed up for the latest follow up test differ in meaningful ways from the sample who showed up for the baseline test, there is a risk of attributing a change in average scores to a change in what students know when in fact it is just a change in which students are tested. The report is therefore cautious about interpreting these data. (In fact, these changes in the composition of the sample don't seem to make much difference to the estimated effect for Rising or the other private schools, but they make a big difference for the government schools.)

The way round this, and the way to ensure the comparison over time is like-for-like, is to focus just on those students who were tested at both baseline and the latest follow-up (see Tables 6 and 7 in the report).
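To see why the like-for-like panel matters, consider a toy simulation (entirely hypothetical numbers, not the study's data) in which every student gains exactly 5 points, but weaker students are likelier to miss the follow-up test:

```python
import random

random.seed(0)

# Hypothetical: 1,000 students, and everyone gains exactly 5 points,
# so the true average gain is 5.
baseline = [random.gauss(50, 10) for _ in range(1000)]
followup = {i: score + 5 for i, score in enumerate(baseline)}

# Differential attrition: below-average students are likelier to miss
# the follow-up test.
tested = [i for i, score in enumerate(baseline)
          if not (score < 45 and random.random() < 0.5)]

# Comparing cross-sectional averages mixes two different samples...
naive_gain = (sum(followup[i] for i in tested) / len(tested)
              - sum(baseline) / len(baseline))

# ...whereas the panel of students tested at both waves recovers the truth.
panel_gain = sum(followup[i] - baseline[i] for i in tested) / len(tested)

print(round(panel_gain, 1))  # 5.0: the panel gets it right
print(naive_gain > 5)        # True: the naive estimate is inflated
```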

What we'd like to see

We've enjoyed a good dialogue with the evaluation team about their research. Here are three areas we'd like to see the next (and final) stage of the research look at.

The first is the use of criterion-referencing (that is, using an absolute benchmark of performance against a fixed standard or level of competency, rather than a relative benchmark of performance as compared to other students) for setting the performance bands and student growth targets. There is a perfectly reasonable theoretical justification for defining the thresholds of the different bands and the target learning trajectories in the way that the team have. As a way of measuring progress towards mastery, it makes sense. But by definition it isn't contextualised - geometry is geometry, whether you are doing it in Freetown or Finland. As a result, you end up with the situation that the vast majority (above 75%) of all students in the sample were in the lowest performance band at baseline, making a granular analysis of exactly which students are progressing quite difficult. Without losing the focus on what mastery requires in absolute terms, it would be helpful to see more analysis using relative not just absolute benchmarks.

The second is to look at what might be driving these results. A study on this scale is unlikely to be able to identify precisely which features of our model are having the biggest impact; just detecting whether there's been an impact at all is methodologically challenging enough. But it would be interesting if the study were also able to document differences in inputs or intermediate outputs, such as teacher quality, instructional time or other factors we might expect to be associated with higher quality learning. Finding differences in these areas would lend support to the idea that observed differences in outcomes are true effects, and might help to focus future evaluation efforts.

The third area is around socio-economic status. One of the big debates around private schools is the extent to which any performance advantage they enjoy over public schools is purely the result of differences in the students they serve. 'Consensus' is probably too strong a word, but I would say the balance of evidence in the literature is that SES accounts for some but not all of the private school performance advantage. In any case, it's a legitimate concern in interpreting these sorts of results.

There are a number of features of the study that help address this concern. The first is obviously that the comparison group includes other private schools, rather than just making a crude private vs public comparison. Second, the team purposively sampled schools in the same geographic areas as Rising schools. Fee levels and average primary school leaving exam grades were also compared to ensure that student populations were drawn from broadly similar demographics. Third, the study benefits from being focused on secondary school students. We know that SES impacts are felt very early: in work by Pauline Rose and her colleagues in the Young Lives study, literacy gaps between rich and poor students are already significant by age 8 in many countries. Students in this study were ~12-13 years old at baseline and had all completed six years of primary school. That there were no significant differences in baseline reading and maths abilities across students in the three types of schools does not rule out the possibility of an SES effect, but it does at least raise the question of why this SES effect should suddenly be kicking in now when it had not materialised in students' academic trajectories through primary.

But for all that, it would still be interesting to see the team include a larger set of SES controls.

What this report adds

These quibbles notwithstanding, we're grateful to the Oxford team for another illuminating annual report. Alongside other sources of evidence - encouraging recent public exam results in Sierra Leone, the midline results from the randomised controlled trial of Partnership Schools for Liberia - it adds to an increasingly compelling picture of the potential our model seems to have to transform the quality of education available to families in Sierra Leone. The teachers, school leaders and HQ staff that have made this possible should be very proud.

But "however well we do, we always strive to do better", as we like to say round here, so I know my team, like me, will be asking how we can do more and do better.

Two areas stand out: first, what's behind the slower rate of progress in maths, and what can we do about it? Relative progress compared to other schools was actually greater in maths than in reading, so is it just that the content is more challenging? Are there specific topics where students are struggling? This requires further investigation. Second, we are still fairly new and fairly small. In the academic year studied here, we had a total of 13 schools and 2,400 students across Sierra Leone and Liberia. This year we have 39 schools and more than 8,500 students. We need to demonstrate not just that our model can produce big impacts, but that it can continue to do so consistently over time and as we grow.

So, those are my takeaways: what are yours? Read the report for yourself and then leave a comment below, or Tweet us @pjskids or @risingacademies.

Rising students excel in first public exams


In Sierra Leone, we've just received our first ever set of public exam results and I'm really excited by what they show about the progress we're making. The headline is that after just two years with us, 99% of our Junior Secondary School students achieved the grades they need to go on to Senior Secondary School.

Here's the background.

Back in July/August 2017, the 97 students who made up our oldest cohort of Junior Secondary School (JSS) students sat the Basic Education Certificate Examination (BECE). (When we've opened schools we've typically only been enrolling one or two grades at a time, so in future years that cohort will be bigger as more students start to come through our system.) 

The BECE is an important staging post for students in Sierra Leone, marking the end of universal basic education and acting as a gateway into Senior Secondary School or technical and vocational study. Students sit papers in a number of core and elective subjects. Each paper is marked on a 1 to 7 scale with 1 being the highest. A subject pass requires a score of 6 or better. A student's overall result is based on their scores in six subjects: the four core subjects of English, Maths, Science and Social Studies, plus the best score from each of two sets of electives. To be awarded the BECE, students must achieve at least 4 subject passes, including at least one of either English or Maths. But to go on to Senior Secondary School, the Government of Sierra Leone has stipulated that students must achieve 5 subject passes, including at least one of English or Maths. (Those with only 4 passes are eligible to go on to technical and vocational study.)
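As a worked example, those scoring rules can be expressed as a short function. This is my sketch of the rules as described above, not an official implementation, and the subject names and scores in the example are illustrative:

```python
def bece_result(core, electives_a, electives_b):
    """Determine BECE outcomes from subject scores (1 = best, 7 = worst).

    core: dict of the four core subjects -> score
    electives_a, electives_b: dicts of scores for the two elective sets
    """
    # The overall result counts six subjects: the four core subjects
    # plus the best (lowest) score from each elective set.
    counted = dict(core)
    counted["Best elective A"] = min(electives_a.values())
    counted["Best elective B"] = min(electives_b.values())

    # A subject pass is a score of 6 or better.
    passes = sum(1 for score in counted.values() if score <= 6)
    passed_eng_or_maths = core["English"] <= 6 or core["Maths"] <= 6

    return {
        "passes": passes,
        "bece_awarded": passes >= 4 and passed_eng_or_maths,
        "sss_eligible": passes >= 5 and passed_eng_or_maths,
    }

# Illustrative student: 5 passes including English and Maths, so they
# are awarded the BECE and qualify for Senior Secondary School.
result = bece_result(
    core={"English": 3, "Maths": 5, "Science": 6, "Social Studies": 7},
    electives_a={"French": 4, "Business Studies": 6},
    electives_b={"Religious Education": 5},
)
print(result)
```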

I don't know what the national picture looks like for this year yet, and even finding recent historical benchmarks is quite hard (this World Bank report has numbers for 2000-2005, and this UNESCO report for 2005-2011). I'll have more to say about these results in due course once that national picture is clearer. But at first glance it seems our students have done remarkably well.

First and foremost, I want to say well done to every single one of our students for the hard work they put into preparing for these exams. It's hard to imagine just how disrupted Junior Secondary schooling has been for this cohort. They should have begun JSS in September 2014, just as the devastating Ebola epidemic forced all schools in Sierra Leone to remain closed. When they were finally able to start JSS in April 2015, they had to deal with a truncated academic calendar for JSS1 and JSS2, with six terms' worth of material compressed into four in order to make up for lost time. Everyone who sat this examination deserves credit for not letting that disruption get in the way of their determination to pursue their education.

Second, congratulations to all those who got their 5 passes, and commiserations to the one student who just missed out: I know that you did so by the narrowest of margins and despite passing both English and Maths. I hope that whatever our students thought when they first found out their results - whether they were thrilled or disappointed - they will remember the words of our school creed: however well we do, we always strive to do better. 

Third, that applies to Rising as an organisation too. These results are an important milestone for us. It's the first time any of our students have sat public exams and while we know from independent evaluations that our students seem to be making good progress compared to their peers in other schools, these exams are still an important test of whether we are delivering the quality education our students and their parents expect. I'm therefore delighted to see so many of our students do well. It's particularly encouraging given that these students had barely been with us two years when they took the exam, and given that their literacy and numeracy levels when they first enrolled were typically well below grade level. Nevertheless, I'm still looking forward to diving into this data to understand what we can be doing better. For instance, while pass rates are really important (especially for our students and their families), as a school network it's more informative for us to understand our "value-added": the extent to which students perform better at the BECE than you would predict given their incoming learning levels (as shown, for example, by their scores in the primary school leavers' exam, the NPSE). That's something we'd be very interested in looking at, but it's tricky to do with the data currently available. (Any researchers who want to help us out: call me!) There are also some interesting variations in performance across subjects that we need to understand and reflect on.

In short, lots to think about and work on, but a great piece of news with which to start 2018. 

A big congratulations again to our students, and a big thank you to all our teachers, school leaders, curriculum writers and the whole Rising team for the hard work they put into helping them achieve these great results.