Why do parents choose Rising?

New study finds high levels of parent satisfaction with Rising schools and strong belief in their quality.

The end of the school year is approaching, so our attention is turning to next year and how we recruit our next cohort of Rising students.

As parents consider the school options available to them for next year, what will go through their minds? What motivates some parents to choose Rising schools, while others consider our schools but ultimately go elsewhere?

Those were the questions we asked the Lean Data team at Acumen Fund to investigate for us. Acumen, a leading global impact investor, has spent the last four years developing an innovative, low-cost, fast-cycle approach to gathering customer insights in social impact organisations. Initially designed to help Acumen and its investees gather actionable data about the reach, impact and value of their activities, this “Lean Data” approach has subsequently been taken up by a growing number of other organisations.

As a methodology, Lean Data isn’t revolutionary, but it’s not meant to be. To me, its value lies in three things:

  • Providing standard question sets and modules that greatly simplify the process of questionnaire design
  • Challenging organisations to think about the merits of timely, good enough evidence over more rigorous but slower evidence
  • Increasingly, as more social impact organisations adopt these techniques and question sets, helping organisations benchmark themselves against their peers.

Back to this study. To help us understand how Rising is perceived by parents and what we might need to do to reinforce or challenge those perceptions, the Lean Data team conducted phone interviews with a sample of current Rising parents (‘Choosers’) and a sample of parents who were interested enough in Rising to want more information from one of our outreach team, but ultimately didn’t enroll their child (‘Non-choosers’).

As with many Lean Data engagements, the sample sizes are small because the focus was on speed - from inception to full results was less than 4 weeks - so from a strictly statistical point of view the results are suggestive not definitive. Then again, the goal here wasn’t to get an objective measure of our quality. We have our independent evaluations for that. What we wanted to know was how the progress we’re seeing in those evaluations is informing parent perceptions of our schools, if at all.

So what did they find? Here are a few of the highlights:

1. Parent satisfaction with our schools is very high. Our Net Promoter Score (NPS), a common measure of customer satisfaction derived by asking parents how likely they are to recommend us to friends or family, came out at 81 (NPS can range from -100 to +100). The average for the 100+ social impact organisations around the world which the Lean Data team have worked with so far is 40, and anything above 50 is considered very good. (By way of contrast, less than 10% of parents who had considered Rising but ultimately gone with a different school said they were very likely to recommend that school to others.)
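For anyone unfamiliar with how an NPS is derived, here is a minimal sketch. It uses the standard convention - respondents scoring 9 or 10 count as 'promoters', 0 to 6 as 'detractors' - and the ratings below are invented for illustration, not our actual survey data:

```python
# Hypothetical parent ratings on the standard 0-10 "likely to recommend" scale
ratings = [10, 9, 10, 8, 10, 9, 10, 10, 7, 9]

promoters = sum(1 for r in ratings if r >= 9)   # 9s and 10s
detractors = sum(1 for r in ratings if r <= 6)  # 0 through 6

# NPS = % promoters minus % detractors, so it can range from -100 to +100
nps = 100 * (promoters - detractors) / len(ratings)
print(round(nps))  # → 80
```

Scores of 7 and 8 ('passives') count towards the denominator but neither add to nor subtract from the score.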

2. Parents were overwhelmingly positive about the impact Rising has had on their lives and the lives of their children. 89% said their own quality of life had “very much improved” because of Rising. 90% said the quality of education their children were receiving had “very much improved” because of Rising.

3. When asked to describe how this impact showed up, parents spoke not just about seeing improvements in their children’s academic performance (particularly in English) but in their attitudes to school overall. 20% mentioned seeing improvements in children’s desire to learn. “They love to read at home, not like before,” said one parent. Others mentioned improvements in children’s confidence and academic self-esteem: “It makes me a proud parent that my daughter can now stand boldly in front of people”, as one put it. That’s encouraging and tallies with some of what we’ve heard from students in Oxford University’s impact evaluation of our work.

4. Rising’s brand is strongly associated with quality. The top two factors cited by parents who had chosen Rising were education quality (mentioned by 43%) and teacher quality (mentioned by 29%). Even among parents who ultimately chose not to send their child to us, 87% said Rising was better quality than the alternatives in their area.

There were a lot of other insights that we’ll be exploring further and factoring into our outreach for next year. For example, it seems that word-of-mouth referrals were a particularly important channel for hearing about Rising and something we need to make better use of. We also learned that for many ‘non-choosers’ the reason for not choosing Rising was reluctance to switch from a school their child was already attending. This suggests we need to do a better job of building long-term relationships ahead of key decision points. And while parents are impressed with what’s happening academically in our schools, they want us to go further on the extra-curricular side. All useful insights.

But beyond the specifics, the study was a reminder of how important it is that we keep seeking this kind of feedback from parents, not least because it’s so energising to hear them say in their own words what it is that they value about Rising. Here were three of my favourite quotes:

  • “Rising Academies is here for we the low income earners.”
  • “Seeing him bossing his cousins at home in maths makes me proud.”
  • “The one main reason why I will recommend Rising Academy is their dedication to see that our kids get the best education as possible.”

With that dedication in mind, we look forward to making the next school year our best yet.

Latest evaluation results from Sierra Leone

[Chart: Oxford study results, 2016-17]

Rising students in Sierra Leone continue to progress in reading and maths at 2 to 3 times the rate of their peers in other schools, but given their starting point they need to be progressing even faster.

That's the headline finding from the latest annual report from our independent evaluators at Oxford University.

For a summary of what the report says and my reflections on it, read on. To skip the commentary and dive straight into the report itself, click here.

Background

A team at Oxford University led by Dr David Johnson has been conducting a 3 year impact evaluation of Rising's schools in Sierra Leone. The baseline report completed in early 2016 is here; the first annual report (completed in September 2016) is here. This is the team's second annual report, covering the 2016-17 academic year.

The study's full set of outcome measures includes computer-adaptive tests of reading and maths, a measure of writing, and a measure of non-cognitive traits or 'learning dispositions'. Some of these measures will only be followed up at endline, however; the annual reports focus on the reading and maths measures.

The progress of Rising students on these measures is benchmarked against random samples of students from two types of schools: government schools and other private schools.

The first annual report found encouraging evidence that Rising students were making rapid learning gains compared to their peers in other schools. We were curious to see whether that trend would continue.

Key findings

Broadly speaking, the report finds that it has. To quote the conclusion:

“RAN schools seem to make better cumulative gains, learn at a faster rate, and weaker students make stronger transitions from poor performance to good performance bands when compared to matched student samples in private and government schools. It is also the case that learning outcomes are more equitable in RAN schools.”

Let's unpack each of those claims a bit.

First, Rising students are making more rapid progress than their peers in other schools. In reading, the cumulative gains represent about twice as much progress as students in comparison schools; in maths, about 2.4 times as much as students in other private schools and more than 3 times as much as students in government schools.

Second, these gains are broadly evenly distributed across boys and girls. Girls at Rising schools are progressing much faster than their male counterparts in comparison schools. In reading, Rising’s boys do continue to progress slightly faster than Rising’s girls, but in maths girls are closing the gap.

Third, the distribution of learning gains between stronger and weaker learners is also more equitable in Rising schools. At baseline, about 75% of students in all three school types were in the lowest performance band for reading. At Rising, that proportion had nearly halved, to 40%. In the other schools, far fewer students had transitioned out of the lowest band: 56% of private school students and 59% of government school students were still there.

All of these findings suggest that, regardless of which subject or which subsample of students you look at, the learning trajectories of Rising students are now significantly steeper than their peers in other schools. The World Bank has been promoting the idea of "equivalent years of schooling" as an intuitive way to understand programme effectiveness. On this basis, each year in a Rising school is, according to these data, worth an extra 1 to 2 years in another private or government school.
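To make the "equivalent years of schooling" arithmetic concrete, here is the simple conversion, using the relative rates of progress in maths quoted above:

```python
# Converting relative rates of progress into extra "equivalent years of
# schooling" per year. The rates are the maths figures quoted above.
rate_vs_private = 2.4   # Rising students' progress vs other private schools
rate_vs_govt = 3.0      # Rising students' progress vs government schools

# Progressing at k times the comparison rate means each year is worth
# (k - 1) extra years relative to that comparison group.
extra_vs_private = rate_vs_private - 1
extra_vs_govt = rate_vs_govt - 1
print(round(extra_vs_private, 1), round(extra_vs_govt, 1))  # → 1.4 2.0
```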

The challenge the study still points to, however, is whether we are bending these learning trajectories sharply enough, particularly in maths. The baseline showed that students are enrolling in secondary school with reading and maths skills 5 grades below their expected level - closer to those of a second grader. This report shows they are making significant progress, but they are still behind where they should be.

What the report doesn't or can't say (yet)

Some of the report's limitations are by design. It doesn't revisit the non-cognitive measures, so we'll have to wait until the final report to see what our impact has been there. Those measures are important because there's a common critique that prioritising the achievement of excellence in foundational skills must somehow come at the expense of a more holistic approach to students' development, including their socio-emotional learning. I think that reasoning is flawed - my hypothesis is that the schools that are good at socio-emotional learning are probably the schools that are good at other types of learning too - but we'll have to wait to see what the data say before we settle that argument.

Other limitations were not by design. In particular, the study team experienced a significant amount of sample attrition. This seems to be about student absence on testing days - the perils of trying to do follow-ups during Sierra Leone's rainy season - rather than student drop-out. Both would be challenges, but the latter would be more worrying and harder to fix in the final year of the study.

The concern is whether differential attrition rates change the composition of the sample of students being assessed. In particular, in government schools attrition seems to have been concentrated in the lowest achieving students. This makes it problematic to measure progress by comparing the average scores of all students who took a particular test: if the sample who showed up for the latest follow up test differ in meaningful ways from the sample who showed up for the baseline test, there is a risk of attributing a change in average scores to a change in what students know when in fact it is just a change in which students are tested. The report is therefore cautious about interpreting these data. (In fact, these changes in the composition of the sample don't seem to make much difference to the estimated effect for Rising or the other private schools, but they make a big difference for the government schools.)

The way round this, and to guarantee the comparison over time is like-for-like, is to focus just on those students who were tested at baseline and the latest follow-up (see Tables 6 and 7 in the report).
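In code terms, that fix is just an intersection: restrict both rounds to the students who appear in both before averaging. A toy sketch, with invented IDs and scores rather than the study's data:

```python
# Toy illustration of a "balanced panel" comparison under attrition.
# IDs and scores are invented, not the study's data.
baseline = {"s1": 300, "s2": 290, "s3": 350, "s4": 260, "s5": 320}
followup = {"s1": 360, "s2": 340, "s3": 410, "s5": 390}  # s4 missed testing

def mean(xs):
    return sum(xs) / len(xs)

# Naive comparison: averages over whoever happened to be tested each round
naive_gain = mean(followup.values()) - mean(baseline.values())

# Balanced panel: only students tested at BOTH baseline and follow-up
panel = set(baseline) & set(followup)
panel_gain = (mean([followup[s] for s in panel])
              - mean([baseline[s] for s in panel]))

print(naive_gain, panel_gain)  # → 71.0 60.0
```

Because the weakest student at baseline (s4) is missing from the follow-up round, the naive comparison overstates the average gain; the balanced panel removes that distortion.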

What we'd like to see

We've enjoyed a good dialogue with the evaluation team about their research. Here are three areas we'd like to see the next (and final) stage of the research look at.

The first is the use of criterion-referencing (that is, using an absolute benchmark of performance against a fixed standard or level of competency, rather than a relative benchmark of performance as compared to other students) for setting the performance bands and student growth targets. There is a perfectly reasonable theoretical justification for defining the thresholds of the different bands and the target learning trajectories in the way that the team have. As a way of measuring progress towards mastery, it makes sense. But by definition it isn't contextualised - geometry is geometry, whether you are doing it in Freetown or Finland. As a result, you end up with the situation that the vast majority (above 75%) of all students in the sample were in the lowest performance band at baseline, making a granular analysis of exactly which students are progressing quite difficult. Without losing the focus on what mastery requires in absolute terms, it would be helpful to see more analysis using relative not just absolute benchmarks.

The second is to look at what might be driving these results. A study on this scale is unlikely to be able to identify precisely which features of our model are having the biggest impact; just detecting whether there's been an impact at all is methodologically challenging enough. But it would be interesting if the study were also able to document differences in inputs or intermediate outputs, such as teacher quality, instructional time or other factors we might expect to be associated with higher quality learning. Finding differences in these areas would lend support to the idea that observed differences in outcomes are true effects, and might help to focus future evaluation efforts.

The third area is around socio-economic status. One of the big debates around private schools is the extent to which any performance advantage they enjoy over public schools is purely the result of differences in the students they serve. 'Consensus' is probably too strong a word, but I would say the balance of evidence in the literature is that SES accounts for some but not all of the private school performance advantage. In any case, it's a legitimate concern in interpreting these sorts of results.

There are a number of features of the study that help address this concern. The first is obviously that the comparison group includes other private schools, rather than just making a crude private vs public comparison. Second, the team purposively sampled schools in the same geographic areas as Rising schools. Fee levels and average primary school leaving exam grades were also compared to ensure that student populations were drawn from broadly similar demographics. Third, the study benefits from being focused on secondary school students. We know that SES impacts are felt very early: in work by Pauline Rose and her colleagues in the Young Lives study, literacy gaps between rich and poor students are already significant by age 8 in many countries. Students in this study were ~12-13 years old at baseline and had all completed six years of primary school. That there were no significant differences in baseline reading and maths abilities across students in the three types of schools does not rule out the possibility of an SES effect, but it does at least raise the question of why this SES effect should suddenly be kicking in now when it had not materialised in students' academic trajectories through primary.

But for all that, it would still be interesting to see the team include a larger set of SES controls.

What this report adds

These quibbles notwithstanding, we're grateful to the Oxford team for another illuminating annual report. Alongside other sources of evidence - encouraging recent public exam results in Sierra Leone, the midline results from the randomised controlled trial of Partnership Schools for Liberia - it adds to an increasingly compelling picture of the potential our model seems to have to transform the quality of education available to families in Sierra Leone. The teachers, school leaders and HQ staff that have made this possible should be very proud.

But "however well we do, we always strive to do better", as we like to say round here, so I know my team, like me, will be asking how we can do more and do better.

Two areas stand out: first, what's behind the slower rate of progress in maths, and what can we do about it? Relative progress compared to other schools was actually greater in maths than in reading, so is it just that the content is more challenging? Are there specific topics where students are struggling? This requires further investigation. Second, we are still fairly new and fairly small. In the academic year studied here, we had a total of 13 schools and 2,400 students across Sierra Leone and Liberia. This year we have 39 schools and more than 8,500 students. We need to demonstrate not just that our model can produce big impacts, but that it can continue to do so consistently over time and as we grow.

So, those are my takeaways: what are yours? Read the report for yourself and then leave a comment below, or Tweet us @pjskids or @risingacademies.

Rising students excel in first public exams


In Sierra Leone, we've just received our first ever set of public exam results and I'm really excited by what they show about the progress we're making. The headline is that after just two years with us, 99% of our Junior Secondary School students achieved the grades they need to go on to Senior Secondary School.

Here's the background.

Back in July/August 2017, the 97 students who made up our oldest cohort of Junior Secondary School (JSS) students sat the Basic Education Certificate Examination (BECE). (When we've opened schools we've typically only been enrolling one or two grades at a time, so in future years that cohort will be bigger as more students start to come through our system.) 

The BECE is an important staging post for students in Sierra Leone, marking the end of universal basic education and acting as a gateway into Senior Secondary School or technical and vocational study. Students sit papers in a number of core and elective subjects. Each paper is marked on a 1 to 7 scale with 1 being the highest. A subject pass requires a score of 6 or better. A student's overall result is based on their scores in six subjects: the four core subjects of English, Maths, Science and Social Studies, plus the best score from each of two sets of electives. To be awarded the BECE, students must achieve at least 4 subject passes, including at least one of either English or Maths. But to go on to Senior Secondary School, the Government of Sierra Leone has stipulated that students must achieve 5 subject passes, including at least one of English or Maths. (Those with only 4 passes are eligible to go on to technical and vocational study.)
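For the rules-minded, the pass logic can be written out explicitly. This is just my sketch of the rules as described above - the subject names and data structures are illustrative, not an official specification:

```python
# Sketch of the BECE pass rules: grades run 1 (best) to 7, a subject pass
# is a grade of 6 or better, and the overall result counts the four core
# subjects plus the best grade from each of the two elective sets.
def bece_result(core, electives_a, electives_b):
    counted = list(core.values()) + [min(electives_a), min(electives_b)]
    passes = sum(1 for g in counted if g <= 6)
    eng_or_maths = core["English"] <= 6 or core["Maths"] <= 6

    awarded_bece = passes >= 4 and eng_or_maths       # BECE awarded
    senior_secondary = passes >= 5 and eng_or_maths   # eligible for SSS
    return awarded_bece, senior_secondary

core = {"English": 3, "Maths": 5, "Science": 6, "Social Studies": 4}
result = bece_result(core, electives_a=[5, 7], electives_b=[6])
print(result)  # → (True, True)
```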

I don't know what the national picture looks like for this year yet, and even finding recent historical benchmarks is quite hard (this World Bank report has numbers for 2000-2005, and this UNESCO report for 2005-2011). I'll have more to say about these results in due course once that national picture is clearer. But at first glance it seems our students have done remarkably well.

First and foremost, I want to say well done to every single one of our students for the hard work they put into preparing for these exams. It's hard to imagine just how disrupted Junior Secondary schooling has been for this cohort. They should have begun JSS in September 2014, just as the devastating Ebola epidemic forced all schools in Sierra Leone to remain closed. When they were finally able to start JSS in April 2015, they had to deal with a truncated academic calendar for JSS1 and JSS2, with six terms' worth of material compressed into four in order to make up for lost time. Everyone who sat this examination deserves credit for not letting that disruption get in the way of their determination to pursue their education.

Second, congratulations to all those who got their 5 passes, and commiserations to the one student who just missed out: I know that you did so by the narrowest of margins and despite passing both English and Maths. I hope that whatever our students thought when they first found out their results - whether they were thrilled or disappointed - they will remember the words of our school creed: however well we do, we always strive to do better. 

Third, that applies to Rising as an organisation too. These results are an important milestone for us. It's the first time any of our students have sat public exams and while we know from independent evaluations that our students seem to be making good progress compared to their peers in other schools, these exams are still an important test of whether we are delivering the quality education our students and their parents expect. I'm therefore delighted to see so many of our students do well. It's particularly encouraging given that these students had barely been with us two years when they took the exam, and given that their literacy and numeracy levels when they first enrolled were typically well below grade level. Nevertheless, I'm still looking forward to diving into this data to understand what we can be doing better. For instance, while pass rates are really important (especially for our students and their families), as a school network it's more informative for us to understand our "value-added": the extent to which students perform better at the BECE than you would predict given their incoming learning levels (as shown, for example, by their scores in the primary school leavers' exam, the NPSE). That's something we'd be very interested in looking at, but it's tricky to do with the data currently available. (Any researchers who want to help us out: call me!) There are also some interesting variations in performance across subjects that we need to understand and reflect on.

In short, lots to think about and work on, but a great piece of news with which to start 2018. 

A big congratulations again to our students, and a big thank you to all our teachers, school leaders, curriculum writers and the whole Rising team for the hard work they put into helping them achieve these great results.

What global education can, and can't, learn from global health

What can global education learn from global health? That was the topic of a recent article on the Center for Education Innovations website (the content of which, I later discovered, was largely drawn from a roundtable in New York back in September with the same title). “I disagree with approximately all of this article,” I Tweeted snarkily after reading it. But Lee Crawfurd told me I couldn’t throw shade without explaining myself, so here goes.

First, the notion that “the sheer scale of the problem” somehow sets global education apart from global health (or indeed other sectors) strikes me as a strange starting point in thinking about what we can learn. For one thing, I’m not sure it’s possible to place completely different societal problems on some common scale of difficulty. But supposing we do agree to treat development as a morbid game of Top Trumps: what are the categories that make education a winner? That “57 million children around the world do not go to school” is indeed a tragedy; that in low income countries 1 in 14 still die before they are even old enough to go to school is too.

Second, the argument that learning is harder to measure than health outcomes, or can only be measured over decades, baffles me. We have some immensely reliable measures of children’s learning. They are called standardized tests. And before you say it, no I don’t think that everything that’s important about education can be captured in a standardized test. I do think that some important things can be captured in a standardized test, particularly where they are low stakes for the student and particularly where baseline levels of learning are extremely low, as they are in the countries worst affected by the global learning crisis. Nor is it impossible to see important changes in learning outcomes occur in relatively short periods of time. Take Partnership Schools for Liberia: whatever your views on the programme, it’s clear from the midline report of the RCT that it achieved significant gains in learning outcomes in a single academic year. Indeed, if you’re nerdy enough to read the fine print of the midline report, you'll see that the evaluators found some learning gains could already be detected in the early weeks of the programme.* As Lant Pritchett has argued, development is a complex modernization process, some features of which – like good governance and pluralism and human rights – really are hard to measure precisely. Education is not one of them.

While we’re on assessment, let’s dispense with two of the other arguments offered here: first, the idea that assessment methods in well-developed education systems inhibit the development of higher-order thinking skills is debatable, but I’m happy to concede there’s a discussion to be had there. But when it comes to the countries bearing the brunt of the global learning crisis, I’ve seen no evidence to suggest that their assessment systems – as problematic as they may be in some instances – are really the binding constraint to addressing the problem of shockingly low learning levels. If you don’t believe me, read Tessa Bold and co’s brilliant study of teacher quality and behaviour across seven countries covering 40% of Africa’s school age population and ask whether any of those findings would be ameliorated by a different assessment system. Second, can we please stop saying that the problem is “we’re teaching kids what was useful 100 years ago”? The problem is too often we’re not teaching kids what was useful 100 years ago, and would still be useful to them today, and for that matter will still be useful in 25 years even when we’re all out of a job and our snarky blogposts are being automatically generated by backflipping AI robots.

Third, I’m sceptical that the issue is a lack of awareness about “best practices” on the part of policy-makers or local implementers. I come back to Tessa Bold’s work: the puzzle is not that policy-makers are doing the ‘wrong’ things so much as that they are doing the ‘right’ things and finding that they don’t work:

"Over the last 15 years, more than 200 randomized controlled trials have been conducted in the area of education. However, the literature has yet to converge to a consensus among researchers about the most effective ways to increase the quality of primary education, as recent systematic reviews demonstrate. In particular, our findings help in understanding both the effect sizes of interventions shown to raise students’ test scores, and the reasons why some well-intentioned policy experiments have not significantly impacted learning outcomes. At the core is the interdependence between teacher effort, ability, and skills in generating high quality education."

Fourth, I don’t disagree that there are stark differences in the ‘ecosystems’ around global education and global health, though perhaps these differences – as the UK government’s Multilateral Aid Review ratings seem to suggest – are more about quality than quantity.  But the word ‘ecosystem’ suggests a degree of harmony and coherence that masks very real strategic tensions and debates within global health – in particular, between the ‘vertical’ funders like GAVI and the Global Fund and more traditional actors focusing on ‘horizontal’ system-strengthening work. To crudely caricature a big and (very) long-running debate, the narrow focus of the vertical funders and their prioritisation of a few specific capabilities over long-term institution building seem to have reduced the variability of their programming, allowing them to more consistently deliver on their objectives. Critics counter that this has come at a big cost: in the long-term, making countries dependent on constant injections of outside cash and capability rather than building permanent, high-performing institutions that can raise health outcomes for all; in the short-term, neglecting diseases that fall outside their focus area, even if they have a dramatic impact on public health. To take an extreme example, there is no Global Fund for Preventing Road Traffic Accidents, even though in many countries they now kill more people than malaria.

What’s interesting is that the education ecosystem has put its eggs fully in the “systems-strengthening” basket. The model of the Global Partnership for Education, for example, is essentially that developing countries develop an approved sector plan and GPE funds it, in line with development effectiveness principles like 'country ownership'. Of course, international donors also do their own programming, some of it working more directly through government systems and some of it less. But none of this is on anything like the scale of the global health verticals. Can you imagine, for instance, a Global Fund for Early Years Education that took the same mass-scale, vertically integrated approach to fixing the problem of poor quality/non-existent early years’ provision that GAVI has for vaccines? In other words, differences in ecosystems are also differences about strategy, and the question is whether some of the strategies pursued by the global health community are actually available to the global education community.

Finally, there’s the money question. Is under-investment in education a barrier? 100% yes. Is it the barrier? The evidence says probably not: as the WDR notes, "the relationship between spending and learning outcomes is often weak."

More to the point, no one in this debate ever seems to ask why the money isn't flowing in global education the way it has in global health. To invest in raising the quality of education, whether as a domestic policy-maker or an international donor, you presumably have to believe three things: that raising the quality of education is important; that there is a reasonable chance your investment will yield the promised benefits; and that the benefits of the investment outweigh its costs (financial, political or otherwise).

To scan the official reports and Twitter feeds of some of the biggest and most influential players in the education ‘ecosystem’, you’d think that the first of these three was all that mattered. They are littered with factoids extolling the benefits of education: for the earnings of the educated, for GDP, for health, for women's reproductive rights, for the environment, and so on. The implication of this messaging is that policy-makers and donors don’t yet understand the returns to education. Really? An alternative analysis would be that policy-makers do understand the returns to education, but believe (rightly or wrongly) that the true, risk-adjusted returns are much lower. Perhaps they aren’t confident that any of the policies or programmes at their disposal will actually yield the results they want given the messy reality of implementation. Given the litany of seemingly promising policy interventions found not to work, or found to work but not to scale, that would not be an altogether unreasonable conclusion. Or perhaps they eschew reforms that genuinely would make a difference because they carry an unacceptable political cost. Either way, messages that don't address these types of concerns are likely to fall on deaf ears.

Which brings me to the 'p' word. What’s missing from this article is a proper discussion of politics. As the team at RISE have long argued, and the recent WDR has amplified, the underlying cause of the global learning crisis is that the way education systems are organized, funded and incentivized is not necessarily designed to lead to sustained improvements in the quality of learning (as opposed to, say, the quantity of schooling) - and it's often politics that keeps it that way.

To me, the most interesting question for global education folk to explore with their colleagues in global health is therefore the politics of reform: the reform coalitions and institutional strategies that have enabled such impressive progress in some areas; the reasons why these strategies proved successful in overcoming particular institutional challenges or binding in vested interests or circumventing potential veto players; but also the limits of these strategies to deliver change in other areas where a different political calculus held. 

I’m all in favour of seeing what can be learned from other sectors, but as a wise man once said: if you miss the politics, you miss the point.

 

* See Romero, Sandefur and Sandholz (2017), p 24: "Students in treatment schools score higher at baseline than those in control schools by .076σ in math (p-value=.077) and .091σ in English (p-value=.049). There is some evidence that this imbalance is not simply due to “chance bias” in randomization, but rather a treatment effect that materialized in the weeks between the beginning of the school year and the baseline survey."

10 Reflections on the WDR

Last week, the World Bank published its latest World Development Report (WDR), the first dedicated exclusively to the topic of education. Learning to Realize Education's Promise may not be the punchiest title I've ever heard, but it's a really important piece of work.

[Image: WDR 2018 front cover]

"Schooling is not the same as learning.

Education is an imprecise word, and so it must be clearly defined. Schooling is the time a student spends in classrooms, whereas learning is the outcome—what the student takes away from schooling. This distinction is crucial: around the world, many students learn little."

Here are 10 high level reflections:

1. Not new, but definitive. The very first sentence in the report is "Schooling is not the same as learning". This is not a new claim. Lant Pritchett literally wrote the book on this a couple of years ago; Pauline Rose took to Twitter to express her exasperation that anyone could ever have thought otherwise. But repetition is an under-appreciated tool in good communications, and often "about the time you get tired of saying it, they are just starting to hear it." In short, I don't expect the value of this report to be in its novelty but in its definitiveness. There's not much in here that hasn't been covered in some RISE paper or other. But the evidence is so exhaustive it should make the learning crisis the point of departure for every conversation about global education.

2. o-LAY. That said, the report does include a couple of neat concepts I hadn't come across before. One that stood out was the Learning Adjusted Years of Schooling (LAYS). Borrowing (presumably) from the concept of Disability Adjusted Life Years (DALYs) in health, which modify the simple measurement of life expectancy in years by how healthy those years actually are, LAYS account for the fact that the productivity of a year of schooling when it comes to actual learning varies wildly between countries and time periods. Since "years of schooling" regrettably remains such a common metric, this feels like a helpful contribution.
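As I understand it, the intuition behind LAYS can be sketched roughly as follows. The numbers and the benchmark score below are entirely illustrative, not the WDR's actual methodology or data:

```python
# Rough sketch of the Learning-Adjusted Years of Schooling (LAYS) idea.
# All figures are invented for illustration.
def lays(expected_years_of_schooling: float,
         avg_test_score: float,
         benchmark_score: float) -> float:
    """Discount years of schooling by how much learning each year
    actually produces, relative to a high-performing benchmark."""
    learning_ratio = avg_test_score / benchmark_score
    return expected_years_of_schooling * learning_ratio

# Two hypothetical countries with identical years of schooling
# but very different learning productivity per year:
print(lays(10, 400, 625))  # ~6.4 learning-adjusted years
print(lays(10, 575, 625))  # ~9.2 learning-adjusted years
```

The point of the adjustment is visible immediately: two systems that look identical on the "years of schooling" metric can be years apart once learning is taken into account.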

3. Improving education isn't easy but it is simple. I've written elsewhere that the barriers to improving education are not, in themselves, that complicated. The report does a nice job of providing a framework laying out the 'proximate causes' of poor quality learning: unprepared learners, unskilled or unmotivated teachers, weak school governance and management, and misdirected inputs and resources. And, the report notes, we actually know a lot more about addressing some of these issues than we used to thanks to an explosion in the number of high quality impact evaluations. The problem - the not-easy part - is that these proximate causes persist because of deeper, more political challenges.

4. If you miss the politics, you miss the point. This will not be news to my former colleagues, but the report helpfully underlines the importance of understanding the political factors that allow the learning crisis to go unaddressed. Some of these are self-evidently malign things like corrupt practices diverting resources from where they are needed, or patronage allowing too many of the wrong people to end up in vital jobs. But the report also points to some of the less obvious things, like the fact that learning is just harder to 'see' than student enrolment, teacher hiring or other potential areas of focus for education policy-makers (echoes here of James Scott's Seeing Like A State).

5. What gets measured, gets managed. Or does it? This need to make the learning crisis more visible motivates the authors to call for a big push on assessing learning. Ideally, this would involve a global learning metric, an idea that seems obviously sensible and relatively straightforward to me, but which is universally regarded by those more knowledgeable than I am to be a diplomatic conundrum more complex to resolve than the Schleswig-Holstein Question. Absent a global metric, the authors suggest, more investment in national learning assessments would still be better than nothing.

I'm conflicted on this point. On the one hand, investing in better data seems an absolute no brainer; the first step to recovery is admitting you have a problem and all that. On the other hand, it's clear that even in countries where citizen-led learning assessments like ASER and UWEZO have taken root or where national Ministries have signed up to be part of regional exercises like PASEC, better data has not necessarily been the burning platform for which some might have hoped. It's hard not to conclude that you can equip Ministers with better data on the problem and more robust evidence on the types of policies that will and won't make a difference, but in the end whether anything changes comes down to finding reform-minded leaders with political courage, like Liberia's George K. Werner.

6. The power of And. I was pleased to see that the report tackled head-on the suggestion that more rigorous assessment of student learning necessarily involves a narrowing of focus to the exclusion of other things we might care about fostering in our young people, from character traits like resilience to the fuzzy but nevertheless crucial "21st century skills". To the extent that this argument has any merit in an OECD context (and I'm not sure that it does), it seems absurd given the scale of the quality crisis in the developing world and how intimately linked better teaching of the basics and improvements in some of these other areas are going to be. As the authors note: 

"Conditions that allow children to spend two or three years in school without learning to read a single word or to reach the end of primary school without learning two-digit subtraction are not conducive to reaching the higher goals of education."

The bottom line is that good schools can do both (one reason I'm glad our independent evaluators at Oxford are looking at both the cognitive and non-cognitive development of our students).

7. Forget about the price tag. The Report has been criticised from some quarters for saying too little about money, particularly when the ink has barely dried on Gordon Brown's Education Commission report calling for billions more each year to be funnelled into global education. Of course, the criticism of that report was precisely the mirror image of this: it provided highly detailed costings based on a series of assumptions about what will deliver quality education that had little basis in rigorous evidence. More generally, one of the problems with the discussion about resourcing is that more money is almost certainly both an input to and an output of more effective education reforms: if it were clearer that investments were delivering results, more money would flow into them.

8. Two sides of the same coin? Another criticism of the report is that it has relatively little to say about access, even though millions remain out of school and in some countries school enrolments appear to be dropping not rising. That said, the debate on access sometimes comes close to slipping into a kind of sequentialism - let's fix access, then worry about quality - and the report helpfully points out that they have to be addressed together (even if by different means). If students are not learning or are being asked to repeat grades, their (and their family's) motivation to stay in school falls.

9. Who benefits? In its discussion of fairness and equity, the report mostly focuses on within-country inequality, and the large gaps in access and achievement facing disadvantaged groups. Addressing these is clearly important, but I've noted elsewhere my concern that the lack of proper data on where most learners in developing countries sit in relation to a global 'cognitive poverty line' (analogous to the $1.90 a day global income poverty line) makes it easy to under-value the importance of improving outcomes for the millions of children who may be among the educationally better off in their own countries, but in global terms remain among the most disadvantaged in the world. One other comment on equity: the report usefully points out that fairness is not just about rich and poor students but about good and bad schools. An arresting statistic cited in the report is that in one study in Pakistan, the achievement gap on an English test between students in good and bad schools was 24 times bigger than that between richer and poorer students, even after controlling for student characteristics.

10. Private: no panacea? The report strikes a surprisingly cautious note on the potential contribution of private schools. Surprising in part because I had been reliably informed that the World Bank was secretly a vast conspiracy to push the privatization agenda of its paymasters in Big Edu(TM), but more because this seems to be one area where the Report seems to depart from what the evidence actually says. For example, the Report claims "there is no consistent evidence that private schools deliver better learning outcomes than public schools" and that such evidence as exists "may conflate the effects of private schools themselves with the effects of the type of students who enroll in private schools." Far be it from me to question the authors' interpretation of the literature (he says, preparing to do precisely that) but on the first claim it would seem that there is at least moderate evidence that private schools out-perform public schools, and that this performance advantage is mediated but not wholly eliminated when you control for observable student characteristics. But anyway, this minor quibble just goes to show that those of us who believe there is a complementary role for non-state school operators need to do a better job of building our evidence base. And the central claim of this part of the report - that "overseeing private schools may be no easier than providing quality schooling" - speaks to the fact that whether as a partner in initiatives like Partnership Schools for Liberia, or just as a regulator of private schools, we are talking about government as an enabling state, not a smaller state.

Positive early gains for Partnership Schools and Rising

‘Gold standard’ evaluation finds positive early gains for Partnership Schools and for Rising.

The evaluation team behind the Randomised Controlled Trial (RCT) of Partnership Schools for Liberia (PSL - okay, that’s enough three letter abbreviations) has just released their midline report. The report covers just the first year of PSL (September 2016-July 2017). A final, endline report covering the full three years of the PSL pilot is due in 2019.

While much anticipated, this is only a midline report with preliminary results from one year of a three year programme. The report therefore strikes a cautious tone and the evaluation team are careful to caveat their results. 

Nevertheless, there are important and encouraging early messages for PSL as a whole and for Rising in particular. Put simply, the PSL programme is delivering significant learning gains, and Rising seems to be delivering among the largest gains of any school operator.

For PSL as a whole, the headline result is that PSL schools improved learning outcomes by 60% more than in control schools, or put differently, the equivalent of 0.6 extra years of schooling.

These gains seem to be driven by better management of schools by PSL operators, with longer school days, closer supervision of staff and more on-task teaching resulting in pupils in PSL schools getting about twice as much instructional time as in control schools. PSL schools also benefited from having more money and having better quality teachers, particularly new graduates from the Rural Teacher Training Institutes. But the report is clear that, based on their data and the wider literature, it is the interaction of these additional resources and better management that makes the difference; more resources alone is not enough. (Anecdotally, I would add that our ability to attract these new teachers was at least in part because they had more confidence in how they would be managed, which illustrates the point that new resources and different management are not easily separated.)  

Rising Results

The report also looks at how performance varies across the 8 operators that are part of PSL. Even more than the overall findings, the discussion of operator performance is limited by the small samples of students the evaluation team drew from each school. For operators (like Rising) running only a small number of schools, this means there is considerable uncertainty around the evaluators’ estimates. That said, the evaluation team do their best to offer some insights.

Their core estimate is that, compared to its control schools, Rising improved learning outcomes by 0.41 standard deviations or around 1.3 additional years of schooling. This is the highest of any of the operators, though it is important to note the overlapping confidence intervals between several of the higher performing providers.

[Chart: intent-to-treat (ITT) estimates by operator]

However, this core estimate is what’s known as an “intent-to-treat” or ITT estimate. It is based on the 5 schools that were originally randomly assigned to Rising. But we only actually ended up working in 4 of those (* see below). The ITT estimate is therefore a composite of results from 4 schools that we operated and 1 school that we never set foot in. A better estimate of our true impact is arguably offered by looking at our impact just on those students in schools we actually ended up working in. This “treatment on the treated” or TOT estimate is considerably higher, with a treatment effect of 0.57 standard deviations or 1.8 extra years of schooling. This, again, is the highest of any operator, and by a considerably larger margin, though again the confidence intervals around the estimate are large.

[Chart: treatment-on-the-treated (ToT) estimates by operator]

Whether the ITT or TOT estimate is the more useful depends, in my view, on the policy question you are trying to answer.  At the level of the programme as a whole, where the policy question is essentially "what will the overall effect of a programme like this be?”,  the ITT estimate seems the more useful because it is fair to assume that some level of ’non-compliance’ will occur and the programme won’t get implemented in all schools. But at the inter-operator level, where the salient policy question is “given that this is going to be a PSL school, what will be the impact of giving this school to operator X rather than operator Y?”, the TOT estimate seems more informative because it is based solely on results in schools where those operators were actually working. 
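To make the mechanics concrete, here is a toy sketch of how non-compliance dilutes an ITT estimate and how the TOT estimate rescales it. All numbers are invented for illustration (they are not the report's figures), and the evaluators' actual estimators are regression-based rather than simple averages:

```python
# Toy illustration of ITT vs TOT with made-up per-school effects.
# Assumes, in the simplest case, a zero treatment effect in the one
# assigned school the operator never actually ran.
assigned_schools = 5
operated_schools = 4
compliance = operated_schools / assigned_schools  # 0.8

# Hypothetical true effect of 0.5 SD in each of the 4 operated schools,
# and 0.0 SD in the school that was assigned but never operated:
effects = [0.5, 0.5, 0.5, 0.5, 0.0]

itt = sum(effects) / len(effects)  # average over ALL assigned schools
tot = itt / compliance             # rescaled to the schools actually run
print(itt, tot)                    # 0.4 vs 0.5
```

The toy version shows why the TOT estimate is mechanically larger whenever compliance is below 100% and the non-operated school contributes nothing: the ITT averages the effect over schools the operator never touched.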

A further complication in comparing across operators is that operators have different sample sizes, pulled from different populations of students across different geographical areas. It cannot be assumed that we are comparing like with like. To correct for this, the evaluators control for observable differences in school and student characteristics (e.g. by using proxies for their income status, geographic remoteness etc), but they also use a fancy statistical technique called 'Bayesian hierarchical modelling'. Essentially, this assumes that because we are part of the same programme in the same country, operator effects are likely to be correlated. It therefore dilutes the experimental estimate for Rising by making it a weighted average of Rising's actual performance and the average performance of all the operators. It turns out that adjusting for baseline characteristics doesn’t make too much difference (particularly for Rising, since our schools were more typical), but this Bayesian adjustment does. It drags Rising back towards the mean for all operators, and the amount we are dragged down is larger because our sample size is smaller. We still end up with the first or second largest effect depending on which of the ITT or TOT estimate is used, but by design we are closer to the rest of the pack.
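For the nerdier readers, the shrinkage at the heart of that adjustment can be sketched in a few lines. This is a deliberately simplified toy with invented numbers; the evaluators' actual model is more sophisticated, but the qualitative behaviour - noisier estimates get pulled harder towards the group mean - is the same:

```python
# Toy sketch of partial-pooling shrinkage in a hierarchical model.
# The pooled estimate is a precision-weighted average of the operator's
# own estimate and the grand mean across operators; noisier (smaller-
# sample) estimates get pulled further towards the mean.
def shrink(operator_est, operator_se, grand_mean, between_sd):
    w = between_sd**2 / (between_sd**2 + operator_se**2)
    return w * operator_est + (1 - w) * grand_mean

grand_mean = 0.18  # hypothetical average effect across all operators
between_sd = 0.10  # hypothetical spread of true operator effects

# A small operator with a big but noisy estimate is shrunk a lot...
print(shrink(0.41, 0.15, grand_mean, between_sd))  # ~0.25
# ...while a big operator with a precise estimate barely moves.
print(shrink(0.41, 0.03, grand_mean, between_sd))  # ~0.39
```

The weight `w` is just the share of an estimate's variance attributable to genuine differences between operators rather than sampling noise, which is why sample size drives how far an operator is dragged towards the pack.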

Some reflections on the results

So what do we make of these results?

First of all, we are strongly committed to the highest levels of rigour and transparency about our impact. We had thought that the study wouldn’t be able to say anything specific about Rising at all for technical reasons to do with the research design (for nerdier readers: it was originally designed to detect differences between PSL schools and non-PSL schools, and was under-powered to detect differences among operators within PSL). We're glad the evaluation team were able to find some ways to overcome those limitations.

Second, it is interesting and encouraging that the results largely confirm the strong progress we had been seeing in our internal data. Those data looked promising, but absent a control group to provide a robust counterfactual, it was impossible to know for sure that the progress we were seeing was directly attributable to us. As we said at the time and as the evaluation team note in an appendix to this report, our internal data were for internal management purposes and were never meant to have the same rigour as the RCT. But as it turns out, our internal data and the RCT data are pretty consistent. Our internal data suggested that students had made approximately 3 grades' worth of progress in one academic year; the TOT estimate in the RCT is that they had made approximately 2.8 grades’ worth of progress in one academic year. Needless to say, knowing that we can have a good amount of conviction in what our internal data are telling us is very important from a management point of view.

Third, while making direct comparisons between operators is tricky for the reasons noted above, on any reasonable reading of this evidence Rising emerges as one of the stronger operators, and this result validates the decision by the Ministry of Education to allocate 24 new schools to Rising in Year 2. In both absolute and relative terms, this was one of the larger school allocations and reflected the Ministry’s view that Rising was one of the highest performing PSL operators in Year 1. It is good - not just for us but for the principle of accountability underlying the PSL programme as a whole - that the RCT data confirm the MoE’s positive assessment of Rising’s performance.

Taking up the challenge

I also want to be very clear about the limitations of the data at this stage. It is not just that it’s very early to be saying anything definitive. It’s also that these data do not yet allow Rising, or really any operator, to fully address two of the big challenges that have been posed by critics of PSL.

The first challenge is around cost. As the evaluators point out, different operators spent different amounts of money in Year 1, and all spent more money than would be typically made available in a government school. In the end, judgments about the success of PSL or individual operators within it will need to include some assessment not just of impact but of value for money. PSL can only be fully scaled if it can be shown to be effective and affordable. Rising was one of those operators whose unit costs were relatively high in Year 1. That’s because a big part of our costs is the people and the systems in our central team and with just 5 schools in year 1, we had few economies of scale. These costs should fall precipitously once they start to be shared over a much larger number of schools and students. But that’s a testable hypothesis on which the Ministry can hold us to account. In Year 2, we need to prove to them that we can deliver the same or better results at a significantly lower cost per student.
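That scale argument is easy to see on the back of an envelope. The cost figures below are entirely made up for illustration; only the shape of the curve is the point:

```python
# Back-of-envelope sketch of why per-student cost should fall as a
# fixed central team is shared across more schools. All figures are
# hypothetical, purely to illustrate the economies-of-scale logic.
def cost_per_student(fixed_central_cost, variable_cost_per_school,
                     schools, students_per_school):
    total = fixed_central_cost + variable_cost_per_school * schools
    return total / (schools * students_per_school)

# Same hypothetical cost structure at 5 schools vs 29 schools:
print(cost_per_student(500_000, 20_000, 5, 300))   # 400.0 per student
print(cost_per_student(500_000, 20_000, 29, 300))  # ~124 per student
```

The fixed central-team cost dominates at 5 schools and is diluted at 29, which is exactly the testable hypothesis the Ministry can hold us to.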

The second challenge is around representativeness. One criticism that has been aired is that Year 1 schools were the low hanging fruit. As the evaluation makes clear, it is simply not true that Year 1 schools were somehow cushy, but it is true that Year 1 schools were generally in easier to serve, somewhat less disadvantaged communities than the median Liberian school. And that’s precisely why the Ministry of Education insisted that the schools we and other operators will be serving in Year 2 be disproportionately located in the South East of Liberia, where those concerns about unrepresentativeness do not apply. If we can continue to perform well in these more challenging contexts, it will go some way to answering the question of whether PSL can genuinely become part of the solution for the whole of Liberia.

In short, the RCT midline provides a welcome confirmation of what our own data were telling us about the positive impact we are having. Our task for the coming academic year is to show that we can sustain and deepen that impact, in more challenging contexts, and more cost effectively. A big task, but one that we are hugely excited and honoured to be taking on.

A little over a year ago, Education Minister George Werner showed a great deal of political courage not just in launching this programme but in insisting that it be the subject of a ‘gold standard’ experimental evaluation. One year on, and these results show that his vision and conviction is beginning to pay dividends. This report is not the final word on PSL, but the next chapter promises to be even more exciting.

 

* Footnote: as the evaluators note in their report, the process of randomly assigning schools in summer 2016 was complex, made even more challenging by the huge number of moving pieces for both operators and the Government of Liberia as both endeavoured to meet incredibly tight timescales for opening schools on September 5th. Provisional school allocations changed several times; by August 14th, three weeks before school opening, we still did not know the identity of our fifth school and it was proving very difficult to find a pair of schools near enough to our other schools to be logistically viable. Faced with the choice of dragging the process out any longer and potentially imperilling operational performance or opting to run a fifth school that was not randomly assigned, we agreed with the Ministry on the latter course of action. 

Expanding our network in Liberia

Photo credit: Kyle Weaver

We're proud to announce that the Ministry of Education has invited Rising Academies to significantly expand its school network in Liberia. From September 2017, Rising Academies will be operating 29 government schools across 7 counties.

The move comes as part of an expansion of the Partnership Schools for Liberia (PSL) program. To learn more about PSL, click here. The Ministry’s decision to award Rising more schools in the second year of PSL followed an in-depth review and screening process, including unannounced spot checks of PSL schools. Rising Academies was one of three providers to be awarded the top “A” rating for its strong performance in Year 1.

We're really proud of the progress our schools have made this year. If you want to learn more about how we've been rigorously tracking this progress and using data to inform our approach, check out our interim progress report here.

Our press release on the announcement of the Ministry's Year 2 plans is available here.

Benjamin's Story

Last week at the Investing in Education for the Future conference in Monrovia, Liberia, I had the opportunity to speak about our work under the Partnership Schools for Liberia initiative.

Benjamin Clarke. Principal, Sumo Town Public School

I chose to focus on the story of Benjamin Clarke (pictured right), the inspiring Principal of our school in Sumo Town, to illustrate the impact that PSL is having.

Here was the key bit of the speech:

When Benjamin took over as Principal in 2014 he inherited a school with just 1 payroll teacher and a part-time volunteer who could barely read. Even on a good day he had to teach 4 grades himself, as well as do all the admin, and the bad days outweighed the good. If he was sick or called away to a meeting, the school just didn’t operate.

Today, thanks to Partnership Schools, Benjamin has a qualified teacher in every classroom. He has a Master Teacher trained to observe lessons and give his teachers real-time feedback. His staff receive daily lesson plans with a focus on phonics and numeracy, and are trained in simple techniques that help them deliver more engaging lessons. And instead of being monitored almost never, he sees our team every week, and takes pride in showing them what is happening under his leadership.

I asked Benjamin if I could share his story with you tonight because I think it’s a reminder that for Partnership Schools to succeed it doesn't just need to correct the weaknesses in the Liberian education system, it needs to build on its strengths.

Check out the video for the full speech (5 mins).

Benjamin himself came along to the final day of the conference to share his experience of Partnership Schools with the delegates directly.

Quick reactions to The Learning Generation

The Learning Generation – the Report of The International Commission on Financing Global Education Opportunity chaired by Gordon Brown – was published last week. When the Commission was announced amid much fanfare last year, I was sceptical, but I have to say I found the final report a lot more encouraging than I expected.

The first big thing the Commission gets right is to continue what CGD calls the pivot to quality. Words matter, and the whole language of a ‘learning generation’ underscores that the focus needs to move beyond getting kids into school and towards the question of what happens when they get there – where right now, the answer is not enough.

The second big thing the Commission gets right is to recognise that the primary responsibility for financing public education lies with domestic governments, though of course international actors must play a role.

The third big thing the Commission gets right, though I would have liked to see it go further, is to acknowledge that while government has ultimate responsibility for guaranteeing access and regulating standards, when it comes to delivery it should be looking to partner with and learn from the best of the non-state sector (including private schools) to achieve its goals, through things like public-private partnerships.

The fourth big thing the Commission gets right is to link investment and reform, and to posit a virtuous circle between the two “in which investment in education leads to reform and results, and reform and results lead to new investment”. Gordon Brown’s influence is particularly visible here: money combined with reform was his mantra when, as Britain’s finance minister, he significantly increased government spending on public services.

This is an important rhetorical shift: too much of the education debate continues to suggest that money by itself is the answer, as evidenced by the plaintive cries for "a Bill Gates for education”. As the Commission notes, there is no evidence to suggest that simply spending more money would help (e.g. in India private schools achieve learning gains that are the same or better than government schools at a third of the unit cost). No one is suggesting that money doesn’t matter, but it doesn’t appear to be a driving force so much as a force multiplier: where the system is configured to deliver improvements in learning, more money helps; where it isn’t, it doesn’t.

The real test of the Commission’s impact will be whether the new rhetoric signals a more fundamental shift in the international education community’s theory of change. At the moment, there is a tendency for the official voices of that community to squander their considerable profile and platform on platitudes about how important education is. The implicit assumption seems to be that if only people could understand the importance of education, they would open their wallets and all would be well. By this logic, appointing Rihanna as a Global Ambassador for education makes sense because she has the star power to get out the message that education matters. 

But if Rihanna is the answer, what is the question? I totally buy Jamie Drummond’s argument that there is a place for pop culture in advocacy. But as he points out, celebrity is not enough without the right policy and the right political analysis.  Find me a single politician – actually forget that, find me a single person – who says education doesn’t matter. Ignorance of education’s importance is not a convincing explanation for why so many education systems are failing children on a truly industrial scale.

Much more convincing is the emerging work of the RISE programme that locates the problem in the politics of education reform. On this view, school systems fail because prevailing political incentives reward policy-making that won’t improve learning while punishing or at least not rewarding policy-making that might. The decline of development assistance to the education sector (at a time when funding for the health sector has grown strongly) in part reflects a fairly rational calculation on the part of donors about the prospects of getting a good return on their investment.

And here’s where the Commission’s Report is less compelling. In its definition of what constitutes quality, and its financial modelling of what it will take to deliver it, the Commission falls back on the sorts of policies that fall into the first category – things that don’t improve learning – not the second. For example, defining quality teaching by reference to increasing the supply of teachers with a tertiary degree leads the Commission to the rather implausible conclusion that 60% of tertiary graduates should be going into teaching, or that teacher salaries need to be 7 to 8 times GDP per capita.

Indeed, the whole exercise of putting a (rather eye-watering) price tag on a universal quality education feels premature when the evidence base about how to achieve it is essentially non-existent. Yes, there is a growing body of rigorous research about the sorts of things that improve learning outcomes (which essentially boil down to better, more differentiated pedagogy and stronger accountability), but very little about how these get diffused across actually existing education systems.

The bottom line is that the politics of raising quality are much, much tougher than the politics of increasing enrolment. Closing failing schools makes you much less popular than building them; getting rid of bad teachers wins you fewer friends than hiring them (actually, even finding out whether teachers are good or bad can make you pretty unpopular); and facing up to the hard facts about how little kids are learning takes a lot more courage than basking in a self-congratulatory glow about reaching universal enrolment.

The ideas in the report that feel most exciting are therefore the ones that feel like they have some potential to make the politics easier for reformers - in particular, the push for investing in internationally comparable learning assessments. Through peer pressure and the implicit threat to national prestige of slipping down the rankings, the OECD's PISA assessments have provoked important political conversations within education ministries in rich countries about the relative performance of their school systems in a way that hasn't happened as much in less developed countries, despite sterling work by the community-led learning assessment movement. 

"Without the ability to successfully navigate the politics of reform to build support for change", the Commission notes, "the best intentions will not lead to results." By making clear that the fate of the learning generation rests on the shoulders of true reformers, and that the responsibility of the international education community should be to support them (both financially and otherwise), the Commission has done an important service.

First year evaluation results show promise

Feedback is central to teaching and learning at Rising Academies. Students and teachers learn to give and receive feedback using techniques like Two Stars and a Wish or What Went Well...Even Better If. The Rising Academy Creed reminds us that "Our first draft is never our final draft." Given that, it would be pretty strange if Rising as an organisation didn't also embrace feedback on how well we are doing at enabling more children to access quality learning.

That's why, even as a new organisation, we've made rigorous, transparent monitoring and evaluation a priority from the outset. Internally, we've invested in our assessment systems and data. But my focus here is on external evaluation, because I'm excited to report that we have just received the first annual report from our external evaluators. If you want to understand the background to the study and our reactions to the first annual report, read on. If you're impatient and want to jump straight into the report itself, it's here.

Background

Last year, we commissioned a team led by Dr David Johnson from the Department of Education at Oxford University to conduct an independent impact evaluation of our schools in Sierra Leone.

The evaluation covers three academic years:

  • (The abridged) School Year 2016 (January-July)
  • School Year 2016-17 (September-July)
  • School Year 2017-18 (September-July)

The evaluation will track a sample of Rising students over those three years, and compare their progress to a comparison group of students drawn from both other private schools and government schools.

The overall evaluation will be based on a range of outcome measures, including standardised tests of reading and maths, a measure of writing skills, and a mixed-methods analysis of students' academic self-confidence and other learning dispositions.

The evaluation is based on what is known as a 'quasi-experimental' design rather than a randomised controlled trial (unlike our schools in Liberia, where we are part of a much larger RCT). But by matching the schools (on things like geography, fee level, and primary school exam scores), randomly selecting students within schools, and collecting appropriate student-level control variables (such as family background and socio-economic status) the idea is that it will ultimately be possible to develop an estimate of our impact over these 3 years that is relatively free of selection bias.
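The value-added logic behind this design can be sketched in code. This is not the evaluators' actual model; it is a minimal illustration, on simulated data, of how regressing endline scores on a treatment indicator plus baseline scores and student-level controls can recover a treatment effect. All variable names and numbers below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: 200 students, half enrolled at a 'treated' school.
# All numbers are hypothetical, chosen only to illustrate the method.
n = 200
treated = np.repeat([1.0, 0.0], n // 2)
baseline = rng.normal(200, 30, n)      # baseline scaled score
ses = rng.normal(0, 1, n)              # socio-economic status control
true_effect = 20.0                     # the 'impact' we hope to recover
endline = (baseline + 10 + true_effect * treated
           + 5 * ses + rng.normal(0, 10, n))

# Value-added regression: endline ~ treatment + baseline + controls.
# Controlling for baseline scores means we are estimating gains, not levels.
X = np.column_stack([np.ones(n), treated, baseline, ses])
beta, *_ = np.linalg.lstsq(X, endline, rcond=None)
print(f"estimated treatment effect: {beta[1]:.1f}")  # should be close to 20
```

Matching schools and randomly sampling students within them is what makes it plausible that the treatment coefficient reflects the schools' effect rather than selection into them.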

Figure 1: How the evaluation defines impact

BASELINE

To make sure any estimate of learning gains is capturing the true impact of our schools, one of the most important control variables to capture is students' ability levels at baseline (i.e. at the start of the three-year evaluation period). This allows for an estimate of the 'value-added' by the student's school, controlling for differences in cognitive ability among students when they enrolled. Baselining for the evaluation took place in January and February 2016. The baseline report is available here. It showed:

  • That on average both Rising students (the treatment group) and students in the other schools (the comparison group) began their junior secondary school careers with similar ability levels in reading and maths. The two groups were, in other words, well matched;
  • That these averages were extremely low - for both reading and maths, approximately five grades below where they would be expected to be given students' chronological age.

YEAR ONE PROGRESS REPORT: RESULTS

The Year One Progress Report covers Academic Year 2016. The Ebola Crisis of 2014-15 disrupted the academic calendar in Sierra Leone. Students missed two full terms of schooling. The Government of Sierra Leone therefore introduced a temporary academic calendar, with the school year cut from three terms to two in 2015 (April-December) and again in 2016 (January-July). The normal (September-July) school year will resume in September 2016.

The Progress Report therefore covers a relatively short period - essentially 4.5 months from late January when baselining was undertaken to late June when the follow-up assessments took place. It would be unrealistic to see major impacts in such a short period, and any impacts that were identified would need to be followed up over the next two academic years to ensure they were actually sustained. As the authors note, "it is a good principle to see annual progress reports as just that – reports that monitor progress and that treat gains as initial rather than conclusive. A more complete understanding of the extent to which learning in the Rising Academy Network has improved is to be gained towards the end of the study."

Nevertheless, this report represents an important check-in point and an opportunity for us to see whether things look to be heading in the right direction.

Our reading of the Year One report is that, broadly speaking, they are. To summarise the key findings:

  • The report finds that Rising students made statistically significant gains in both reading and maths, even in this short period. Average scaled scores rose 35 points in reading (from 196 to 231) and 36 points in maths (from 480 to 516). To put these numbers in context, this change in reading scores corresponds to 4 months' worth of progress (based on the UK student population on which these tests are normed) in 4.5 months of instruction.
  • These gains were higher than for students in comparison schools. The differences were both statistically significant and practically important: in both reading and maths, Rising students gained more than twice as much as their peers in other private schools (35 points versus 13 points in reading, and 36 points versus 4 points in maths). Students in government schools made no discernible progress at all in either reading or maths. (For the more statistically inclined, this represents an effect size of 0.39 for reading and 0.38 for maths relative to government schools, or 0.23 for reading and 0.29 for maths relative to private schools, which is pretty good in such a short timespan.) 
  • The gains were also equitably distributed, in that the students who gained most were the students who started out lowest, and there were no significant differences between boys and girls.
  • Finally, there are early indications that students' experience of school is quite different at Rising compared to other schools. Rising students were more likely to report spending time working together and supporting each other's learning, and more likely to report getting praise, feedback and help from their teachers when they get stuck.
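For readers less familiar with effect sizes, here is a back-of-envelope reconstruction of the reading figures. The pooled standard deviation is not quoted in this post, so the value used below is an assumption reverse-engineered purely for illustration; only the gain scores come from the report:

```python
# Gain scores in reading, from the Year One report (scaled-score points).
rising_gain = 35
private_gain = 13
government_gain = 0  # "no discernible progress"

# Assumed pooled standard deviation of scaled scores (hypothetical;
# chosen so the results roughly match the reported effect sizes).
pooled_sd = 90

# A standardised effect size is the difference in gains divided by the SD,
# which lets gains on different tests be compared on a common scale.
d_vs_government = (rising_gain - government_gain) / pooled_sd
d_vs_private = (rising_gain - private_gain) / pooled_sd
print(f"reading effect size vs government schools: {d_vs_government:.2f}")
print(f"reading effect size vs private schools:    {d_vs_private:.2f}")
```

With this assumed SD the sketch gives roughly 0.39 relative to government schools and 0.24 relative to private schools, close to the reported 0.39 and 0.23.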

That's the good news. What about the bad news? The most obvious point is that in absolute terms our students' reading and maths skills are still very low. They are starting from such a low base that one-off improvements in learning levels are not good enough. To catch up, we need to sustain and accelerate these gains over the next few years.

That's why, for example, we've recently been partnering with Results for Development to prototype and test new ways to improve the literacy skills of our most struggling readers, including a peer-to-peer reading club.

So what are my two stars and a wish?

  • My first star is that our students are making much more rapid progress in our schools than they did in their previous schools, or than their peers are making in the other schools they might have chosen to attend;
  • My second star is that these gains are not concentrated in a single subset of higher ability students but widely and equitably shared across our intake;
  • My wish is that we find ways to sustain these gains next year (particularly as we grow, with 5 new schools joining our network in September 2016) and accelerate them through innovations like our reading club. If we can do that, and with the benefit of 50% more instructional time (as the school year returns to its normal length), we can start to be more confident we are truly having the impact we're aiming for.

Take a look at the report yourself, and let us know what you think. Tweet me @pjskids or send me an email.

Rising Academy Partnership Schools launched in Liberia

Monday September 5th marks the start of the new school year in Liberia - and with it the start of a new relationship between Rising Academies and the Government of Liberia.

From today, five government elementary schools – three in Bomi County and two in Montserrado County – will become Rising Academy Partnership Schools. They will remain in public ownership, free to attend and non-selective, using qualified government teachers on the government payroll, observing the Liberian National Curriculum, and with government retaining responsibility for the physical upkeep of the school buildings. But responsibility for the day-to-day management of the schools and for improving the quality of teaching and learning will pass to Rising.

This effort is part of Partnership Schools for Liberia (PSL), a bold and deliberately experimental pilot programme to explore whether bringing in operators from outside government can help address the chronic crisis of education quality in the public system.

The case for change is compelling: for every 100 children of primary school age in Liberia, only 38 attend primary school, of whom only 23 will complete grade 6, of whom only about 8 will make it through secondary school and sit the WAEC exams at the end of grade 12, of whom only about 4 will pass. Along the way, even the kids in school aren’t learning what they need. To be considered fluent readers, Grade 3 students given an age-appropriate text ought to be able to read 45 to 60 words per minute correctly; in Liberia, the average is less than 20 words per minute.

Much has been written about the PSL programme – in The New York Times, The Guardian, Vox and HuffPo among others. Unfortunately, early (mis)conceptions about the programme have proved hard to shake. For a clear and comprehensive account of the programme from two people who actually know what they are talking about, this piece by Susannah Hares and Justin Sandefur is the best place to start.

Here’s the short version: under PSL, 90 primary schools (less than 3% of the total) will be handed over to outside operators to run in 2016-17. As well as Rising, these operators include international NGOs like BRAC, Streetchild and More Than Me, private school operators like Omega Schools and Bridge International Academies, and Liberian organisations Stella Maris Polytechnic and the Liberia Youth Network. As a relatively new organisation, Rising is proud to be among such distinguished company.

Operators are paid a fixed per capita grant for each student enrolled, and then held accountable for using this money to improve learning outcomes for students. If operators do well, they might be allowed to expand to manage additional schools in future; if they fail, they might be stripped of the schools they are running. If the programme as a whole shows it can make a difference to the quality of schooling, it might be expanded; if the programme as a whole fails, it will be shut down.

The key point is that these decisions will be based on rigorous evidence. A major attraction of the programme is that it gets beyond unhelpful and ideological debates about who should run schools and focuses on getting the data. PSL has been designed as a ‘gold standard’ randomised controlled trial, with schools randomly assigned to a treatment condition where they are run by a PSL operator or a control condition where they are not. Comparing what happens in these two groups of schools over time should therefore provide a reliable estimate of what difference (if any) it makes to have an outside operator. Formal details of the evaluation are on the AEA’s RCT Registry here.
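The random-assignment idea can be illustrated with a toy sketch. This is not the actual PSL randomisation protocol (the real assignment was handled by the independent evaluators and was more involved); it just shows the core logic of pairing comparable schools and flipping a coin within each pair:

```python
import random

# Toy sketch of pairwise random assignment; school IDs are hypothetical.
schools = [f"school_{i:03d}" for i in range(180)]

rng = random.Random(2016)  # fixed seed so the assignment is reproducible
rng.shuffle(schools)

# Walk through the shuffled list two at a time, randomly sending one
# member of each pair to treatment and the other to control.
treatment, control = [], []
for a, b in zip(schools[0::2], schools[1::2]):
    if rng.random() < 0.5:
        treatment.append(a)
        control.append(b)
    else:
        treatment.append(b)
        control.append(a)

print(len(treatment), len(control))  # 90 treated schools, 90 controls
```

Because assignment is random, any systematic difference that later emerges between the two groups can credibly be attributed to the operators rather than to pre-existing differences between schools.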

The speed at which the PSL programme has moved from idea to implementation is staggering, particularly in a country where, as Liberia’s President Ellen Johnson Sirleaf has admitted, ambition is often thwarted by a lack of capacity in the system to get things done. Some operators have had a head-start, but in Rising’s case we only submitted our initial Expression of Interest on 4th May, only got the green light from government in mid-July, and were only notified which schools we were being asked to run in early August (and in one case even later than that).

Nevertheless, it was absolutely right for the Government to try to move quickly, rather than delay another whole year and risk losing momentum, and we have been doing our best to make the most of the limited time available.

Candidates attend Rising's assessment centre

A major focus has been on staffing, screening existing staff and undertaking an urgent recruitment exercise. Our initial assessment showed that our schools had less than half the number of teachers they were supposed to have. In one case, a Principal had been assigned a teacher who had collected a salary for three years without ever showing up. That's meant moving fast: on August 23rd, we held an assessment centre for more than 200 recent teacher training graduates, with 12 candidates appointed to take up positions in our schools.

Teachers work through a card exercise during training

A second major focus has been on training. Through our work in Sierra Leone, we have developed an effective teacher pre-service training programme. Rather than lots of theory, teachers are given a small number of specific, high impact, practical skills to practise and receive feedback on. Rapid improvements in the level and quality of student engagement are possible as a result. Without the time to do the programme in full, 28 existing teachers received an abridged version of the training, with intensive in-service training taking place each afternoon for the first two weeks of term, and further training scheduled for later in the term.

Many schools lack basic infrastructure

A third focus has been on fixing some of the basic infrastructure in the schools. Bigger issues like leaky roofs remain the responsibility of government. But most of the schools didn’t have enough desks and chairs, and those they had were in disrepair, so we’ve procured more than 250 items of furniture to address the most urgent needs.

Finally, one of the freedoms granted to operators under PSL is to innovate with how the National Curriculum gets delivered. Some elements of our approach, like our highly effective phonics programme in partnership with Phonics International, remain just as relevant in Liberia as they do in Sierra Leone. But in other areas our international team of curriculum writers have been hard at work producing lesson plans and materials that will be appropriate for this new context.

With PSL, Rising embarks on a new and unfamiliar journey: a new country, and a new way of working. Among the teachers and principals, there is already a sense of excitement about what we might achieve together. Among parents too: student enrolment is up as parents get to hear about our new role in their local school.

Unlike so many traditional education programs which seek to raise the quality of outputs simply by increasing the number of inputs, PSL starts by correctly identifying the source of the education crisis as the way that schools are managed and held to account. With the right management and accountability, rapid improvements in student achievement are possible; without them, the system will continue to fail the children who need it most.

We wanted to be part of PSL because, when the history of this brave reform initiative is written, we want the world to know we did our best to make it a success.

If you are interested in learning more about Rising Academy Partnership Schools, tweet me @pjskids or send me an email.

Evaluation baseline report available

Rising Academies has commissioned a team from Oxford University to complete an independent impact evaluation of our work in Sierra Leone. The study will track the progress of a sample of Rising Academy students over three academic years, and benchmark this against the progress of a comparison group of students from government and private schools in Sierra Leone. The main outcome measures are tests of reading and maths. The team is also looking at writing skills, as well as academic self-confidence and learning dispositions.

The team conducted baseline assessments for English and Maths earlier this year. You can read the full baseline report here or the executive summary here.

The first follow-up assessments were completed in July, and so an annual progress report will be available later this year.

RAN joins CEI database

We're pleased to announce that the Rising Academy Network is the latest organization to be profiled in the Center for Education Innovations online database. CEI "seeks to fill the gaps in global understanding about innovative education programs striving to increase access to quality education for students in low income communities." CEI's database features more than 650 innovative education programs across 145 countries, and attracts over 10,000 visitors a month.

You can find RAN's profile here.

Simple, but not easy? Tackling the learning crisis

Across the developing world, more children are in school. We should celebrate that and acknowledge that the job is not yet done: in Nigeria alone, 10.5m children are out of school.

Nevertheless, it is time to move beyond a focus on getting kids into school and start focusing on the quality of the education they receive when they get there.

When will they ever learn?

In many parts of the world, we have created a learning crisis: more kids are in school, but they are not learning. “We are failing the children on a massive scale,” says celebrated development economist Esther Duflo, John Bates Clark Medal winner and author of Poor Economics. “There has been improvement in enrolment and in the physical capacity of schools. But learning is not about enrolment, teacher-student ratio, having latrines in school; it’s about if we are serious about learning.”

In September, world leaders will get together and agree that improving the quality of education should be one of the Sustainable Development Goals that replace the MDGs. Something must be done, they will say. But what?