10 Reflections on the WDR

Last week, the World Bank published its latest World Development Report (WDR), the first dedicated exclusively to the topic of education. Learning to Realize Education's Promise may not be the punchiest title I've ever heard, but it's a really important piece of work.


"Schooling is not the same as learning.

Education is an imprecise word, and so it must be clearly defined. Schooling is the time a student spends in classrooms, whereas learning is the outcome—what the student takes away from schooling. This distinction is crucial: around the world, many students learn little."

Here are 10 high-level reflections:

1. Not new, but definitive. The very first sentence in the report is "Schooling is not the same as learning". This is not a new claim. Lant Pritchett literally wrote the book on this a couple of years ago; Pauline Rose took to Twitter to express her exasperation that anyone could ever have thought otherwise. But repetition is an under-appreciated tool in good communications, and often "about the time you get tired of saying it, they are just starting to hear it." In short, I don't expect the value of this report to be in its novelty but in its definitiveness. There's not much in here that hasn't been covered in some RISE paper or other. But the evidence is so exhaustive that it should make the learning crisis the point of departure for every conversation about global education.

2. o-LAY. That said, the report does include a couple of neat concepts I hadn't come across before. One that stood out was the Learning Adjusted Years of Schooling (LAYS). Borrowing (presumably) from the concept of Disability Adjusted Life Years (DALYs) in health, which modify the simple measurement of life expectancy in years by how healthy those years actually are, LAYS account for the fact that the productivity of a year of schooling when it comes to actual learning varies wildly between countries and time periods. Since "years of schooling" still, regrettably, remains such a common metric, this feels like a helpful contribution.
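To make the arithmetic behind LAYS concrete, here is a minimal sketch in Python. The function and the figures in it are my own illustration of the general idea, not the report's actual methodology or data:

```python
# Hedged sketch of the LAYS idea: discount years of schooling by how much
# learning a year actually delivers, relative to a high-performing benchmark
# system. All numbers below are made up for illustration.

def learning_adjusted_years(years_of_schooling: float,
                            avg_test_score: float,
                            benchmark_score: float) -> float:
    """Scale years of schooling by relative learning productivity."""
    return years_of_schooling * (avg_test_score / benchmark_score)

# A country where students average 11 years of school, but test scores run
# at 60% of the benchmark system's, yields far fewer effective years:
print(round(learning_adjusted_years(11, 300, 500), 2))  # -> 6.6
```

The point the metric makes is exactly the report's: two countries with identical "years of schooling" can differ enormously in how much schooling is actually worth.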

3. Improving education isn't easy, but it is simple. I've written elsewhere that the barriers to improving education are not, in themselves, that complicated. The report does a nice job of providing a framework laying out the 'proximate causes' of poor quality learning: unprepared learners, unskilled or unmotivated teachers, weak school governance and management, and misdirected inputs and resources. And, the report notes, we actually know a lot more about addressing some of these issues than we used to, thanks to an explosion in the number of high-quality impact evaluations. The problem - the not-easy part - is that these proximate causes persist because of deeper, more political challenges.

4. If you miss the politics, you miss the point. This will not be news to my former colleagues, but the report helpfully underlines the importance of understanding the political factors that allow the learning crisis to go unaddressed. Some of these are self-evidently malign things like corrupt practices diverting resources from where they are needed, or patronage allowing too many of the wrong people to end up in vital jobs. But the report also points to some of the less obvious things, like the fact that learning is just harder to 'see' than student enrolment, teacher hiring or other potential areas of focus for education policy-makers (echoes here of James Scott's Seeing Like A State).

5. What gets measured, gets managed. Or does it? This need to make the learning crisis more visible motivates the authors to call for a big push on assessing learning. Ideally, this would involve a global learning metric, an idea that seems obviously sensible and relatively straightforward to me, but which is universally regarded by those more knowledgeable than I am to be a diplomatic conundrum more complex to resolve than the Schleswig-Holstein Question. Absent a global metric, the authors suggest, more investment in national learning assessments would still be better than nothing.

I'm conflicted on this point. On the one hand, investing in better data seems an absolute no-brainer; the first step to recovery is admitting you have a problem and all that. On the other hand, it's clear that even in countries where citizen-led learning assessments like ASER and UWEZO have taken root or where national Ministries have signed up to be part of regional exercises like PASEC, better data has not necessarily been the burning platform for which some might have hoped. It's hard not to conclude that you can equip Ministers with better data on the problem and more robust evidence on the types of policies that will and won't make a difference, but in the end whether anything changes comes down to finding reform-minded leaders with political courage, like Liberia's George K. Werner.

6. The power of And. I was pleased to see that the report tackled head-on the suggestion that more rigorous assessment of student learning necessarily involves a narrowing of focus to the exclusion of other things we might care about fostering in our young people, from character traits like resilience to the fuzzy but nevertheless crucial "21st century skills". To the extent that this argument has any merit in an OECD context (and I'm not sure that it does), it seems absurd given the scale of the quality crisis in the developing world and how intimately linked better teaching of the basics and improvements in some of these other areas are going to be. As the authors note: 

"Conditions that allow children to spend two or three years in school without learning to read a single word or to reach the end of primary school without learning two-digit subtraction are not conducive to reaching the higher goals of education."

The bottom line is that good schools can do both (one reason I'm glad our independent evaluators at Oxford are looking at both the cognitive and non-cognitive development of our students).

7. Forget about the price tag. The Report has been criticised from some quarters for saying too little about money, particularly when the ink has barely dried on Gordon Brown's Education Commission report calling for billions more each year to be funnelled into global education. Of course, the criticism of that report was precisely the mirror image of this: it provided highly detailed costings based on a series of assumptions about what will deliver quality education that had little basis in rigorous evidence. More generally, one of the problems with the discussion about resourcing is that more money is almost certainly both an input to and an output of more effective education reforms: if it were clearer that investments were delivering results, more money would flow into them.

8. Two sides of the same coin? Another criticism of the report is that it has relatively little to say about access, even though millions remain out of school and in some countries school enrolments appear to be dropping not rising. That said, the debate on access sometimes comes close to slipping into a kind of sequentialism - let's fix access, then worry about quality - and the report helpfully points out that they have to be addressed together (even if by different means). If students are not learning or are being asked to repeat grades, their (and their family's) motivation to stay in school falls.

9. Who benefits? In its discussion of fairness and equity, the report mostly focuses on within-country inequality, and the large gaps in access and achievement facing disadvantaged groups. Addressing these is clearly important, but I've noted elsewhere my concern that the lack of proper data on where most learners in developing countries sit in relation to a global 'cognitive poverty line' (analogous to the $1.90 a day global income poverty line) makes it easy to under-value the importance of improving outcomes for the millions of children who may be among the educationally better off in their own countries, but in global terms remain among the most disadvantaged in the world. One other comment on equity: the report usefully points out that fairness is not just about rich and poor students but about good and bad schools. An arresting statistic cited in the report is that in one study in Pakistan, the achievement gap on an English test between students in good and bad schools was 24 times bigger than that between richer and poorer students, even after controlling for student characteristics.

10. Private: no panacea? The report strikes a surprisingly cautious note on the potential contribution of private schools. Surprising in part because I had been reliably informed that the World Bank was secretly a vast conspiracy to push the privatization agenda of its paymasters in Big Edu(TM), but more because this seems to be one area where the Report seems to depart from what the evidence actually says. For example, the Report claims "there is no consistent evidence that private schools deliver better learning outcomes than public schools" and that such evidence as exists "may conflate the effects of private schools themselves with the effects of the type of students who enroll in private schools." Far be it from me to question the authors' interpretation of the literature (he says, preparing to do precisely that) but on the first claim it would seem that there is at least moderate evidence that private schools out-perform public schools, and that this performance advantage is mediated but not wholly eliminated when you control for observable student characteristics. But anyway, this minor quibble just goes to show that those of us who believe there is a complementary role for non-state school operators need to do a better job of building our evidence base. And the central claim of this part of the report - that "overseeing private schools may be no easier than providing quality schooling" - speaks to the fact that whether as a partner in initiatives like Partnership Schools for Liberia, or just as a regulator of private schools, we are talking about government as an enabling state, not a smaller state.

Positive early gains for Partnership Schools and Rising

‘Gold standard’ evaluation finds positive early gains for Partnership Schools and for Rising.

The evaluation team behind the Randomised Controlled Trial (RCT) of Partnership Schools for Liberia (PSL - okay, that's enough three-letter abbreviations) has just released their midline report. The report covers just the first year of PSL (September 2016-July 2017). A final, endline report covering the full three years of the PSL pilot is due in 2019.

While much anticipated, this is only a midline report with preliminary results from one year of a three year programme. The report therefore strikes a cautious tone and the evaluation team are careful to caveat their results. 

Nevertheless, there are important and encouraging early messages for PSL as a whole and for Rising in particular. Put simply, the PSL programme is delivering significant learning gains, and Rising seems to be delivering among the largest gains of any school operator.

For PSL as a whole, the headline result is that learning outcomes in PSL schools improved by 60% more than in control schools - or, put differently, by the equivalent of 0.6 extra years of schooling.

These gains seem to be driven by better management of schools by PSL operators, with longer school days, closer supervision of staff and more on-task teaching resulting in pupils in PSL schools getting about twice as much instructional time as in control schools. PSL schools also benefited from having more money and having better quality teachers, particularly new graduates from the Rural Teacher Training Institutes. But the report is clear that, based on their data and the wider literature, it is the interaction of these additional resources and better management that makes the difference; more resources alone is not enough. (Anecdotally, I would add that our ability to attract these new teachers was at least in part because they had more confidence in how they would be managed, which illustrates the point that new resources and different management are not easily separated.)  

Rising Results

The report also looks at how performance varies across the 8 operators that are part of PSL. Even more than the overall findings, the discussion of operator performance is limited by the small samples of students the evaluation team drew from each school. For operators (like Rising) running only a small number of schools, this means there is considerable uncertainty around the evaluators' estimates. That said, the evaluation team do their best to offer some insights.

Their core estimate is that, compared to its control schools, Rising improved learning outcomes by 0.41 standard deviations, or around 1.3 additional years of schooling. This is the highest of any of the operators, though it is important to note the overlapping confidence intervals between several of the higher performing providers.

[Chart: intent-to-treat (ITT) estimates by operator]

However, this core estimate is what’s known as an “intent-to-treat” or ITT estimate. It is based on the 5 schools that were originally randomly assigned to Rising. But we only actually ended up working in 4 of those (* see below). The ITT estimate is therefore a composite of results from 4 schools that we operated and 1 school that we never set foot in. A better estimate of our true impact is arguably offered by looking at our impact just on those students in schools we actually ended up working in. This “treatment on the treated” or TOT estimate is considerably higher, with a treatment effect of 0.57 standard deviations or 1.8 extra years of schooling. This, again, is the highest of any operator, and by a considerably larger margin, though again the confidence intervals around the estimate are large.

[Chart: treatment-on-the-treated (TOT) estimates by operator]

Whether the ITT or TOT estimate is the more useful depends, in my view, on the policy question you are trying to answer.  At the level of the programme as a whole, where the policy question is essentially "what will the overall effect of a programme like this be?”,  the ITT estimate seems the more useful because it is fair to assume that some level of ’non-compliance’ will occur and the programme won’t get implemented in all schools. But at the inter-operator level, where the salient policy question is “given that this is going to be a PSL school, what will be the impact of giving this school to operator X rather than operator Y?”, the TOT estimate seems more informative because it is based solely on results in schools where those operators were actually working. 
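For readers who want to see the ITT/TOT distinction in miniature, the textbook "Bloom adjustment" captures the intuition: if schools that were assigned but never operated experience no treatment effect, the TOT effect is the ITT effect scaled up by the compliance rate. This is a simplified illustration of the concept, not the evaluators' actual estimator (which, as the gap between my back-of-envelope figure and the reported 0.57 shows, is estimated differently):

```python
# Simplified Bloom-style illustration of ITT vs TOT. Assumes non-compliers
# (schools the operator never ran) see zero treatment effect.
# This is a textbook approximation, not the evaluation's own method.

def bloom_tot(itt_effect: float, compliance_rate: float) -> float:
    """Scale the intent-to-treat effect up by the compliance rate."""
    if not 0 < compliance_rate <= 1:
        raise ValueError("compliance rate must be in (0, 1]")
    return itt_effect / compliance_rate

itt = 0.41           # reported ITT estimate, in standard deviations
compliance = 4 / 5   # 4 of the 5 randomly assigned schools were operated
print(round(bloom_tot(itt, compliance), 2))  # -> 0.51
```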

A further complication in comparing across operators is that operators have different sample sizes, pulled from different populations of students across different geographical areas. It cannot be assumed that we are comparing like with like. To correct for this, the evaluators control for observable differences in school and student characteristics (e.g. by using proxies for their income status, geographic remoteness etc), but they also use a fancy statistical technique called 'Bayesian hierarchical modelling'. Essentially, this assumes that because we are part of the same programme in the same country, operator effects are likely to be correlated. It therefore dilutes the experimental estimate for Rising by making it a weighted average of Rising's actual performance and the average performance of all the operators. It turns out that adjusting for baseline characteristics doesn't make too much difference (particularly for Rising, since our schools were more typical), but this Bayesian adjustment does. It drags Rising back towards the mean for all operators, and because our sample size is smaller, we are dragged further than most. We still end up with the first or second largest effect depending on whether the ITT or TOT estimate is used, but by design we are closer to the rest of the pack.
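The "shrinkage" mechanic at the heart of partial pooling can be sketched in a few lines. The numbers here are hypothetical and the one-step precision-weighting formula is a simplification of a full Bayesian hierarchical model, but it shows why noisier, smaller-sample estimates get pulled hardest towards the pooled mean:

```python
# Hedged sketch of partial pooling ("shrinkage"): each operator's estimate
# is a precision-weighted average of its own data and the all-operator mean.
# Illustrative numbers only - not the evaluation's data or actual model.

def shrink(estimate: float, std_error: float, pooled_mean: float,
           between_operator_sd: float) -> float:
    """Pull an operator's estimate towards the pooled mean, weighting by
    how precise the operator's own estimate is."""
    w = between_operator_sd**2 / (between_operator_sd**2 + std_error**2)
    return w * estimate + (1 - w) * pooled_mean

POOLED_MEAN = 0.18   # hypothetical average effect across all operators
BETWEEN_SD = 0.10    # hypothetical spread of true operator effects

# A small operator (large standard error) is shrunk a long way...
print(round(shrink(0.41, 0.20, POOLED_MEAN, BETWEEN_SD), 2))  # -> 0.23
# ...while a large operator (small standard error) barely moves.
print(round(shrink(0.41, 0.05, POOLED_MEAN, BETWEEN_SD), 2))  # -> 0.36
```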

Some reflections on the results

So what do we make of these results?

First of all, we are strongly committed to the highest levels of rigour and transparency about our impact. We had thought that the study wouldn’t be able to say anything specific about Rising at all for technical reasons to do with the research design (for nerdier readers: it was originally designed to detect differences between PSL schools and non-PSL schools, and was under-powered to detect differences among operators within PSL). We're glad the evaluation team were able to find some ways to overcome those limitations.

Second, it is interesting and encouraging that the results largely confirm the strong progress we had been seeing in our internal data. Those data looked promising, but absent a control group to provide a robust counterfactual, it was impossible to know for sure that the progress we were seeing was directly attributable to us. As we said at the time and as the evaluation team note in an appendix to this report, our internal data were for internal management purposes and were never meant to have the same rigour as the RCT. But as it turns out, our internal data and the RCT data are pretty consistent. Our internal data suggested that students had made approximately 3 grades' worth of progress in one academic year; the TOT estimate in the RCT is that they had made approximately 2.8 grades’ worth of progress in one academic year. Needless to say, knowing that we can have a good amount of conviction in what our internal data are telling us is very important from a management point of view.

Third, while making direct comparisons between operators is tricky for the reasons noted above, on any reasonable reading of this evidence Rising emerges as one of the stronger operators, and this result validates the decision by the Ministry of Education to allocate 24 new schools to Rising in Year 2. In both absolute and relative terms, this was one of the larger school allocations and reflected the Ministry’s view that Rising was one of the highest performing PSL operators in Year 1. It is good - not just for us but for the principle of accountability underlying the PSL programme as a whole - that the RCT data confirm the MoE’s positive assessment of Rising’s performance.

Taking up the challenge

I also want to be very clear about the limitations of the data at this stage. It is not just that it’s very early to be saying anything definitive. It’s also that these data do not yet allow Rising, or really any operator, to fully address two of the big challenges that have been posed by critics of PSL.

The first challenge is around cost. As the evaluators point out, different operators spent different amounts of money in Year 1, and all spent more money than would be typically made available in a government school. In the end, judgments about the success of PSL or individual operators within it will need to include some assessment not just of impact but of value for money. PSL can only be fully scaled if it can be shown to be effective and affordable. Rising was one of those operators whose unit costs were relatively high in Year 1. That's because a big part of our costs is the people and the systems in our central team, and with just 5 schools in Year 1, we had few economies of scale. These costs should fall precipitously once they start to be shared over a much larger number of schools and students. But that's a testable hypothesis on which the Ministry can hold us to account. In Year 2, we need to prove to them that we can deliver the same or better results at a significantly lower cost per student.

The second challenge is around representativeness. One criticism that has been aired is that Year 1 schools were the low hanging fruit. As the evaluation makes clear, it is simply not true that Year 1 schools were somehow cushy, but it is true that Year 1 schools were generally in easier to serve, somewhat less disadvantaged communities than the median Liberian school. And that’s precisely why the Ministry of Education insisted that the schools we and other operators will be serving in Year 2 be disproportionately located in the South East of Liberia, where those concerns about unrepresentativeness do not apply. If we can continue to perform well in these more challenging contexts, it will go some way to answering the question of whether PSL can genuinely become part of the solution for the whole of Liberia.

In short, the RCT midline provides a welcome confirmation of what our own data were telling us about the positive impact we are having. Our task for the coming academic year is to show that we can sustain and deepen that impact, in more challenging contexts, and more cost effectively. A big task, but one that we are hugely excited and honoured to be taking on.

A little over a year ago, Education Minister George Werner showed a great deal of political courage not just in launching this programme but in insisting that it be the subject of a 'gold standard' experimental evaluation. One year on, and these results show that his vision and conviction are beginning to pay dividends. This report is not the final word on PSL, but the next chapter promises to be even more exciting.

 

* Footnote: as the evaluators note in their report, the process of randomly assigning schools in summer 2016 was complex, made even more challenging by the huge number of moving pieces for both operators and the Government of Liberia as both endeavoured to meet incredibly tight timescales for opening schools on September 5th. Provisional school allocations changed several times; by August 14th, three weeks before school opening, we still did not know the identity of our fifth school and it was proving very difficult to find a pair of schools near enough to our other schools to be logistically viable. Faced with the choice of dragging the process out any longer and potentially imperilling operational performance or opting to run a fifth school that was not randomly assigned, we agreed with the Ministry on the latter course of action. 

Expanding our network in Liberia

Photo credit: Kyle Weaver


We're proud to announce that the Ministry of Education has invited Rising Academies to significantly expand its school network in Liberia. From September 2017, Rising Academies will be operating 29 government schools across 7 counties.

The move comes as part of an expansion of the Partnership Schools for Liberia (PSL) program. To learn more about PSL, click here. The Ministry’s decision to award Rising more schools in the second year of PSL followed an in-depth review and screening process, including unannounced spot checks of PSL schools. Rising Academies was one of three providers to be awarded the top “A” rating for its strong performance in Year 1.

We're really proud of the progress our schools have made this year. If you want to learn more about how we've been rigorously tracking this progress and using data to inform our approach, check out our interim progress report here.

Our press release on the announcement of the Ministry's Year 2 plans is available here.

Benjamin's Story

Last week at the Investing in Education for the Future conference in Monrovia, Liberia, I had the opportunity to speak about our work under the Partnership Schools for Liberia initiative.

Benjamin Clarke. Principal, Sumo Town Public School


I chose to focus on the story of Benjamin Clarke (pictured right), the inspiring Principal of our school in Sumo Town, to illustrate the impact that PSL is having.

Here was the key bit of the speech:

When Benjamin took over as Principal in 2014 he inherited a school with just 1 payroll teacher and a part-time volunteer who could barely read. Even on a good day he had to teach 4 grades himself, as well as do all the admin, and the bad days outweighed the good. If he was sick or called away to a meeting, the school just didn’t operate.

Today, thanks to Partnership Schools, Benjamin has a qualified teacher in every classroom. He has a Master Teacher trained to observe lessons and give his teachers real-time feedback. His staff receive daily lesson plans with a focus on phonics and numeracy, and are trained in simple techniques that help them deliver more engaging lessons. And instead of being monitored almost never, he sees our team every week, and takes pride in showing them what is happening under his leadership.

I asked Benjamin if I could share his story with you tonight because I think it's a reminder that for Partnership Schools to succeed, it doesn't just need to correct the weaknesses in the Liberian education system; it needs to build on its strengths.

Check out the video for the full speech (5 mins).

Benjamin himself came along to the final day of the conference to share his experience of Partnership Schools with the delegates directly.

Quick reactions to The Learning Generation

The Learning Generation – the Report of The International Commission on Financing Global Education Opportunity chaired by Gordon Brown – was published last week. When the Commission was announced amid much fanfare last year, I was sceptical, but I have to say I found the final report a lot more encouraging than I expected to.

The first big thing the Commission gets right is to continue what CGD calls the pivot to quality. Words matter, and the whole language of a ‘learning generation’ underscores that the focus needs to move beyond getting kids into school and towards the question of what happens when they get there – where right now, the answer is not enough.

The second big thing the Commission gets right is to recognise that the primary responsibility for financing public education lies with domestic governments, though of course international actors must play a role.

The third big thing the Commission gets right, though I would have liked to see it go further, is to acknowledge that while government has ultimate responsibility for guaranteeing access and regulating standards, when it comes to delivery it should be looking to partner with and learn from the best of the non-state sector (including private schools) to achieve its goals, through things like public-private partnerships.

The fourth big thing the Commission gets right is to link investment and reform, and to posit a virtuous circle between the two “in which investment in education leads to reform and results, and reform and results lead to new investment”. Gordon Brown’s influence is particularly visible here: money combined with reform was his mantra when, as Britain’s finance minister, he significantly increased government spending on public services.

This is an important rhetorical shift: too much of the education debate continues to suggest that money by itself is the answer, as evidenced by the plaintive cries for "a Bill Gates for education”. As the Commission notes, there is no evidence to suggest that simply spending more money would help (e.g. in India private schools achieve learning gains that are the same or better than government schools at a third of the unit cost). No one is suggesting that money doesn’t matter, but it doesn’t appear to be a driving force so much as a force multiplier: where the system is configured to deliver improvements in learning, more money helps; where it isn’t, it doesn’t.

The real test of the Commission’s impact will be whether the new rhetoric signals a more fundamental shift in the international education community’s theory of change. At the moment, there is a tendency for the official voices of that community to squander their considerable profile and platform on platitudes about how important education is. The implicit assumption seems to be that if only people could understand the importance of education, they would open their wallets and all would be well. By this logic, appointing Rihanna as a Global Ambassador for education makes sense because she has the star power to get out the message that education matters. 

But if Rihanna is the answer, what is the question? I totally buy Jamie Drummond’s argument that there is a place for pop culture in advocacy. But as he points out, celebrity is not enough without the right policy and the right political analysis.  Find me a single politician – actually forget that, find me a single person – who says education doesn’t matter. Ignorance of education’s importance is not a convincing explanation for why so many education systems are failing children on a truly industrial scale.

Much more convincing is the emerging work of the RISE programme that locates the problem in the politics of education reform. On this view, school systems fail because prevailing political incentives reward policy-making that won’t improve learning while punishing or at least not rewarding policy-making that might. The decline of development assistance to the education sector (at a time when funding for the health sector has grown strongly) in part reflects a fairly rational calculation on the part of donors about the prospects of getting a good return on their investment.

And here’s where the Commission’s Report is less compelling. In its definition of what constitutes quality, and its financial modelling of what it will take to deliver it, the Commission falls back on the sorts of policies that fall into the first category – things that don’t improve learning – not the second. For example, defining quality teaching by reference to increasing the supply of teachers with a tertiary degree leads the Commission to the rather implausible conclusion that 60% of tertiary graduates should be going into teaching, or that teacher salaries need to be 7 to 8 times GDP per capita.

Indeed, the whole exercise of putting a (rather eye-watering) price tag on a universal quality education feels premature when the evidence base about how to achieve it is essentially non-existent. Yes, there is a growing body of rigorous research about the sorts of things that improve learning outcomes (which essentially boil down to better, more differentiated pedagogy and stronger accountability), but very little about how these get diffused across actually existing education systems.

The bottom line is that the politics of raising quality are much, much tougher than the politics of increasing enrolment. Closing failing schools makes you much less popular than building them; getting rid of bad teachers wins you fewer friends than hiring them (actually, even finding out whether teachers are good or bad can make you pretty unpopular); and facing up to the hard facts about how little kids are learning takes a lot more courage than basking in a self-congratulatory glow about reaching universal enrolment.

The ideas in the report that feel most exciting are therefore the ones that feel like they have some potential to make the politics easier for reformers - in particular, the push for investing in internationally comparable learning assessments. Through peer pressure and the implicit threat to national prestige of slipping down the rankings, the OECD's PISA assessments have provoked important political conversations within education ministries in rich countries about the relative performance of their school systems in a way that hasn't happened as much in less developed countries, despite sterling work by the community-led learning assessment movement. 

"Without the ability to successfully navigate the politics of reform to build support for change", the Commission notes, "the best intentions will not lead to results." By making clear that the fate of the learning generation rests on the shoulders of true reformers, and that the responsibility of the international education community should be to support them (both financially and otherwise), the Commission has done an important service.

First year evaluation results show promise

Feedback is central to teaching and learning at Rising Academies. Students and teachers learn to give and receive feedback using techniques like Two Stars and a Wish or What Went Well...Even Better If. The Rising Academy Creed reminds us that "Our first draft is never our final draft." Given that, it would be pretty strange if Rising as an organisation didn't also embrace feedback on how well we are doing at enabling more children to access quality learning.

That's why, even as a new organisation, we've made rigorous, transparent monitoring and evaluation a priority from the outset. Internally, we've invested in our assessment systems and data. But my focus here is on external evaluation, because I'm excited to report that we have just received the first annual report from our external evaluators. If you want to understand the background to the study and our reactions to the first annual report, read on. If you're impatient and want to jump straight into the report itself, it's here.

Background

Last year, we commissioned a team led by Dr David Johnson from the Department of Education at Oxford University to conduct an independent impact evaluation of our schools in Sierra Leone.

The evaluation covers three academic years:

  • (The abridged) School Year 2016 (January-July)
  • School Year 2016-17 (September-July)
  • School Year 2017-18 (September-July)

The evaluation will track a sample of Rising students over those three years, and compare their progress to that of a comparison group of students drawn from both other private schools and government schools.

The overall evaluation will be based on a range of outcome measures, including standardised tests of reading and maths, a measure of writing skills, and a mixed-methods analysis of students' academic self-confidence and other learning dispositions.

The evaluation is based on what is known as a 'quasi-experimental' design rather than a randomised controlled trial (unlike our schools in Liberia, where we are part of a much larger RCT). But by matching the schools (on things like geography, fee level, and primary school exam scores), randomly selecting students within schools, and collecting appropriate student-level control variables (such as family background and socio-economic status) the idea is that it will ultimately be possible to develop an estimate of our impact over these 3 years that is relatively free of selection bias.
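
For readers curious what a value-added estimate of this kind looks like mechanically, here is a minimal sketch in Python using simulated data. Everything here is illustrative: the numbers, the variable names and the simple OLS specification are my assumptions for the sketch, not the evaluation team's actual model.

```python
import numpy as np

# Illustrative value-added sketch (made-up data, not the evaluation's
# actual model or results). We regress endline scores on baseline
# scores, a treatment indicator and an SES control; the coefficient
# on the treatment indicator is the impact estimate.
rng = np.random.default_rng(0)

n = 2000
treated = rng.integers(0, 2, n)                   # 1 = Rising student
ses = rng.normal(0, 1, n)                         # socio-economic status
baseline = 200 + 10 * ses + rng.normal(0, 15, n)  # baseline scaled score

true_effect = 20.0                                # hypothetical impact
endline = (baseline + 5 + true_effect * treated
           + 3 * ses + rng.normal(0, 10, n))

# OLS: endline ~ intercept + baseline + treated + ses
X = np.column_stack([np.ones(n), baseline, treated, ses])
coef, *_ = np.linalg.lstsq(X, endline, rcond=None)

impact_estimate = coef[2]  # coefficient on the treatment dummy
print(f"estimated impact: {impact_estimate:.1f} points")
```

Controlling for the baseline score is what makes this a 'value-added' estimate rather than a raw endline comparison: it nets out differences in where students started.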

Figure 1: How the evaluation defines impact


BASELINE

To make sure any estimate of learning gains is capturing the true impact of our schools, one of the most important control variables to capture is students' ability levels at baseline (i.e. at the start of the three-year evaluation period). This allows for an estimate of the 'value-added' by the student's school, controlling for differences in cognitive ability among students when they enrolled. Baselining for the evaluation took place in January and February 2016. The baseline report is available here. It showed:

  • That on average both Rising students (the treatment group) and students in the other schools (the comparison group) began their junior secondary school careers with similar ability levels in reading and maths. The two groups were, in other words, well matched;
  • That these averages were extremely low - for both reading and maths, approximately five grades below where they would be expected to be given students' chronological age.

YEAR ONE PROGRESS REPORT: RESULTS

The Year One Progress Report covers Academic Year 2016. The Ebola Crisis of 2014-15 disrupted the academic calendar in Sierra Leone. Students missed two full terms of schooling. The Government of Sierra Leone therefore introduced a temporary academic calendar, with the school year cut from three terms to two in 2015 (April-December) and again in 2016 (January-July). The normal (September-July) school year will resume in September 2016.

The Progress Report therefore covers a relatively short period - essentially 4.5 months from late January when baselining was undertaken to late June when the follow-up assessments took place. It would be unrealistic to see major impacts in such a short period, and any impacts that were identified would need to be followed up over the next two academic years to ensure they were actually sustained. As the authors note, "it is a good principle to see annual progress reports as just that – reports that monitor progress and that treat gains as initial rather than conclusive. A more complete understanding of the extent to which learning in the Rising Academy Network has improved is to be gained towards the end of the study."

Nevertheless, this report represents an important check-in point and an opportunity for us to see whether things look to be heading in the right direction.

Our reading of the Year One report is that, broadly speaking, they are. To summarise the key findings:

  • The report finds that Rising students made statistically significant gains in both reading and maths, even in this short period. Average scaled scores rose 35 points in reading (from 196 to 231) and 36 points in maths (from 480 to 516). To put these numbers in context, this change in reading scores corresponds to 4 months' worth of progress (based on the UK student population on which these tests are normed) in 4.5 months of instruction.
  • These gains were higher than for students in comparison schools. The differences were both statistically significant and practically important: in both reading and maths, Rising students gained more than twice as much as their peers in other private schools (35 points versus 13 points in reading, and 36 points versus 4 points in maths). Students in government schools made no discernible progress at all in either reading or maths. (For the more statistically inclined, this represents an effect size of 0.39 for reading and 0.38 for maths relative to government schools, or 0.23 for reading and 0.29 for maths relative to private schools, which is pretty good in such a short timespan.) 
  • The gains were also equitably distributed, in that the students who gained most were the students who started out lowest, and there were no significant differences between boys and girls.
  • Finally, there are early indications that students' experience of school is quite different at Rising compared to other schools. Rising students were more likely to report spending time working together and supporting each other's learning, and more likely to report getting praise, feedback and help from their teachers when they get stuck.
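
To see how those effect sizes relate to the raw point gains, here is a small illustrative calculation of Cohen's d, the standardised mean difference. The gains are the report's; the standard deviations and sample sizes are hypothetical placeholders I've chosen for the sketch, since the underlying SDs aren't reproduced here.

```python
# Cohen's d: difference in mean gains divided by the pooled standard
# deviation. Gains below are from the report; SDs and sample sizes
# are made-up placeholders, purely to illustrate the calculation.

def cohens_d(gain_treat, gain_comp, sd_treat, sd_comp, n_treat, n_comp):
    """Standardised mean difference between two groups' gains."""
    pooled_var = (((n_treat - 1) * sd_treat**2 + (n_comp - 1) * sd_comp**2)
                  / (n_treat + n_comp - 2))
    return (gain_treat - gain_comp) / pooled_var**0.5

# Reading: Rising gained 35 points vs 0 for government schools (report);
# SDs of 90 and 300 students per group are illustrative assumptions.
d = cohens_d(35, 0, 90, 90, 300, 300)
print(round(d, 2))  # 0.39 with these illustrative SDs
```

The point of the standardisation is comparability: a 35-point gain means little on its own, but expressed as a fraction of the spread of student scores it can be compared across tests and studies.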

That's the good news. What about the bad news? The most obvious point is that in absolute terms our students' reading and maths skills are still very low. They are starting from such a low base that one-off improvements in learning levels are not good enough. To catch up, we need to sustain and accelerate these gains over the next few years.

That's why, for example, we've recently been partnering with Results for Development to prototype and test new ways to improve the literacy skills of our most struggling readers, including a peer-to-peer reading club.

So what are my two stars and a wish?

  • My first star is that our students are making much more rapid progress in our schools than they did in their previous schools, or than their peers are making in other schools they might have chosen to attend;
  • My second star is that these gains are not concentrated in a single subset of higher ability students but widely and equitably shared across our intake;
  • My wish is that we find ways to sustain these gains next year (particularly as we grow, with 5 new schools joining our network in September 2016) and accelerate them through innovations like our reading club. If we can do that, and with the benefit of 50% more instructional time (as the school year returns to its normal length), we can start to be more confident we are truly having the impact we're aiming for.

Take a look at the report yourself, and let us know what you think. Tweet me @pjskids or send me an email.

Rising Academy Partnership Schools launched in Liberia

Monday September 5th marks the start of the new school year in Liberia - and with it the start of a new relationship between Rising Academies and the Government of Liberia.

From today, five government elementary schools – three in Bomi County and two in Montserrado County – will become Rising Academy Partnership Schools. They will remain in public ownership, free to attend and non-selective, using qualified government teachers on the government payroll, observing the Liberian National Curriculum, and with government retaining responsibility for the physical upkeep of the school buildings. But responsibility for the day-to-day management of the schools and for improving the quality of teaching and learning will pass to Rising.

This effort is part of Partnership Schools for Liberia (PSL), a bold and deliberately experimental pilot programme to explore whether bringing in operators from outside government can help address the chronic crisis of education quality in the public system.

The case for change is compelling: for every 100 children of primary school age in Liberia, only 38 attend primary school, of whom only 23 will complete grade 6, of whom only about 8 will make it through secondary school and sit the WAEC exams at the end of grade 12, of whom only about 4 will pass. Along the way, even the kids in school aren’t learning what they need. To be considered fluent readers, Grade 3 students given an age-appropriate text ought to be able to read 45 to 60 words per minute correctly; in Liberia, the average is less than 20 words per minute.
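
The attrition in that pipeline can be made concrete with a few lines of arithmetic. The stage counts are the figures quoted above; the conditional rates are derived from them:

```python
# Liberia's enrolment-to-pass pipeline, per 100 children of primary
# school age, using the figures quoted in the post.
stages = [
    ("all children", 100),
    ("attend primary school", 38),
    ("complete grade 6", 23),
    ("sit WAEC exams at grade 12", 8),
    ("pass WAEC exams", 4),
]

# Conditional survival: what share of each stage makes it to the next.
rates = [n_b / n_a for (_, n_a), (_, n_b) in zip(stages, stages[1:])]
for ((label_a, _), (label_b, _)), r in zip(zip(stages, stages[1:]), rates):
    print(f"{label_a} -> {label_b}: {r:.0%}")
```

Seen this way, every single stage loses at least a third (and as much as nearly two thirds) of the children who reached it.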

Much has been written about the PSL programme – in The New York Times, The Guardian, Vox and HuffPo among others. Unfortunately, early (mis)conceptions about the programme have proved hard to shake. For a clear and comprehensive account of the programme from two people who actually know what they are talking about, this piece by Susannah Hares and Justin Sandefur is the best place to start.

Here’s the short version: under PSL, 90 primary schools (less than 3% of the total) will be handed over to outside operators to run in 2016-17. As well as Rising, these operators include international NGOs like BRAC, Streetchild and More Than Me, private school operators like Omega Schools and Bridge International Academies, and Liberian organisations Stella Maris Polytechnic and the Liberia Youth Network. As a relatively new organisation, Rising is proud to be among such distinguished company.

Operators are paid a fixed per capita grant for each student enrolled, and then held accountable for using this money to improve learning outcomes for students. If operators do well, they might be allowed to expand to manage additional schools in future; if they fail, they might be stripped of the schools they are running. If the programme as a whole shows it can make a difference to the quality of schooling, it might be expanded; if the programme as a whole fails, it will be shut down.

The key point is that these decisions will be based on rigorous evidence. A major attraction of the programme is that it gets beyond unhelpful and ideological debates about who should run schools and focuses on getting the data. PSL has been designed as a ‘gold standard’ randomised controlled trial, with schools randomly assigned to a treatment condition where they are run by a PSL operator or a control condition where they are not. Comparing what happens in these two groups of schools over time should therefore provide a reliable estimate of what difference (if any) it makes to have an outside operator. Formal details of the evaluation are on the AEA’s RCT Registry here.

The speed at which the PSL programme has moved from idea to implementation is staggering, particularly in a country where, as Liberia’s President Ellen Johnson Sirleaf has admitted, ambition is often thwarted by a lack of capacity in the system to get things done. Some operators have had a head-start, but in Rising’s case we only submitted our initial Expression of Interest on 4th May, only got the green light from government in mid-July, and were only notified which schools we were being asked to run in early August (and in one case even later than that).

Nevertheless, it was absolutely right for the Government to try to move quickly, rather than delay another whole year and risk losing momentum, and we have been doing our best to make the most of the limited time available.

Candidates attend Rising's assessment centre

A major focus has been on staffing, screening existing staff and undertaking an urgent recruitment exercise. Our initial assessment showed that our schools had less than half the number of teachers they were supposed to have. In one case, a Principal had been assigned a teacher who had collected a salary for three years but never showed up. On August 23rd, we held an assessment centre for more than 200 recent teacher training graduates, with 12 candidates appointed to take up positions in our schools.

Teachers work through a card exercise during training

A second major focus has been on training. Through our work in Sierra Leone, we have developed an effective teacher pre-service training programme. Rather than lots of theory, teachers are given a small number of specific, high impact, practical skills to practise and receive feedback on. Rapid improvements in the level and quality of student engagement are possible as a result. Without the time to do the programme in full, 28 existing teachers received an abridged version of the training, with intensive in-service training taking place each afternoon for the first two weeks of term, and further training scheduled for later in the term.

Many schools lack basic infrastructure

A third focus has been on fixing some of the basic infrastructure in the schools. Bigger issues like leaky roofs remain the responsibility of government. But most of the schools didn’t have enough desks and chairs, and those they had were in disrepair, so we’ve procured more than 250 items of furniture to address the most urgent needs.

Finally, one of the freedoms granted to operators under PSL is to innovate with how the National Curriculum gets delivered. Some elements of our approach, like our highly effective phonics programme in partnership with Phonics International, remain just as relevant in Liberia as they do in Sierra Leone. But in other areas our international team of curriculum writers have been hard at work producing lesson plans and materials that will be appropriate for this new context.

With PSL, Rising embarks on a new and unfamiliar journey: a new country, and a new way of working. Among the teachers and principals, there is already a sense of excitement about what we might achieve together. Among parents too: student enrolment is up as parents get to hear about our new role in their local school.

Unlike so many traditional education programs which seek to raise the quality of outputs simply by increasing the number of inputs, PSL starts by correctly identifying the source of the education crisis as the way that schools are managed and held to account. With the right management and accountability, rapid improvements in student achievement are possible; without them, the system will continue to fail the children who need it most.

We wanted to be part of PSL because, when the history of this brave reform initiative is written, we want the world to know we did our best to make it a success.

If you are interested in learning more about Rising Academy Partnership Schools, tweet me @pjskids or send me an email.

Evaluation baseline report available

Rising Academies has commissioned a team from Oxford University to complete an independent impact evaluation of our work in Sierra Leone. The study will track the progress of a sample of Rising Academy students over three academic years, and benchmark this against the progress of a comparison group of students from government and private schools in Sierra Leone. The main outcome measures are tests of reading and maths. The team is also looking at writing skills, as well as academic self-confidence and learning dispositions.

The team conducted baseline assessments for English and Maths earlier this year. You can read the full baseline report here or the executive summary here.

The first follow-up assessments were completed in July, and so an annual progress report will be available later this year.

RAN joins CEI database

We're pleased to announce that the Rising Academy Network is the latest organization to be profiled in the Center for Education Innovations online database. CEI "seeks to fill the gaps in global understanding about innovative education programs striving to increase access to quality education for students in low income communities." CEI's database features more than 650 innovative education programs across 145 countries, and attracts over 10,000 visitors a month.

You can find RAN's profile here.

Simple, but not easy? Tackling the learning crisis

Across the developing world, more children are in school. We should celebrate that, while acknowledging that the job is not yet done: in Nigeria alone, 10.5m children are out of school.

Nevertheless, it is time to move beyond a focus on getting kids into school and start focusing on the quality of the education they receive when they get there.

When will they ever learn?

In many parts of the world, we have created a learning crisis: more kids are in school, but they are not learning. “We are failing the children on a massive scale,” says celebrated development economist Esther Duflo, John Bates Clark Medal winner and author of Poor Economics. “There has been improvement in enrolment and in the physical capacity of schools. But learning is not about enrolment, teacher-student ratio, having latrines in school; it’s about if we are serious about learning.”

In September, world leaders will get together and agree that improving the quality of education should be one of the Sustainable Development Goals that replace the MDGs. Something must be done, they will say. But what?