League tables

Chris Cook

Later this morning, Michael Gove, education secretary, will announce several big things. First and foremost, he is dropping his plan to introduce the EBC, his proposed new qualification for 16 year-olds, which has been attacked as fatally flawed since its announcement. Second, he will unveil details of the new curriculum. Both will deservedly absorb lots of column inches.

But Mr Gove will also announce a new pair of measures by which league tables will be constructed. This change might actually be the most important thing he does during his entire reign. League tables set out the incentives that drive schools. They define success and failure.

So what do we know? Schools will, first, be assessed on the share of pupils getting Cs or better in English and maths. A second measure will record whether children in each school do better or worse than children of similar ability – as measured by standardised tests at the age of 11.

This value-added score will gauge performance across English and maths, as well as three more core subjects and their three best ‘other’ subjects. This replaces the current measure – a crude tally of how many children get Cs or better in English, maths and three other subjects.
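As a rough sketch of how these two headline measures might be computed from pupil-level data – the column names, grade points and tiny example frame below are my own illustrative assumptions, not the DfE's specification – something like this would do:

```python
import pandas as pd

# Illustrative pupil-level data: school, prior-attainment band at 11, GCSE grades.
# Column names and the point values are assumptions for this sketch only.
pupils = pd.DataFrame({
    "school":   ["A", "A", "B", "B"],
    "ks2_band": [3, 4, 3, 5],               # standardised-test band at age 11
    "english":  ["C", "B", "D", "A"],
    "maths":    ["C", "A", "E", "A"],
    "other_six_points": [30, 38, 22, 44],   # points from 3 more core + 3 best 'other' subjects
})

points = {"A*": 8, "A": 7, "B": 6, "C": 5, "D": 4, "E": 3, "F": 2, "G": 1}

# Measure 1: share of each school's pupils with a C or better in both English and maths.
good_pass = (pupils["english"].map(points) >= points["C"]) & \
            (pupils["maths"].map(points) >= points["C"])
measure_1 = good_pass.groupby(pupils["school"]).mean()

# Measure 2 (value added): each pupil's total across the eight subjects, compared
# with the average total of pupils in the same age-11 band, then averaged by school.
pupils["total"] = (pupils["english"].map(points) + pupils["maths"].map(points)
                   + pupils["other_six_points"])
expected = pupils.groupby("ks2_band")["total"].transform("mean")
measure_2 = (pupils["total"] - expected).groupby(pupils["school"]).mean()

print(measure_1, measure_2, sep="\n")
```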

 Read more

Chris Cook

Today is league table day, when school exam results are published. The most interesting part of the table is the bottom: 195 schools* are below the government’s “floor targets”. These schools are at risk of being taken over by a third party to turn them around (if the process is not already underway).

Schools in this category have fewer than 40 per cent of their pupils getting Cs or better in English, maths and three other subjects. They must also have fewer than 70 per cent of their pupils making “expected progress” in both English and maths.
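A minimal sketch of that test, assuming the thresholds are applied exactly as stated above (the function and argument names are mine):

```python
# Floor-target test as described above: a school is below the floor only if it
# fails BOTH the attainment condition and the progress conditions.
def below_floor(pct_5ac_incl_em, pct_progress_english, pct_progress_maths):
    """Return True if the school falls below the floor target."""
    low_attainment = pct_5ac_incl_em < 40          # under 40% get Cs+ in English, maths and 3 others
    low_progress = (pct_progress_english < 70      # under 70% make expected progress in English
                    and pct_progress_maths < 70)   # ... and in maths
    return low_attainment and low_progress

print(below_floor(35, 65, 60))  # True: below on attainment and on both progress measures
print(below_floor(35, 75, 60))  # False: enough pupils make expected progress in English
```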

A few system-level observations:

  • London does really well. Really well. Only 11 of its schools are below target. Only four are in inner London. The outer boroughs are now a bigger educational problem than the inner city.
  • None of the 164 selective schools was below the floor. Grammar schools cruise past the floor target, because they select bright kids. But some might repay a visit by the inspectors: three of them made less-than-expected progress in English.
  • About a third of the failing schools are already sponsor academies. The DfE has yanked on the convert-to-academy lever a lot. But that still leaves another 132 schools below the floor target that are not attached to sponsors.
  • The academy chains are not going to find it easy to take them on. The best academies are all in London. The worst schools are not. There is limited really good improvement capacity in chains outside the capital, where it is needed.
  • The converter academies were not all good schools. Already, there are 14 which are not meeting the standard. The DfE’s terror of sorting out struggling academies is going to become an ever-bigger problem.

And here is the data: first, what types of school are below the floor target. For the neophytes, voluntary-aided and -controlled schools are the two types of English religious schools. Foundation schools and CTCs are types of school with more independence from their local authorities than others (both are precursors to the academies). Studio Schools and UTCs are types of employer-led school (see here for more on them):

School type | Above target | Below target
Sponsor academies | 249 | 63
Converter academies | 666 | 14
CTCs | 3 | 0
Community schools | 897 | 72
Free schools | 5 | 0
Foundation schools | 475 | 33
Studio Schools | 1 | 0
UTCs | 2 | 0
Voluntary aided | 399 | 11
Voluntary controlled | 64 | 2

Second, this is where the struggling schools are. I have broken this down by the GCSE-age school population, which brings out some of the variation in regional school quality more clearly.

Region | GCSE pupils in above-target schools | GCSE pupils in below-target schools | Proportion of pupils affected
East Midlands | 41,799 | 4,490 | 9.7%
East of England | 56,847 | 4,054 | 6.7%
London | 71,817 | 1,746 | 2.4%
North East | 25,450 | 1,782 | 6.5%
North West | 70,592 | 4,802 | 6.4%
South East | 78,976 | 5,113 | 6.1%
South West | 50,789 | 1,736 | 3.3%
West Midlands | 57,328 | 3,194 | 5.3%
Yorkshire and the Humber | 51,500 | 4,359 | 7.8%
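The final column is simply the below-target count as a share of all GCSE-age pupils in the region; a quick check, using two of the rows above:

```python
import pandas as pd

# Recompute 'proportion of pupils affected' for two regions from the table above.
df = pd.DataFrame({
    "region": ["London", "East Midlands"],
    "above":  [71_817, 41_799],
    "below":  [1_746, 4_490],
})
df["proportion_affected_pct"] = (100 * df["below"] / (df["above"] + df["below"])).round(1)
print(df)  # London 2.4%, East Midlands 9.7%
```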

 Read more

Chris Cook

Last week, the TES, the leading UK teachers’ magazine, ran a number of fascinating pieces on the “EBC”, the proposed successor to the GCSE – the exam taken by English children at the age of 16. The basic point is that the Department for Education has come up with a plan for a new qualification that, as has been made public, is causing grave concern within Ofqual, as well as among school leaders, inspectors and the department’s own civil servants.

When the plan to reform GCSEs was originally leaked to the Daily Mail, it contained the claim that the new GCSE would only be for the brightest three-quarters of children. I wrote at the time that this would be problematic. The Lib Dems insist this aspect of the plan has gone. Some rightwingers appear to hold the opposite impression.

For their part, DfE officials are working under the assumption that children will need to know more to reach the lowest passing grade on the new qualification. But they also assume children will respond to the exam changes by learning more, so no more children will fail. This is, it is fair to say, an assumption resting on a rather thin evidence base.

Would it matter if this were to be wrong, and children were to leave with no qualifications, rather than getting an F or a G? After all, it is certainly true that an F or a G gives a pupil very little labour market benefit. For pupils themselves, these lower grades primarily act as a guide to how much further they have to go.

But the main benefit of awarding Fs and Gs at GCSE is to the school system. They mean that schools do not have strong incentives to pick weaker pupils out for other, easier exams. And keeping such students on the GCSE track means they have some chance of getting a C or better, even if teachers misread their ability early on.

If you reform the system such that the exam does not measure the ability of more children, this important benefit will, one way or another, be eroded. And who will be affected? Once again, it is the children in the poorest neighbourhoods.

To illustrate this, the graph describes an exam system that works on the basis that 95 per cent of people will get some kind of passing grade – however low. I have used the average GCSE grade for each child in a mainstream state school as a proxy for their overall academic ability, and assumed that the five per cent with the lowest grades would fail under the new system. This is a bit rough ‘n’ ready, but is good enough for our purposes.

So what happens if a notional new examination excluded the bottom 5 per cent of children on this measure? How many would fail and so be “excluded” from measurement? You can see that a child in the poorest neighbourhoods has a 10 per cent chance of being in this band – twice the national average.

I have added a second band: “at risk”. This takes in the next 10 per cent of children, too. Schools might – wrongly – guess they will be below the line. Again, this band skews poor.
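For what it’s worth, the banding exercise itself is simple; this sketch uses synthetic data (the ability proxy and deprivation decile are stand-in columns, so it reproduces the method, not the real skew):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Synthetic pupil-level data: average GCSE points as the ability proxy, plus a
# neighbourhood-deprivation decile (1 = poorest). Stand-ins for the real data.
pupils = pd.DataFrame({
    "avg_gcse_points":    rng.normal(40, 10, 10_000),
    "deprivation_decile": rng.integers(1, 11, 10_000),
})

# 'Fails' band: the bottom 5 per cent nationally on the ability proxy.
# 'At risk' band: the next 10 per cent, whom schools might wrongly judge to be below the line.
fail_cut = pupils["avg_gcse_points"].quantile(0.05)
risk_cut = pupils["avg_gcse_points"].quantile(0.15)
pupils["band"] = np.select(
    [pupils["avg_gcse_points"] < fail_cut, pupils["avg_gcse_points"] < risk_cut],
    ["fails", "at risk"], default="passes")

# Share of pupils in each band, by neighbourhood deprivation decile.
print(pd.crosstab(pupils["deprivation_decile"], pupils["band"], normalize="index").round(3))
```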

 Read more

Chris Cook

This week, I have written a fair amount about England’s schools, and how well the capital does. I thought that today, I would publish some data that will help explore some finer differences: how well do children do at a borough level?

Below the fold, I have worked out the FT score for each child (a score based on their performance in English, maths and three other GCSEs). I then ran a regression through the data, which predicts performance based on background and by local area.
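In outline, the exercise looks something like this – the frame below is a tiny illustrative stand-in for the pupil-level data, and the background controls are my own choice of example variables:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative pupil-level records: FT score plus background measures and borough.
pupils = pd.DataFrame({
    "ft_score": [28.0, 34.5, 22.0, 40.0, 31.0, 26.5],
    "fsm":      [1, 0, 1, 0, 0, 1],                      # free school meal eligibility
    "idaci":    [0.41, 0.12, 0.55, 0.08, 0.20, 0.47],    # neighbourhood deprivation index
    "borough":  ["Hackney", "Richmond", "Hackney", "Richmond", "Camden", "Camden"],
})

# Regress the FT score on pupil background with a fixed effect for each borough.
# The borough coefficients then describe attainment for an otherwise-similar
# child in each area, relative to the omitted reference borough.
model = smf.ols("ft_score ~ fsm + idaci + C(borough)", data=pupils).fit()
print(model.params)
```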

This is, in effect, a similar exercise to the one in benchmarking school systems, and has all the same caveats. But this time around, the objective is to get a steer on how levels of attainment vary in different boroughs for an individual child of similar social circumstances. Read more

Chris Cook

I wrote a piece yesterday on the continued astonishing rise of London’s state schools. One of my brilliant colleagues posed an interesting question: what happens if a child moves into London?

Below, I have published how children who lived outside London at the age of 11 went on to do in their GCSEs (using our usual point score) at the age of 16.

I have divided this set of pupils twice: first, by whether they had moved into London by the age of 16 or not and second by how well they did in standardised tests at the age of 11.
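The split is a straightforward cross-tabulation; a sketch with stand-in column names:

```python
import pandas as pd

# Illustrative pupil-level records linking age-11 test band, whether the pupil
# had moved into London by 16, and GCSE points. Column names are assumptions.
pupils = pd.DataFrame({
    "ks2_band":        ["low", "middle", "high", "middle", "high", "low"],
    "moved_to_london": [False, False, False, True, True, True],
    "gcse_points":     [24.0, 31.0, 41.0, 35.0, 44.0, 27.0],
})

# Mean GCSE point score for movers vs non-movers, within each age-11 ability band,
# so the comparison is like-for-like on prior attainment.
print(pupils.pivot_table(index="ks2_band", columns="moved_to_london",
                         values="gcse_points", aggfunc="mean"))
```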

 Read more

Chris Cook

In today’s Times, Greg Hurst writes about concerns that some academy chains might be a bit overstretched and find it difficult to continue growing. It has been pretty well established that the first round of “sponsor” academy takeovers was a success. The chains definitely improved the failing schools that they took over.

But some of the groups mentioned by Greg are not doing that well. To start with, here is a sample of academy chains’ results, using the government’s favourite measure: what proportion of children got Cs or better in English, maths and three other GCSE subjects in 2011? I have only included schools that have been under their current leadership for three full years or more.

Provider | Total | Low prior attainment | Mid prior attainment | High prior attainment
Academies Enterprise Trust (AET) | 55.8% | 28% | 74% | 95%
Ark Schools | 63% | 70% | 83% | 93%
The Bourne Family Trust | 82% | 88% | 75% | 100%
E-ACT | 40% | 28% | 78% | 100%
Harris Federation | 67% | 62% | 86% | 100%
Jack Petchey Foundation | 61% | 43% | 59% | 92%
Thomas Telford School | 57% | 47% | 89% | 100%
United Learning Trust (ULT) | 50% | 33% | 73% | 92%
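For readers who want to reproduce this sort of banded breakdown from pupil-level data, the shape of the calculation is roughly as follows (the column names and the tiny example frame are illustrative, not the underlying dataset):

```python
import pandas as pd

# Illustrative pupil-level records: chain, prior-attainment band at age 11, and
# whether the pupil got Cs or better in English, maths and three other GCSEs.
pupils = pd.DataFrame({
    "provider": ["Ark Schools", "Ark Schools", "E-ACT", "E-ACT", "Harris Federation"],
    "pa_band":  ["low", "high", "low", "mid", "mid"],
    "passed":   [1, 1, 0, 1, 1],   # 1 = Cs or better in English, maths and 3 others
})

# Pass rate per chain, overall and within each prior-attainment band.
overall = pupils.groupby("provider")["passed"].mean()
by_band = pupils.pivot_table(index="provider", columns="pa_band",
                             values="passed", aggfunc="mean")
print(overall, by_band, sep="\n")
```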

 Read more

Chris Cook

Over the weekend, Ofqual announced it will examine the English modules that have caused so much concern lately, where many children who expected Cs were given Ds. This will focus on chunks of the new AQA English GCSE and, one assumes, take in the equivalent OCR and Edexcel* qualifications.

This is a brief blogpost to explain why this matters so much to schools (beyond the fact that they want their pupils to do well). First, we start with a very simple chart using 2011 data: for each school, I have worked out the share of children passing English, maths and three other GCSEs with a C grade or above.

This measure (the “PACEM” metric) matters: it is the figure that is used to rank schools, and to decide whether they get shut down or not. A school where fewer than 40 per cent of students clear this line is at risk of a forced change of management.

So I have ranked schools on this measure, bundled them into percentiles, and lined them up so that the lowest-ranked schools are at the left and the best are at the right.

For each percentile of schools, I have published two numbers:

  • The red section shows the share of pupils who passed on the PACEM measure, but only got a C in English. That is to say, pupils for whom a one-grade drop in English means falling below the PACEM waterline.
  • The blue section indicates children who passed with a higher grade in English. The two areas are stacked one on top of the other, so the line marking out the top of the blue section indicates the total pass rate.
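The construction of the chart is easier to see in code than in prose; this sketch uses synthetic school-level figures (both columns are stand-ins), so it shows the method rather than the real 2011 numbers:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Synthetic school-level data: each school's PACEM pass rate, and the share of
# its passers whose English grade was exactly a C. Stand-ins for the 2011 data.
schools = pd.DataFrame({
    "pacem_pass_rate": rng.uniform(0.2, 1.0, 3000),
    "c_english_share": rng.uniform(0.1, 0.5, 3000),
})

# Rank schools on the PACEM measure and bundle them into 100 percentile bins, so
# the lowest-ranked schools sit at the left of the chart and the best at the right.
schools["percentile"] = pd.qcut(schools["pacem_pass_rate"].rank(method="first"),
                                100, labels=False)

# 'Red' band: passers who only got a C in English (one grade from falling below
# the line). 'Blue' band: passers with a higher English grade. Stacked, the top
# of the blue band is the total pass rate.
schools["red"] = schools["pacem_pass_rate"] * schools["c_english_share"]
schools["blue"] = schools["pacem_pass_rate"] - schools["red"]
print(schools.groupby("percentile")[["red", "blue"]].mean().head())
```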

 Read more

Chris Cook

UPDATE: 2 October 2012, to incorporate the latest ratings.

It’s official. Well, sort of. I’ve collected up the credit ratings that exist for the higher education sector, and all of those British universities (or university colleges) which have been rated are either prime or high-grade. (Italy, meanwhile, is not an Ivy League debt repayer.)

Institution | Rating | Outlook | Rating issuer
University of Cambridge | Aaa | Stable | Moody’s
St Peter’s College, Oxford | AAA | Negative | Fitch
Lincoln College, Oxford | AAA | Negative | Fitch
Somerville College, Oxford | AAA | Negative | Fitch
Keele | Aa1 | Negative | Moody’s
Brunel | Aa1 | Negative | Moody’s
De Montfort University | Aa1 | Negative | Moody’s
Kings College, London | AA | Stable | S&P
Lancaster University | A+ | Positive | S&P
Nottingham, University of | AA- | Stable | S&P
Sheffield, University of | AA- | Stable | S&P

 Read more

Chris Cook

New Hefce data show England is experiencing the start of a market in undergraduate places with a very sudden shock. Read more

Chris Cook

Your birthday matters: children who are older when they start school as 4 year-olds outperform their peers. This is not a small effect, nor does it peter out as they get older. We can spot it easily at the national level among 16 year-olds. Read more

Chris Cook

The social mobility problem is not that there is a small number of weak schools serving a lot of poor kids. It is that poor children do badly in the majority of England’s schools. Read more

Chris Cook

An article in the TES, an education magazine, has caused some consternation – and rightly so. In a comment piece, written by a teacher, the author appears to describe being irritated at a child who is determined to get an A grade rather than a B at A-level.

That is not what the government wants this teacher to be doing. We can tell that from the incentives that this sixth-form teacher faces. The author works at a sixth form college, and if that child fails to get an “A”, it will show up in his college’s results. Sixth forms are ranked on the average grade attained by their students, and pushing a kid from a B to an A shows up in the school point score.

Were this teacher teaching a 16 year-old, however, his behaviour would be perfectly rational. The central measure for schools is the proportion of children getting passes of a C or better in five full GCSEs including English and maths.

Both regulation and league tables drive focus on that measure. There are buckets of data that reveal schools which are particularly focussed on that borderline, but as long as schools do well enough in the core measure, heads can safely ignore everything else.

As Graham Stuart, Tory chair of the education select committee, has said, this measure offers no reward for pressing a child to move from a C to an A. It is rational for teachers to focus on getting children over the D/C borderline.
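A toy illustration of that asymmetry – the grade points and the two measures below are simplified stand-ins for the real scoring rules:

```python
# Simplified grade points and two stylised school measures.
points = {"A": 7, "B": 6, "C": 5, "D": 4, "E": 3}

def threshold_measure(grades):
    """GCSE-style measure: share of pupils at grade C or better."""
    return sum(points[g] >= points["C"] for g in grades) / len(grades)

def average_points(grades):
    """Sixth-form-style measure: average grade points per pupil."""
    return sum(points[g] for g in grades) / len(grades)

before      = ["C", "C", "D", "B"]
c_pushed_up = ["A", "C", "D", "B"]   # one pupil pushed from a C to an A
d_pushed_up = ["C", "C", "C", "B"]   # one pupil pushed over the D/C borderline

print(threshold_measure(before), threshold_measure(c_pushed_up))  # 0.75 -> 0.75: no reward
print(threshold_measure(before), threshold_measure(d_pushed_up))  # 0.75 -> 1.0: rewarded
print(average_points(before), average_points(c_pushed_up))        # 5.0 -> 5.5: only this measure rewards C -> A
```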

This measure also creates problems for those of us who follow DfE statistics. Read more