GCSE

Chris Cook

Today, I gave a brief presentation – based on our previous stories – on the performance of London schools to the excellent Centre for London. Some slides are a little mysterious without my burbling over the top, but I hope it’s understandable enough.


Chris Cook

Later this morning, Michael Gove, education secretary, will announce several big things. First and foremost, he is dropping his plan to introduce the EBC, his proposed new qualification for 16-year-olds, which has been attacked as fatally flawed since its announcement. Second, he will unveil details of the new curriculum. Both will deservedly absorb lots of column inches.

But Mr Gove will also announce a new pair of measures by which league tables will be constructed. This change might actually be the most important thing he does during his entire reign. League tables set out the incentives that drive schools. They define success and failure.

So what do we know? Schools will, first, be assessed on the share of pupils getting Cs or better in English and maths. A second measure will record whether children in each school do better or worse than children of similar ability – as measured by standardised tests at the age of 11.

This value-added score will gauge performance across eight subjects: English and maths, three more core subjects and each pupil’s three best ‘other’ subjects. It replaces the current measure – a crude tally of how many children get Cs or better in English, maths and three other subjects.
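To make the mechanics concrete, here is a minimal sketch of how a measure of this general shape could be computed – the column names and point values are invented, and it illustrates the idea of benchmarking a “best eight” total against pupils of similar prior ability, not the DfE’s actual methodology.

```python
import pandas as pd

# Toy data: GCSE points per subject plus a standardised test score at age 11.
# All column names and point values here are invented for illustration.
pupils = pd.DataFrame({
    "ks2_score": [22, 22, 28, 28, 31],
    "english":   [34, 40, 40, 46, 52],
    "maths":     [34, 40, 46, 46, 52],
    "science":   [28, 40, 40, 46, 52],
    "history":   [34, 34, 40, 46, 46],
    "geography": [28, 34, 40, 40, 52],
    "art":       [40, 34, 34, 46, 46],
    "music":     [28, 28, 40, 40, 52],
    "drama":     [34, 40, 34, 46, 46],
})

core = ["english", "maths", "science", "history", "geography"]
other = ["art", "music", "drama"]

# Total across English, maths, three more core subjects and the three best
# "other" subjects (in this toy example every pupil has exactly three others).
best_other = pupils[other].apply(lambda row: row.nlargest(3).sum(), axis=1)
pupils["total"] = pupils[core].sum(axis=1) + best_other

# Value added: how far each pupil sits above or below the average total
# achieved by pupils with the same score at age 11.
pupils["value_added"] = (pupils["total"]
                         - pupils.groupby("ks2_score")["total"].transform("mean"))
print(pupils[["ks2_score", "total", "value_added"]])
```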


Chris Cook

There is an iron law in English education: as any given argument about any problem with schools progresses, the probability that someone will claim grammar schools are the solution rapidly tends towards 1.

I thought I would set out the data on the grammar counties, where children are sorted at the age of 11 according to an academic test.

To do this, I have defined a new region of England: Selectivia. I have removed the biggest selective counties – Kent, Lincolnshire, Medway and Buckinghamshire – from their geographical regions and shoved them together into one new region*. So what is it like? First, you can see that this region is quite well off, compared to most regions, especially London.

Region | IDACI score | FSM share
East Midlands | 0.195 | 12.0%
East of England | 0.168 | 9.2%
London | 0.340 | 22.4%
North East | 0.245 | 17.4%
North West | 0.233 | 16.2%
Selectivia | 0.162 | 8.8%
South East | 0.150 | 8.3%
South West | 0.164 | 9.4%
West Midlands | 0.236 | 16.4%
Yorkshire and the Humber | 0.216 | 14.6%
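For anyone wanting to replicate the construction, here is a rough sketch of the reassignment involved, assuming a pupil-level table with invented column names (la, region, idaci, fsm) – the figures below are placeholders, not the real data.

```python
import pandas as pd

# Hypothetical pupil-level data: local authority, geographical region,
# neighbourhood IDACI deprivation score and free-school-meal eligibility.
pupils = pd.DataFrame({
    "la":     ["Kent", "Camden", "Lincolnshire", "Leeds",
               "Medway", "Buckinghamshire"],
    "region": ["South East", "London", "East Midlands",
               "Yorkshire and the Humber", "South East", "South East"],
    "idaci":  [0.14, 0.35, 0.18, 0.22, 0.20, 0.10],
    "fsm":    [False, True, False, True, True, False],
})

# Pull the big selective counties out of their geographical regions
# and pool them into a single new region, "Selectivia".
selective = ["Kent", "Lincolnshire", "Medway", "Buckinghamshire"]
pupils.loc[pupils["la"].isin(selective), "region"] = "Selectivia"

# Average deprivation and FSM share by (redefined) region.
summary = pupils.groupby("region").agg(
    idaci_score=("idaci", "mean"),
    fsm_share=("fsm", "mean"),
)
print(summary)
```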


Chris Cook

Last week, the TES, the leading UK teachers’ magazine, ran a number of fascinating pieces on the “EBC”, the proposed successor to the GCSE – the exam taken by English children at the age of 16. The basic point is that the Department for Education has come up with a plan for a new qualification that, as has been made public, is causing grave concern within Ofqual, as well as among school leaders, inspectors and the department’s own civil servants.

When the plan to reform GCSEs was originally leaked to the Daily Mail, it contained the claim that the new GCSE would only be for the brightest three-quarters of children. I wrote at the time that this would be problematic. The Lib Dems insist this aspect of the plan has gone. Some rightwingers appear to hold the opposite impression.

For their part, DfE officials are working under the assumption that children will need to know more to reach the lowest passing grade on the new qualification. But they also assume children will respond to the exam changes by learning more, so no more children will fail. This is, it is fair to say, an assumption resting on a rather thin evidence base.

Would it matter if this were to be wrong, and children were to leave with no qualifications, rather than getting an F or a G? After all, it is certainly true that an F or a G gives a pupil very little labour market benefit. For pupils themselves, these lower grades primarily act as a guide to how much further they have to go.

But the main benefit of awarding Fs and Gs at GCSE is to the school system. They mean that schools do not have strong incentives to pick weaker pupils out for other, easier exams. And keeping such students on the GCSE track means they have some chance of getting a C or better, even if teachers misread their ability early on.

If you reform the system such that the exam does not measure the ability of more children, this important benefit will, one way or another, be eroded. And who will be affected? Once again, it is the children in the poorest neighbourhoods.

To illustrate this, this graph describes an exam system that works on the basis that 95 per cent of people will get some kind of passing grade – however low. I have used the average GCSE grade for each child in a mainstream state school as a proxy for their overall academic ability, and assumed that the five per cent with the lowest grades would fail under the new system. This is a bit rough ‘n’ ready, but is good enough for our purposes.

So what happens if a notional new examination excluded the bottom 5 per cent of children on this measure? How many would fail and so be “excluded” from measurement? You can see that a child in the poorest neighbourhoods has a 10 per cent chance of being in this band – twice the national average.

I have added a second band: “at risk”. This takes in the next 10 per cent of children, too. Schools might – wrongly – guess they will be below the line. Again, this band skews poor.
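As a hedged illustration of the calculation behind these bands – synthetic data, not the National Pupil Database – the sketch below marks the bottom 5 per cent of pupils as “failing” and the next 10 per cent as “at risk”, then looks at how those bands fall across neighbourhood deprivation percentiles.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Synthetic stand-in for the pupil data: an average GCSE point score per pupil
# (the ability proxy) and a neighbourhood deprivation percentile
# (1 = least deprived, 100 = most deprived). Purely illustrative numbers.
n = 100_000
deprivation = rng.integers(1, 101, size=n)
avg_points = rng.normal(loc=46 - 0.08 * deprivation, scale=8, size=n)
pupils = pd.DataFrame({"deprivation_centile": deprivation, "avg_points": avg_points})

# "Fail" band: the bottom 5 per cent nationally on the proxy measure.
# "At risk" band: the next 10 per cent above them.
fail_cut = pupils["avg_points"].quantile(0.05)
risk_cut = pupils["avg_points"].quantile(0.15)
pupils["band"] = np.select(
    [pupils["avg_points"] < fail_cut, pupils["avg_points"] < risk_cut],
    ["fail", "at risk"],
    default="pass",
)

# Share of each band within each deprivation percentile: the kind of
# calculation behind the "twice the national average" comparison.
shares = (pupils.groupby("deprivation_centile")["band"]
                .value_counts(normalize=True)
                .unstack(fill_value=0))
print(shares.loc[[1, 50, 100]])
```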


Chris Cook

On the Today programme last week, Sir Michael Wilshaw, the chief inspector of schools, announced that Ofsted, the inspectorate, will start trying to piece together which local authorities are good at driving school improvement and which are weak.

This plan, intended to focus fire on local government, could end up drawing attention to the Department for Education. This is because Sir Michael will hold local authorities to account for all local schools – including academies, the independent state-funded schools akin to US charter schools.

On the radio, he was up against David Simmonds, a Tory councillor from Hillingdon representing the Local Government Association, who pointed out that there is a particular problem with academies. He noted that academies, which now constitute half of all secondaries, answer directly to civil servants in the DfE – not to their local authority.

As a result, Mr Simmonds correctly pointed out that councils have no power to sort things out when it comes to academies. Sir Michael replied, also rightly, that good local authorities do it anyway: they lobby the DfE to take action. As it happens, a few days later, the TES reported that the pressure from the DfE on academies might soon become a bit more active.

But there remains a problem for local authorities if the DfE is slow-acting. This has been the experience of Islington – one of the fastest-improving boroughs in the country. Islington has urged the DfE to act on a struggling academy in the borough – the City of London Academy (COLA) – for some time. The COLA case study demonstrates that this can drag on and on: the department has been pestered about the school constantly.

We have some DfE officials’ notes on COLA from a year ago. Originally written for ministers, the notes explain the background and their position. Some betray a touch of irritation about the persistent London borough.

Chris Cook

This week, I have written a fair amount about England’s schools, and how well the capital does. I thought that today, I would publish some data that will help explore some finer differences: how well do children do at a borough level?

Below the fold, I have worked out the FT score for each child (a score based on their performance in English, maths and three other GCSEs). I then ran a regression through the data, which predicts performance based on background and local area.
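A minimal sketch of that sort of regression, using statsmodels on invented data and made-up column names (ft_score, idaci, borough) – an illustration of the approach rather than the exact model behind the published numbers:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Synthetic pupil-level stand-in: an FT score, a neighbourhood deprivation
# score and a home borough. Everything here is invented for illustration.
n = 5_000
boroughs = rng.choice(["Islington", "Hackney", "Barnet", "Croydon"], size=n)
idaci = rng.uniform(0, 0.6, size=n)
borough_effect = pd.Series({"Islington": 4, "Hackney": 3, "Barnet": 1, "Croydon": 0})
ft_score = (220 - 80 * idaci + borough_effect.loc[boroughs].to_numpy()
            + rng.normal(0, 20, size=n))
pupils = pd.DataFrame({"ft_score": ft_score, "idaci": idaci, "borough": boroughs})

# Regress attainment on background and borough: the borough coefficients give
# a steer on how an otherwise similar child fares in each area.
model = smf.ols("ft_score ~ idaci + C(borough)", data=pupils).fit()
print(model.params.filter(like="borough"))
```

The borough terms are then read as the attainment premium or penalty for a child of given circumstances living in each area, relative to the baseline borough.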

That regression is, in effect, a similar exercise to the one in benchmarking school systems, and has all the same caveats. But this time around, the objective is to get a steer on how levels of attainment vary between boroughs for an individual child of similar social circumstances.

Chris Cook

I wrote a piece yesterday on the continued astonishing rise of London’s state schools. One of my brilliant colleagues posed an interesting question: what happens if a child moves into London?

Below, I have published how children who lived outside London at the age of 11 went on to do in their GCSEs (using our usual point score) at the age of 16.

I have divided this set of pupils twice: first, by whether or not they had moved into London by the age of 16, and second, by how well they did in standardised tests at the age of 11.
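A rough sketch of that double split on synthetic stand-in data, with invented column names – it mirrors the structure of the comparison (movers versus non-movers, within age-11 ability bands) rather than reproducing the published table:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)

# Illustrative stand-in: children living outside London at age 11, their
# age-11 test scores, their eventual GCSE points, and whether they had
# moved into London by age 16. All numbers are made up.
n = 20_000
ks2 = rng.normal(28, 4, size=n)
moved = rng.random(n) < 0.03
gcse = 8 * ks2 + np.where(moved, 12, 0) + rng.normal(0, 30, size=n)
pupils = pd.DataFrame({"ks2_score": ks2, "moved_to_london": moved,
                       "gcse_points": gcse})

# Split twice: by mover status and by age-11 ability band (quintiles here),
# then compare average GCSE outcomes across the two groups.
pupils["ks2_band"] = pd.qcut(pupils["ks2_score"], 5, labels=[1, 2, 3, 4, 5])
table = pupils.pivot_table(index="ks2_band", columns="moved_to_london",
                           values="gcse_points", aggfunc="mean", observed=True)
print(table)
```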


Chris Cook

Last summer, there was an eruption of concern among schools that the GCSE English exam had suddenly been made harder by a change in grade boundaries. Ofqual, the exams regulator whose job it is to keep exams equally easy in all years, certainly intervened: what is not clear is whether it got that right, or made the exam too difficult.

A judge is considering whether the boundary-setting was conducted via a fair process. But we now have some data with which to look at the issue from the National Pupil Database. I have GCSE English (or English language) results and each candidate’s scores at the age of 11 (although not which exam they took, nor their exam board*).

Since the aim of boundary-setting is to keep exams equally difficult, and since Ofqual believes the school system has not improved, we can use these two results together to tell us something: similarly able children at the age of 11 should get roughly the same grade in 2011 and 2012. There are horribly complex ways to do this formally, but I am going for an intuitive method.
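The intuitive method amounts to something like the sketch below – entirely synthetic data and invented column names, shown only to make the shape of the comparison concrete: within each age-11 ability band, compare the share of candidates reaching a C in 2011 and in 2012.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)

# Invented stand-in for the matched data: each candidate's age-11 ability
# decile and GCSE English grade, for the 2011 and 2012 cohorts.
n = 50_000
year = rng.choice([2011, 2012], size=n)
ks2_band = rng.integers(1, 11, size=n)
grades = np.array(["G", "F", "E", "D", "C", "B", "A", "A*"])
# Grade loosely tracks prior ability, with noise.
grade_idx = np.clip((ks2_band - 1) * 0.7 + rng.normal(0, 1, size=n), 0, 7)
candidates = pd.DataFrame({"year": year, "ks2_band": ks2_band,
                           "grade": grades[grade_idx.round().astype(int)]})

# Within each prior-ability band, did the share of candidates reaching at
# least a C move between 2011 and 2012? Large shifts would suggest the
# boundary change made the exam harder (or easier) for similar children.
candidates["c_or_better"] = candidates["grade"].isin(["C", "B", "A", "A*"])
comparison = candidates.pivot_table(index="ks2_band", columns="year",
                                    values="c_or_better", aggfunc="mean")
print((comparison[2012] - comparison[2011]).round(3))
```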

Chris Cook

At the moment, the Department for Education is considering changes to the league tables and the exam system. This seems an opportune moment to make a simple point about qualification-awarding and accountability: English school examinations are subject to measurement error in a really big way.

Here is a simple thought experiment to flesh it out. Imagine a class of 100 students. Let us specify that each one has a “true” ability such that one pupil should get one mark, one pupil should get two, one should get three and so on – up to 100 marks. Now, let’s award this class one of 10 grades: 90+ gets you an A, 80+ a B and so on.

Let us assume that the tests are perfect. If that were the case, you would get ten individuals in each grade. Easy enough. But what happens if we start introducing errors into the test? We can do that with a set of exotically named (but very simple) “Monte Carlo” estimates, which I calculated using this simple spreadsheet.
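For readers who prefer code to spreadsheets, here is a minimal sketch of the same experiment (using bands of ten marks each, so the error-free case puts exactly ten pupils in every grade); it illustrates the method rather than reproducing the original spreadsheet.

```python
import numpy as np

rng = np.random.default_rng(4)

# 100 pupils whose "true" marks are 1, 2, ..., 100, graded into ten bands of
# ten marks each (91-100 is the top grade, 81-90 the next, and so on).
true_marks = np.arange(1, 101)

def to_grade(marks):
    return np.ceil(marks / 10).astype(int)   # 1 (lowest) .. 10 (highest)

true_grades = to_grade(true_marks)           # exactly ten pupils per grade

# Monte Carlo: add random marking error, re-grade, and count how often pupils
# land in a different grade from their "true" one.
def share_misgraded(error_sd, trials=10_000):
    wrong = 0
    for _ in range(trials):
        observed = np.clip(true_marks + rng.normal(0, error_sd, size=100), 1, 100)
        wrong += np.sum(to_grade(observed) != true_grades)
    return wrong / (trials * 100)

for sd in (2, 5, 10):
    print(f"error sd = {sd:>2}: {share_misgraded(sd):.1%} of pupils misgraded")
```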

Chris Cook

Last week, I went to Wolverhampton where I spoke at a local debate, organised by the university and Pat McFadden, the local MP, about the local authority’s schools. I was the warm-up act for Lord Adonis, former schools minister, setting the scene about the city’s education system before his talk on lessons in school improvement.

It was an interesting event – and the city is clearly considering its future and the role of education within it. There is – judging by my inbox – serious and deep interest in improving schools in the city. One of the things I sought to do was set out Wolvo’s position in relation to the rest of the country – and what statistics about the city tell us.

Here is my presentation:

Chris Cook

Last week, the excellent Paul Francis, political editor of the Kent Messenger, reported that Kent, the most significant selective county left in England, had come up with a clever plan: to make the entry test for grammar schools “tutor-proof”.

This idea comes up a lot, largely from people promoting selection. You can see why: it is often presented as a way of squaring a circle, letting supporters argue that grammar schools help bright poor children while dealing with the fact that very few poor children get into them.

But, in truth, a properly administered test, which accurately captures the education enjoyed by children at the age of 11, should exclude large numbers of poor children – not because they are intrinsically less able, but because, at 11, the poor-rich divide is already a chasm.

Chris Cook

On Thursday afternoon, journalists were taken into the basement of a Westminster building, fed chicken satay and walked through Ofqual’s report on the recent English GCSE. During the summer, a late shift in grade boundaries shocked schools, leaving many high-flying schools with significantly worse results than they had been expecting.

The most striking outcome of the Ofqual research is that it seems to find evidence of cheating. It is incidental to the main purpose of the review, which was to ask whether the shift in the grade boundaries was correct. But it’s a stunning – and quite clear – finding.

Here is the issue: English GCSE can be taken in such a way that the pupil has done everything except for teacher-marked “controlled assessments” in the final months. If they do that, the teachers know what marks each pupil needs. And teachers give those marks.

In the graph below, Ofqual have worked out how many marks candidates needed from their teachers to get a C. If a candidate got a mark to the right of the red vertical line, the teacher gave them a high enough mark to secure the C. The shape of that distribution is, frankly, a sign of something horribly wrong. Teachers are massaging marks.
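To make the mechanics of that chart concrete, the sketch below fabricates candidate-level data (none of it Ofqual’s) and computes the gap between the controlled-assessment mark each candidate was awarded and the mark they needed for a C; a pile-up at or just above zero is the tell-tale bunching described above.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)

# Invented stand-in data: marks already banked from earlier units, the overall
# mark needed for a C, and the controlled-assessment mark the teacher awarded
# (out of a notional 60). None of this is Ofqual's real data.
n = 10_000
c_threshold = 180
banked = rng.normal(140, 15, size=n).round()
needed = np.clip(c_threshold - banked, 0, 60)

# If teachers "find" the marks pupils need, awarded marks bunch at or just
# above the needed mark rather than spreading smoothly around it.
honest = rng.normal(needed, 8, size=n)
massaged = np.where(rng.random(n) < 0.4,
                    needed + rng.integers(0, 3, size=n),   # nudged over the line
                    honest)
awarded = np.clip(massaged, 0, 60).round()

# Distribution of awarded-minus-needed marks: the spike just above zero is
# the sign of something horribly wrong.
margin = pd.Series(awarded - needed, name="awarded_minus_needed")
print(pd.cut(margin, [-30, -10, -5, -1, 1, 3, 10, 30]).value_counts(sort=False))
```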


Chris Cook

So, now that we know where free schools are going, I thought that I would quickly illustrate a curiosity about them. This map of local authorities is coloured by the performance of FSM-eligible children (a marker of poverty). Red is poor performance and blue good. Free schools are green.

I have chosen this measure because it’s a simple like-for-like metric. Differences in scores are not simply accounted for by the fact that some areas have more poor children: I am looking only at deprived pupils’ attainment. This is a quick and dirty way to gauge LAs. Red areas, broadly speaking, are underperforming.

Chris Cook

One of my grand theories is that public policy types are generally bad at geography. Or, at the least, they underestimate the importance of where you live. Here, below the fold, are two zoomable maps, coloured by the school performance of local state-educated children. The map is based on where the children live, not where they go to school. To explain:

  • The colouring is red for weaker results and blue for better ones. Darker colours mean more extreme results. If you want detail on an area, click on any one of the blobs and it should give you a rundown of local statistics, where possible.
  • Both maps are coloured according to FT score results: that is, the sum of state-educated pupils’ scores in English, maths and their top three other subjects. Other data, including official measures, are in the boxes that pop up.
  • On the first map, the geographical blobs are smaller than on previous maps: the lower-layer super output area in high-density places, and the middle-layer super output area in zones of low density (this way, we can show maximum detail).
  • That map can be quite frazzling. The second might be more to some people’s tastes. This is exactly the same sort of data, just arranged by parliamentary constituency. Since constituencies are bigger lumps, we can include more detailed data.
  • For the constituencies, I have given a barrage of results for all local children in state schools, but also the same just for FSM-eligible children, and for children dubbed “middle attainers” – kids who score in the middle tenth of results aged 11.
  • (NB – Where statistics are missing, it is to prevent people from combining data sources to work out something about individual children.)

If you want a tour, I’d recommend scrolling along the coasts. Check out some of the coastal towns, and look at the belt of towns and cities between Hull and Liverpool. Also, take a peek at how few dark red areas there are in London. In-borough variation is interesting, too: look at the massive variation within, say, Kent.

Chris Cook

A big story we have published records the stunning improvement in London’s schools that has taken place over the past decade (also: analysis on the topic).

As part of the number-crunching I did for that story, I can also provide an update of our measure of social mobility in schools – how much does poverty damage your school results? It’s not good news, alas.

Last year, we reported that our educational mobility index had improved for five consecutive years – from 2006 to 2010. Unfortunately, this year, things deteriorated a little: the blip in 2010-11 means poverty exerted a bigger influence on the school results of children in 2010-11 than it had in 2009-10.

As a reminder, for those of you who have not committed these things to memory: we measure this through quite a simple metric. First, we draw our old friend, the Graph of Doom, which shows how exam results interact with poverty:

To come up with this graph, we divide the country into hundredths, by their neighbourhood deprivation. Then we plot each grouping’s average score on the line, according to a simple performance measure (which I’ve tweaked since we last did this).
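For the curious, here is a minimal sketch of how such a graph can be built from pupil-level data – synthetic numbers and invented column names throughout:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)

# Invented pupil-level stand-in: a neighbourhood deprivation (IDACI) score
# and a simple performance measure for each pupil.
n = 200_000
idaci = rng.beta(2, 5, size=n)
score = 300 - 150 * idaci + rng.normal(0, 40, size=n)
pupils = pd.DataFrame({"idaci": idaci, "score": score})

# Divide the country into hundredths by neighbourhood deprivation, then take
# each hundredth's average score: that line is the "Graph of Doom".
pupils["deprivation_centile"] = pd.qcut(pupils["idaci"], 100,
                                        labels=list(range(1, 101)))
graph_of_doom = (pupils.groupby("deprivation_centile", observed=True)["score"]
                       .mean())
print(graph_of_doom.head())
print(graph_of_doom.tail())
```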

Chris Cook

Your birthday matters: children who are older when they start school as 4-year-olds outperform their peers. This is not a small effect, nor does it peter out as they get older. We can spot it easily at the national level among 16-year-olds.

Chris Cook

An article in the TES, an education magazine, has caused some consternation – and rightly so. In a comment piece, written by a teacher, the author appears to describe being irritated at a child who is determined to get an A grade rather than a B at A-level.

That is not what the government wants this teacher to be doing. We can tell that from the incentives this sixth-form teacher faces: the author works at a sixth-form college, and if that child fails to get an “A”, it will show up in the college’s results. Sixth forms are ranked on the average grade attained by their students, and pushing a pupil from a B to an A improves the college’s point score.

Were this teacher teaching a 16-year-old, however, his behaviour would be perfectly rational. The central measure for schools is the proportion of children getting passes of a C or better in five full GCSEs including English and maths.

Both regulation and league tables drive focus on that measure. There are buckets of data that reveal schools which are particularly focussed on that borderline, but as long as schools do well enough in the core measure, heads can safely ignore everything else.

As Graham Stuart, Tory chair of the education select committee, has said, this measure offers no reward for pressing a child to move from a C to an A. It is rational for teachers to focus on getting children over the D/C borderline.

This measure also creates problems for those of us who follow DfE statistics.