GCSE

Chris Cook

Today, I gave a brief presentation – based on our previous stories – on the performance of London schools to the excellent Centre for London. Some slides are a little mysterious without my burbling over the top, but I hope it’s understandable enough.


Chris Cook

Later this morning, Michael Gove, education secretary, will announce several big things. First and foremost, he is dropping his plan to introduce the EBC, his proposed new qualification for 16-year-olds, which has been attacked as fatally flawed since its announcement. Second, he will unveil details of the new curriculum. Both will deservedly absorb lots of column inches.

But Mr Gove will also announce a new pair of measures by which league tables will be constructed. This change might actually be the most important thing he does during his entire reign. League tables set out the incentives that drive schools. They define success and failure.

So what do we know? Schools will, first, be assessed on the share of pupils getting Cs or better in English and maths. A second measure will record whether children in each school do better or worse than children of similar ability – as measured by standardised tests at the age of 11.

This value-added score will gauge performance across eight subjects: English and maths, three further core subjects and each pupil’s three best ‘other’ subjects. It replaces the current measure – a crude tally of how many children get Cs or better in English, maths and three other subjects.
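For the curious, here is a rough sketch in Python of how a value-added score of this general kind can be calculated: compare each pupil’s points across those eight subjects with the average achieved by pupils of the same prior attainment at 11, then average the gaps by school. The data, points scale and column names below are invented for illustration; this is not the DfE’s published methodology.

```python
# A simplified value-added calculation, not the official one: compare each
# pupil's "best eight" points total with the average achieved by pupils in
# the same Key Stage 2 (age-11) prior-attainment band.
import pandas as pd

# Hypothetical pupil-level data (schools, bands and points are invented).
pupils = pd.DataFrame({
    "school":    ["A", "A", "B", "B", "B"],
    "ks2_band":  [3, 4, 3, 5, 4],
    "best8_pts": [38, 46, 44, 58, 40],
})

# Expected points for each prior-attainment band, estimated from the cohort.
expected = pupils.groupby("ks2_band")["best8_pts"].transform("mean")

# A pupil's value added is actual minus expected; a school's score is the
# average over its pupils (zero = in line with similar pupils elsewhere).
pupils["value_added"] = pupils["best8_pts"] - expected
print(pupils.groupby("school")["value_added"].mean())
```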


Chris Cook

There is an iron law in English education: as any given argument about any problem with schools progresses, the probability that someone will claim grammar schools are the solution rapidly tends towards 1.

I thought I would set out the data on the grammar counties, where children are sorted at the age of 11 according to an academic test.

To do this, I have defined a new region of England: Selectivia. I have removed the biggest selective counties – Kent, Lincolnshire, Medway and Buckinghamshire – from their geographical regions and shoved them together into one new region*. So what is it like? First, you can see that this region is quite well off, compared to most regions, especially London.

Region                     IDACI score   FSM rate
East Midlands              0.195         12.0%
East of England            0.168          9.2%
London                     0.340         22.4%
North East                 0.245         17.4%
North West                 0.233         16.2%
Selectivia                 0.162          8.8%
South East                 0.150          8.3%
South West                 0.164          9.4%
West Midlands              0.236         16.4%
Yorkshire and the Humber   0.216         14.6%

(IDACI is the Income Deprivation Affecting Children Index; FSM is the share of pupils eligible for free school meals.)
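For anyone who wants to replicate the grouping, the sketch below shows the basic move in Python: relabel the selective authorities as a region of their own before averaging. The figures and column names are placeholders, and the published table above is computed from the full underlying data rather than from a toy table like this one.

```python
# A rough sketch of how "Selectivia" can be built from local-authority data.
# The local authorities listed, the region labels and the numbers here are
# illustrative assumptions, not the underlying dataset.
import pandas as pd

las = pd.DataFrame({
    "local_authority": ["Kent", "Camden", "Lincolnshire", "Leeds"],
    "region":          ["South East", "London", "East Midlands",
                        "Yorkshire and the Humber"],
    "idaci":           [0.17, 0.33, 0.18, 0.22],   # deprivation index
    "fsm_rate":        [0.09, 0.23, 0.10, 0.15],   # share on free school meals
})

# Pull the big selective counties out of their geographical regions.
selective = {"Kent", "Lincolnshire", "Medway", "Buckinghamshire"}
las.loc[las["local_authority"].isin(selective), "region"] = "Selectivia"

# Regional averages (unweighted here, purely for illustration).
print(las.groupby("region")[["idaci", "fsm_rate"]].mean())
```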


Chris Cook

Last week, the TES, the leading UK teachers’ magazine, ran a number of fascinating pieces on the “EBC”, the proposed successor to the GCSE – the exam taken by English children at the age of 16. The basic point is that the Department for Education has come up with a plan for a new qualification that is causing grave concern within Ofqual – concern that has been made public – as well as among school leaders, inspectors and the department’s own civil servants.

When the plan to reform GCSEs was originally leaked to the Daily Mail, it contained the claim that the new GCSE would only be for the brightest three-quarters of children. I wrote at the time that this would be problematic. The Lib Dems insist this aspect of the plan has gone. Some rightwingers appear to hold the opposite impression.

For their part, DfE officials are working under the assumption that children will need to know more to reach the lowest passing grade on the new qualification. But they also assume children will respond to the exam changes by learning more, so no more children will fail. This is, it is fair to say, an assumption resting on a rather thin evidence base.

Would it matter if this were to be wrong, and children were to leave with no qualifications, rather than getting an F or a G? After all, it is certainly true that an F or a G gives a pupil very little labour market benefit. For pupils themselves, these lower grades primarily act as a guide to how much further they have to go.

But the main benefit of awarding Fs and Gs at GCSE is to the school system. They mean that schools do not have strong incentives to pick weaker pupils out for other, easier exams. And keeping such students on the GCSE track means they have some chance of getting a C or better, even if teachers misread their ability early on.

If you reform the system so that the exam no longer measures the attainment of a larger share of children, this important benefit will, one way or another, be eroded. And who will be affected? Once again, it is the children in the poorest neighbourhoods.

To illustrate this, this graph describes an exam system that works on the basis that 95 per cent of people will get some kind of passing grade – however low. I have used the average GCSE grade for each child in a mainstream state school as a proxy for their overall academic ability, and assumed that the five per cent with the lowest grades would fail under the new system. This is a bit rough ‘n’ ready, but is good enough for our purposes.

So what happens if an exam excludes the bottom 5 per cent of children on this measure from some notional new examination? How many fail and so get “excluded” from measurement? You can see that a child in the poorest neighbourhoods has a 10 per cent chance of being in this band – twice the national average.

I have added a second band: “at risk”. This takes in the next 10 per cent of children, too. Schools might – wrongly – guess they will be below the line. Again, this line skews poor.
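For those who want to see the mechanics, the sketch below reproduces the banding exercise on invented data: a pupil-level table with an average GCSE points score and a neighbourhood deprivation decile, a cut-off at the 5th percentile for the “excluded” band and at the 15th for “at risk”, and a breakdown of the shares by decile. The simulated link between deprivation and points is an assumption for illustration, not the real pupil data.

```python
# Sketch of the banding exercise described above, on simulated data.
# Average GCSE points stand in for overall ability; the bottom 5 per cent
# are treated as failing the notional new exam, the next 10 per cent as
# "at risk" of being steered towards something easier.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 10_000
decile = rng.integers(1, 11, n)                 # 1 = most deprived decile
points = rng.normal(34 + 1.2 * decile, 12)      # assumed deprivation gradient
pupils = pd.DataFrame({"idaci_decile": decile, "avg_gcse_points": points})

fail_cut = pupils["avg_gcse_points"].quantile(0.05)   # bottom 5 per cent
risk_cut = pupils["avg_gcse_points"].quantile(0.15)   # next 10 per cent

pupils["band"] = np.select(
    [pupils["avg_gcse_points"] < fail_cut,
     pupils["avg_gcse_points"] < risk_cut],
    ["excluded", "at risk"],
    default="measured",
)

# Share of pupils in each band, by neighbourhood deprivation decile.
print(pd.crosstab(pupils["idaci_decile"], pupils["band"], normalize="index"))
```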


Chris Cook

On the Today programme last week, Sir Michael Wilshaw, the chief inspector of schools, announced that Ofsted, the inspectorate, will start trying to piece together which local authorities are good at driving school improvement and which are weak.

This plan, intended to focus fire on local government, could end up drawing attention to the Department for Education. This is because Sir Michael will hold local authorities to account for all local schools – including academies, the independent state-funded schools run along the lines of US charters.

On the radio, he was up against David Simmonds, a Tory councillor from Hillingdon representing the Local Government Association, who pointed out that there is a particular problem with academies. He noted that academies, which now constitute half of all secondaries, answer directly to civil servants in the DfE – not to their local authority.

Mr Simmonds therefore pointed out, correctly, that councils have no power to sort things out when it comes to academies. Sir Michael replied, also rightly, that good local authorities do it anyway. The way that this works is that they lobby the DfE to take action. As it happens, a few days later, the TES reported that the pressure from the DfE on academies might soon become a bit more active.

But there remains a problem for local authorities if the DfE is slow to act. This has been the experience of Islington – one of the fastest-improving boroughs in the country. Islington has urged the DfE to act on a struggling academy in the borough – the City of London Academy (COLA) – for some time. The COLA case study demonstrates that this can drag on and on: the department has been pestered about the school constantly.

We have some DfE officials’ notes on COLA from a year ago. Originally written for ministers, the notes explain the background and their position. Some betray a touch of irritation about the persistent London borough. 

Chris Cook

This week, I have written a fair amount about England’s schools, and how well the capital does. I thought that today, I would publish some data that will help explore some finer differences: how well do children do at a borough level?

Below the fold, I have worked out the FT score for each child (a score based on their performance in English, maths and three other GCSEs). I then ran a regression through the data, which predicts performance based on each child’s background and their local area.

This is, in effect, a similar exercise to the one in benchmarking school systems, and has all the same caveats. But this time around, the objective is to get a steer on how attainment varies between boroughs for children of similar social circumstances.
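For readers who like to see the machinery, here is a rough sketch of the kind of regression involved: an FT-style points score regressed on background measures and a set of borough indicators, with the borough coefficients read as attainment gaps for otherwise-similar children. Everything below – the variable names, the synthetic data and the functional form – is an illustrative assumption rather than the specification actually used.

```python
# Sketch of a borough benchmarking regression on simulated data: regress an
# FT-style score on pupil background plus borough dummies, then read the
# borough coefficients as gaps for children of similar circumstances.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5_000
pupils = pd.DataFrame({
    "borough":    rng.choice(["Islington", "Camden", "Hackney", "Barnet"], n),
    "idaci":      rng.uniform(0, 0.6, n),      # neighbourhood deprivation
    "fsm":        rng.integers(0, 2, n),       # free school meals flag
    "ks2_points": rng.normal(27, 4, n),        # age-11 prior attainment
})
# Invented outcome: an FT-style score built from background plus noise.
pupils["ft_score"] = (
    200 + 6 * pupils["ks2_points"] - 40 * pupils["idaci"]
    - 15 * pupils["fsm"] + rng.normal(0, 25, n)
)

model = smf.ols("ft_score ~ idaci + fsm + ks2_points + C(borough)",
                data=pupils).fit()

# Each borough coefficient estimates how much better or worse a child with
# given background characteristics does there than in the omitted borough.
print(model.params.filter(like="borough"))
```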