Data Points

Our curated feed of data stories in the FT and elsewhere

Chris Cook

One of my grand theories is that public policy types are generally bad at geography. Or, at the least, they underestimate the importance of where you live. Here, below the fold, are two zoomable maps, coloured by the school performance of local state-educated children. The maps are based on where the children live, not where they go to school. To explain:

  • The colouring is red for weaker results and blue for better ones; darker colours mean more extreme results. If you want detail on an area, click on any of the blobs and, where possible, it will give you a rundown of local statistics.
  • Both maps are coloured according to FT score results: that is, the sum of each state-educated pupil’s scores in English, maths and their top three other subjects (a sketch of this calculation follows the list). Other data, including official measures, are in the boxes that pop up.
  • On the first map, the geographical blobs are smaller than on previous maps: lower-layer super output areas in high-density places, and middle-layer super output areas in zones of low density (this way, we can show maximum detail).
  • That map can be quite frazzling. The second might be more to some people’s tastes. This is exactly the same sort of data, just arranged by parliamentary constituency. Since constituencies are bigger lumps, we can include more detailed data.
  • For the constituencies, I have given a barrage of results for all local children in state schools, as well as the same figures just for FSM-eligible children and for children dubbed “middle attainers” – kids whose results at age 11 put them in the middle tenth.
  • (NB – Where statistics are missing, it is to prevent people combining data sources to work out something about individual children.)
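
For readers who like to see the arithmetic, here is a minimal sketch of how a score of this kind might be computed and turned into a red-to-blue shade. The column names, the sample figures and the colour scale are illustrative assumptions, not our actual pipeline.

```python
# Illustrative sketch only: the data, column names and colour scale are
# hypothetical, not the FT's actual pipeline.
import pandas as pd

def ft_score(row):
    """Sum of English, maths and the pupil's top three other subject scores."""
    top_three_others = sorted(row["other_subject_scores"], reverse=True)[:3]
    return row["english"] + row["maths"] + sum(top_three_others)

# Hypothetical per-pupil results, keyed by the small area where each pupil lives.
pupils = pd.DataFrame({
    "area":                 ["E01000001", "E01000001", "E01000002"],
    "english":              [34, 46, 52],
    "maths":                [30, 48, 58],
    "other_subject_scores": [[28, 31, 25, 40], [44, 39, 50], [55, 47, 46, 52]],
})
pupils["ft_score"] = pupils.apply(ft_score, axis=1)

# Average score per residential area, then a crude diverging colour:
# below the overall mean shades towards red, above it towards blue,
# with darker shades the further a score sits from the mean.
area_means = pupils.groupby("area")["ft_score"].mean()
overall_mean = pupils["ft_score"].mean()

def colour(score, spread=50.0):
    t = max(-1.0, min(1.0, (score - overall_mean) / spread))
    return ("blue" if t >= 0 else "red"), abs(t)  # hue plus intensity

for area, score in area_means.items():
    print(area, round(score, 1), colour(score))
```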

If you want a tour, I’d recommend scrolling along the coasts: check out some of the coastal towns, and look at the belt of towns and cities between Hull and Liverpool. Also, take a peek at how few dark red areas there are in London. Variation within local authorities is interesting, too: look at the massive spread within, say, Kent. Read more

Chris Cook

A big story we have published records the stunning improvement in London’s schools that has taken place over the past decade (also: analysis on the topic).

As part of the number-crunching I did for that story, I can also provide an update of our measure of social mobility in schools – how much does poverty damage your school results? It’s not good news, alas.

Last year, we reported that our educational mobility index had improved for five consecutive years, from 2006 to 2010. Unfortunately, this year, things deteriorated a little: the blip in 2010-11 means poverty exerted a bigger influence on children’s school results than it had in 2009-10.

As a reminder, for those of you who have not committed these things to memory: we measure this through quite a simple metric. First, we draw our old friend, the Graph of Doom, which shows how exam results interact with poverty:

To come up with this graph, we divide the country into hundredths by neighbourhood deprivation. Then we plot each grouping’s average score on the line, according to a simple performance measure (which I’ve tweaked since we last did this). Read more
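
For the curious, here is a minimal sketch of that binning step. The pupil-level data, the deprivation measure and the performance measure are placeholders, and the straight-line slope at the end is just one crude way to summarise poverty’s influence – not necessarily the index we use.

```python
# Illustrative sketch only: fake data and a placeholder performance measure,
# not the FT's actual inputs or index.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 50_000

# Hypothetical pupil-level data: a neighbourhood deprivation score
# (higher = more deprived) and an exam performance measure for each child.
pupils = pd.DataFrame({"deprivation": rng.uniform(0, 100, n)})
pupils["score"] = 320 - 1.2 * pupils["deprivation"] + rng.normal(0, 40, n)

# Divide the country into hundredths by neighbourhood deprivation...
pupils["centile"] = pd.qcut(pupils["deprivation"], 100, labels=False)

# ...and take each grouping's average score; this series is the line on the graph.
curve = pupils.groupby("centile")["score"].mean()

# One crude summary of poverty's influence: the slope of a straight-line fit
# through the centile averages (steeper = poverty matters more).
slope = np.polyfit(curve.index.to_numpy(dtype=float), curve.to_numpy(), 1)[0]
print(curve.head())
print("slope per deprivation centile:", round(slope, 2))
```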

Kate Allen

With the ONS publishing the results of its latest attempt to measure British people’s wellbeing, it’s worth a quick recap of how this compares with other countries’ methods, since the collection of international wellbeing data is still at an early stage.

Whilst the OECD is in the process of developing guidance to harmonise standards and approaches, existing surveys – including the World Values Survey, the European Social Survey and the Gallup World Poll – vary in how they pose their questions.

The ONS questions combined short-term measures with longer-term, more reflective indicators, on a scale of 0-10.

Gallup asks people to rate the quality of their life on a scale of 0-10, while the ESS and the WVS both ask respondents how satisfied they are with their life as a whole, again on a scale of 0-10. They also ask how happy they are, with the ESS again using an 11-point scale and the WVS offering a phrase-based menu of choices. Read more

Kate Allen

The steady fall in the number of fatal injuries in UK workplaces appears to have tailed off, according to data recently released to the FT by the Health & Safety Executive*.

173 workers were killed on the job in 2011/12, a rate of 0.6 deaths per 100,000 workers.

Once chance variation is taken into account, the overall trend suggests that death rates have plateaued since 2008, after a decade of decline.
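
For anyone who wants to check the arithmetic: the quoted figures imply a workforce of roughly 29m, and one simple, generic way to allow for chance variation in a count as small as 173 is a Poisson approximation. The sketch below is illustrative only – it is not the HSE’s or our own method.

```python
# Generic sketch: a Poisson approximation for chance variation in the annual
# count of workplace deaths. Not the HSE's or the FT's own method.
import math

deaths = 173
workers = deaths / 0.6 * 100_000          # ~28.8m, implied by the quoted rate

# Approximate 95% interval for the count, treating deaths as Poisson-distributed.
half_width = 1.96 * math.sqrt(deaths)
low, high = deaths - half_width, deaths + half_width

print(f"rate: {deaths / workers * 100_000:.2f} per 100,000 workers")
print(f"95% interval for the count: {low:.0f} to {high:.0f}")
print(f"95% interval for the rate:  {low / workers * 100_000:.2f} to "
      f"{high / workers * 100_000:.2f} per 100,000")
```

Year-to-year swings inside a band that wide are consistent with a flat underlying rate, which is why the post-2008 figures look like a plateau rather than continued improvement.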

 Read more

Chris Cook

The Daily Mail has published a rather startling story: from 2016, children will sit something akin to the old O-levels. Some parts of the story are relatively uncontroversial: the idea that there should be one exam board in each subject has many friends.

The newspaper also discusses abolishing the National Curriculum for secondary schools. However, if you have a single GCSE available in each subject, that sets a national curriculum in all but name. So, taken together, these changes amount to less than the sum of their parts.

But, if the Mail is correct, there is one proposal which stands out: splitting the GCSE. According to the report, under this new scheme some children would get the new O-level, and the bottom 25 per cent would take “CSEs”. This strikes me as a high-risk policy.

The GCSE’s strength is that it is a full-spectrum exam, measuring low to high ability. It includes questions designed to distinguish candidates who should get a G from candidates who deserve an F, as well as questions to filter A* candidates from those getting an A.

This is also its greatest PR weakness: it gets attacked by people citing the low-level questions. The Mail approvingly notes “questions like ‘Would you look at the Moon with a microscope or a telescope?’ from science GCSEs will be a thing of the past.”

The benefit of this system is that you get comparable qualifications, and there is no need for schools to attempt to sift children, guessing who will finish up with less than a C. The GCSE exams themselves do that work for them. But, according to the Mail:

Mr Gove believes those teenagers have been encouraged to think that a D, E, F or G grade at GCSE is a ‘pass’ when the real world treats those grades as a ‘fail’.

I confess that I do not see how it logically follows that the lower end of the GCSE should therefore be replaced with a CSE. The government would replace a D at GCSE with a certificate where the top grade is capped at a D. Maybe something got lost in the briefing.

The change would, however, have significant practical effects. Read more

Chris Cook

At the moment, groups putting forward bids to open free schools – new academies opened from scratch – are finding out whether they have been approved for 2013 opening. This is an opportune moment to take a quick look at this programme.

Last week, I explained part of why the “converter academies” programme is so popular: it usually comes with a cash incentive to join in. But free schools have their own funding wrinkle. This one encourages primary free schools to be smaller than other local schools.

Using the DfE’s formula for free school funding, we can work out how much revenue (day-to-day) funding a primary free school would receive, plotted against its size, if it were to open at full capacity in the London borough of Camden in 2012-13.

Chart: Camden free school funding per pupil

This is the output of a formula: every primary free school gets a £95,000 payment plus a certain amount per child, which varies from borough to borough. In Camden, once you have counted in the pupil premium, SEN (special educational needs) funding and other funding, each extra child brings in, on average, an extra £5,870.

But the structure of the formula – a lump sum plus a roughly flat per-pupil payment – means that the average amount you receive per pupil falls as the school grows. This is because the £95,000 lump sum (which is the same for all boroughs) gets shared between more and more pupils.
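
Plugging the quoted figures into that structure shows the effect. This is only a sketch using the £95,000 lump sum and the £5,870 average per-pupil figure for Camden; the real DfE formula has more moving parts.

```python
# Sketch using the Camden figures quoted above: a £95,000 lump sum plus roughly
# £5,870 per pupil on average. The real DfE formula has more components; this
# just shows why average per-pupil funding falls as the school grows.
LUMP_SUM = 95_000
PER_PUPIL = 5_870

def per_pupil_funding(pupils: int) -> float:
    """Average revenue funding per pupil for a school of the given size."""
    return (LUMP_SUM + PER_PUPIL * pupils) / pupils

for size in (30, 60, 120, 210, 420):
    print(f"{size:>3} pupils: £{per_pupil_funding(size):,.0f} per pupil")
```

On these figures, a 30-pupil start-up attracts roughly £9,000 per head, while a full one-form-entry school of 210 gets closer to £6,300 – hence the incentive to stay small.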

 Read more