The annual school league tables confusion


11:34 am - November 27th 2007

by Mike Ion    



The silly season is almost upon us. Soon the likes of the Daily Mail will be publishing a list of the ‘best’ and the ‘worst’ secondary schools in the country. Local papers will be naming and shaming those schools in their area that come at the bottom of the league tables and the letters pages will be full of indignant parents either defending the school their child attends or calling for the head and the governors to go.

So if we have to have school results published (sadly I think this is a genie that is well and truly out of the bottle) can we at least agree on the format in which these results should be published? At present the DCSF publishes GCSE results in three different ways: raw results, value-added results and contextual value-added results (CVA). Confused?

Well, you might well be if, as a parent, you were trying to judge whether school X is successful, complacent or under-achieving.

Let’s take the example of the fictitious St Helpusall Academy. The raw results put the school in the top 25% of all secondary schools in the country, but when you apply CVA data analysis it is in the bottom quartile. Which ‘table’ provides you with the most accurate picture? Which data set is most useful in evaluating the overall effectiveness of the school?

For me the answer is simple. If we have to have school results published nationally and therefore if we have to have league tables of schools let us use one main data set and let it be CVA. Why? Because raw results tell you little about how good a school is at its core function: teaching and learning.

Raw results simply tell you about the prior attainment of the kids on entry. In contrast, CVA looks at the progress that pupils make whilst in the school; in other words, it tells you the difference the school has made to the life chances of a particular cohort of children. There are obvious challenges with such a proposal, not least that such a move would be unlikely to attract ‘positive’ media attention. One way round the problem might be to publish each measure separately at different times of the year. That way, they would not be competing for media attention at the same time.
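The distinction between raw results and value added can be sketched with a toy illustration. The school names, attainment scores and scoring scale below are all invented; this is not the DCSF methodology, just the underlying idea that raw tables rank final scores while value added ranks progress:

```python
# Toy illustration of raw results vs value added (all numbers invented).
# "prior" is a pupil's attainment score on entry; "final" is at GCSE.
schools = {
    "Leafy Suburb High": {"prior": [70, 75, 80], "final": [72, 78, 82]},
    "Inner City Comp":   {"prior": [30, 35, 40], "final": [55, 60, 65]},
}

def mean(xs):
    return sum(xs) / len(xs)

for name, s in schools.items():
    raw = mean(s["final"])  # what a raw league table ranks on
    value_added = mean([f - p for f, p in zip(s["final"], s["prior"])])
    print(f"{name}: raw = {raw:.1f}, value added = {value_added:.1f}")
```

On these made-up numbers the suburban school tops the raw table, while the inner-city school, whose pupils have progressed far more, tops the value-added one.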

The truth is that a second set of league tables, showing the CVA results for a school, would be a much better indication of teaching quality than the old-fashioned league table of 5+ GCSE higher passes, a measure that will almost always have schools in the leafy suburbs at the top and the complex inner-city ones at the bottom.

The challenge then, over a period of time, would be to persuade parents and the general public that this was the case, that what makes a school a ‘good’ school must be linked to the quality of the teaching and learning and its impact on pupil outcomes. Now is the time to take on that challenge.

————-
This is a guest post. Mike Ion blogs here and at CIF.
He was Labour’s PPC for Shrewsbury in 2005.








Reader comments


Yes, the “old fashioned” idea that schools should be measured by results. After all, what do results matter?
I don’t buy the idea that measuring how much “difference” a school makes is a useful measurement. If the quality of their inputs is poor and the school makes them average, they would score well on CVA, but if that means most kids are leaving at 16 with 3 GCSEs, as opposed to none, then it’s an achievement, but it’s still not a school I would want my kids going to. Only someone who doesn’t have kids would expect parents to believe that compromising their education as part of an experiment in social engineering, which may or may not produce an improved education system at some unspecified point in the future, is a good idea. Good luck (not) with persuading them that it is.

What’s your suggestion, Matt?

Education vouchers, expansion of grammar schools, technical and vocational training for the non-academic. Reduce funding for tertiary education to pay for it.
I’m still seething over Mr Nu Lab’s assumption that parents are too stupid to understand league tables; believe it or not, some of us are educated enough to understand simple statistics without the help of the education commissariat.
I so would not vote for a party that he represented.

Education vouchers appear to have raised standards when tried in both the US and Sweden.

http://www.economist.com/world/international/displaystory.cfm?story_id=9119786

And, yet again, what a wonderfully lazy invocation of the bogeyman Daily Mail.
Do not most newspapers publish these league tables?

Or are you suggesting that only Mail readers are interested in the quality of their children’s education?

chrisc

It is the way the likes of the Mail use the data. If a school with a superb intake at KS3 goes on to achieve 80%+ 5+ A*-C GCSE passes it is deemed to be a ‘good’ school, a ‘top’ school etc. Actually, the fact that 20% of the intake did not achieve 5 or more higher passes may indicate that it is, at best, an average school. Yet if a school with a very poor intake (in terms of prior attainment) achieves 40% 5+ A*-C GCSE passes it is a ‘poorly performing’ school or a ‘failing school’.

I think we need a proper debate surrounding the language of school improvement and the impact that schools have on the progression that pupils make.

Your entire argument is premised on the notion that academic ability is “learned”, and is evenly distributed, when there is no evidence to support either position.
The biggest predictor of academic achievement is parental aspiration; the middle classes tend to be aspirational and they tend to live in catchment-area enclaves, producing a clustering effect, one manifestation of which is school league tables, which are really just showing where the best pupils live, and on that I would agree with you. Whilst this is all very interesting for a sociologist, it’s irrelevant to education, which should concentrate on the highest possible attainment for all its pupils. The fact that for some that will be lower, in terms of exams passed, than others, is not something the education system can, or should, try to correct – I think 40 years of comprehensive education proved that.
Rather than yet another manipulation of exam statistics, I’d be more interested to read stats on the number of Guardian readers “working in education” who privately educate their kids.

The fact that for some that will be lower, in terms of exams passed, than others, is not something the education system can, or should, try to correct

Isn’t this the point of the value-added measure though? The important thing to measure is surely improvement, rather than absolute achievement, which you pretty much just stated is out of reach for some people, and only shows you that a particular school probably had good inputs, rather than showing what the school is actually doing.

But the point is, relative to the ability of the pupils, the school makes almost no difference to educational outcomes, so what’s the point of measuring value added, other than as an internal bureaucratic process – what is the use of that information? When you go to an employer, or into further education, they are interested in absolute outcomes, not whether your exam results were an improvement for the school.

But you don’t put your school’s position in the league tables on your CV, you put your own results. This is about measuring schools, not pupils.

If you are interested in measuring how good a school is (and I think that is what parents care about), how much they improve the kids who attend is much more important, especially if your kid isn’t the sharpest knife in the drawer (if your kid is already brilliant, it is unlikely that the school they attend will have much impact on their outcome).

Neither measure is very good, because they don’t give useful information to a parent or to the state. The problem with both is that they measure the wrong averages. The results table averages over the pupils at the school, whilst the value added table averages over all initial attainment levels. The assumption behind value added is that there is a single ‘quality of teaching’ of a school that acts the same on all starting levels. This assumption is fairly unlikely to be true. What a parent wants to know is the expected value of attainment after the child has been at the school, given their current level (and maybe the variance and throw in a confidence interval for these estimates too ;-)).

Two hypothetical examples:

At school 1, most students are doing badly, but they are well taught and do well at the end. The school’s VA is high. However, effort is concentrated on improving the ability of the majority of the pupils, and so people who go into the school doing well end up not improving much at all. This school would be a bad choice for a parent with an already high-achieving kid.

At school 2, most students are already high achievers, and so don’t improve much. This school has a very low VA. The school’s ethos though is to focus on people who are left behind, and so low achievers who do get into this school do very well. This would be a good choice for a parent with a low achieving kid.

There’s no way around this problem if you insist on giving a single numerical measure for how good a school is. The best thing to do would be to oppose this idea that everything can be reduced to a single statistic. That’s not to say that statistical analysis is useless. For example, it seems from my quick googling of CVA that it works by dividing children into a range of different categories based on various factors including prior attainment, and then looks at how well each group does at that school compared to the average over all schools. Assuming they have chosen a large enough number of categories, this information is useful (although obviously more difficult to make sense of than a single number). The state can use it to determine whether a school is well suited to the local population, and a parent can use it to work out how well their kid is likely to do at that school. Reducing all this information to a single number and producing league tables though is really stupid. It’s not just stupid, it’s potentially harmful. By attempting to maximise these numbers, schools might well be doing something that is against the interests of their local community.

Rather than saying that CVA is a less irrational measure of a school’s performance than the other measures, wouldn’t it be better to do away with irrational measures entirely?
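The point about a single value-added figure hiding per-group differences can be shown with a toy sketch. The progress scores below are invented and this is not the real CVA calculation; it simply mirrors the two hypothetical schools above, which post identical headline figures but help opposite groups of pupils:

```python
# Toy sketch (invented numbers, not the real CVA formula): a single
# value-added figure can hide which pupils a school actually helps.
def per_group_va(progress):
    """Mean progress within each prior-attainment band."""
    return {band: sum(xs) / len(xs) for band, xs in progress.items()}

def headline_va(progress):
    """Single-number VA: mean progress across all pupils."""
    scores = [x for xs in progress.values() for x in xs]
    return sum(scores) / len(scores)

# Progress points gained, grouped by attainment on entry.
school_1 = {"low_prior": [8, 9, 10, 9], "high_prior": [1, 2, 1, 2]}  # lifts strugglers
school_2 = {"low_prior": [1, 2, 1, 2], "high_prior": [8, 9, 10, 9]}  # stretches high-fliers

print(headline_va(school_1), headline_va(school_2))  # identical: 5.25 each
print(per_group_va(school_1))
print(per_group_va(school_2))
```

Both schools post the same headline number, but only the per-band breakdown tells a parent which school suits their child, which is exactly the information lost when CVA’s per-category comparisons are collapsed into a league-table rank.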

Dan – that’s a very convincing argument for not having a national curriculum or any form of stats issued by the quangocracy. When I went to school (a grammar in the 1970s) the only performance indicators available to my parents were % who passed O levels, % who passed A levels, % who went to university and % who went to Oxbridge. Everything else is meaningless sociological b******t.

Well as I said, I don’t think statistical analysis is necessarily useless, just that arbitrarily reducing things to one dimension is to oversimplify. Careful use of statistics could reveal useful information. The problem is that there is a political demand for a simple one-dimensional measure of how good a school is, and it’s this demand that is meaningless.

Actually you could make exactly the same argument that the stats you mentioned misrepresent the situation (because they will largely reflect the intake of the school). In fact, my guess would be that in terms of their political usage, they are probably even worse than measures like CVA.

But what’s wrong with representing the intake of the school, if that translates into exam results?





