On the eve of public reporting of NAPLAN tests throughout Australia, Ben Jensen (ex-OECD, now running the education program at the Grattan Institute) has a new report on the topic. His key argument is for value-added scores (which will be possible when/if we get 2010 test data). The money quotes:
In this report we advocate that:
- The current measures of school performance published on the ‘My School’ website should be replaced with value-added measures of school performance because:
  - Their greater accuracy creates a fairer system, particularly for schools in lower socio-economic communities;
  - A focus on student progress rather than performance at a single point in time serves a variety of policy objectives and is more effective in improving instruction and school education.
- School principals and teachers should be empowered to use value-added measures to improve instruction and school programs. To achieve this:
  - A user-friendly information technology system should be developed that allows school principals and teachers to better analyse and then act upon their own performance data;
  - Education and training to incorporate performance assessment into instruction and school programs should be provided;
  - Resources should be provided for teachers and schools to develop programs based on value-added measures and to disseminate best practice.
- Value-added measures of school performance should become an important benchmark in school evaluation. School evaluators should make their qualitative judgements of good practice in the context of value-added performance measures;
- Value-added measures of student progress should be the basis for categorising schools as under-performing. Developmental steps should be explicit, with additional support for under-performing schools; and
- School principals should be granted autonomy to effectively lead the school for which they are being held accountable. Individual teachers have continually been shown to have the greatest impact upon student performance, and school principals should be empowered to determine who teaches in their school.
The full report is here.
Companies are valued based on profit, not turnover, so why is “value add” so often ignored in the debate over transparency of school performance (eg see Joshua Gans’ recent post)?
I hope that a copy of this report has been sent to Julia Gillard, and to Joshua Gans too.
As a parent I’m wondering about the value of knowing where the kids’ school ranks. I’d prefer to have more information about how my own children are doing.
Value add is far more useful than a single data point.
The trouble with this measure is that it gives schools an incentive to shift the dumb kids out at higher year levels so that their average scores go up in later years. Private schools already do this with below-average performers so that their Year 12 averages look better.
@M: you’re right, but a sophisticated value-add measure could account for this by excluding new arrivals (or even tracking their results from their previous schools). The general lesson is that “dumb” measures of performance will lead to dysfunctional behaviour by schools.
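To make that concrete, here is a minimal sketch (in Python, with made-up student records and field names — not the actual My School or Grattan methodology) of a growth-style measure that simply ignores students who weren’t at the school for both tests:

```python
from statistics import mean

# Hypothetical records: (student, school at first test, score, school at second test, score)
records = [
    ("s1", "A", 420, "A", 480),
    ("s2", "A", 390, "A", 455),
    ("s3", "B", 510, "A", 520),   # arrived after the first test: excluded from school A's figure
    ("s4", "A", 450, "A", 500),
]

def school_gain(records, school):
    """Mean score gain across students who sat both tests at the given school."""
    gains = [later - earlier
             for _, school_1, earlier, school_2, later in records
             if school_1 == school and school_2 == school]
    return mean(gains) if gains else None

print(school_gain(records, "A"))  # average progress of the school's own cohort
```

A real value-added model would also adjust for prior achievement and student background, but even this toy version shows the point: it attributes progress, rather than a single snapshot, to the school, and new arrivals can’t be used to pad the numbers.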
Dave, my sources within some schools suggest that they can already track their value add internally. They can tell whether their extra maths/reading assistance programs work to help the stragglers (all the evidence seems to indicate that they do).
You are definitely right though that providing bad information is likely to lead to dodgy action and outcomes.
Education has come late to arguments over measurements, performance metrics and, most importantly, how to make meaningful comparisons that lead to improvements.
What I find ironic about teacher unions’ total opposition to the tables is that they are in the business of measuring, evaluating and ultimately grading their students, yet they refuse to allow their organizations to be subjected to a set of desired outcomes for their services. I would think that teacher involvement in the debate about value-added measurements to set useful targets would be a more positive action than outright rejection.
This debate is similar to the climate debate in some regards. Activists in the climate debate want an ETS, even if it isn’t a terribly good one, because at least a framework will be established that can be improved upon. This is the same reasoning that proponents of school accountability and the My School website are using. It’s ironic seeing Liberals make the ‘framework’ argument re education but reject it in the climate change debate, and leftist activists make the framework argument in climate change but reject it in education reform.
@andrew1: an ETS, even a bad one, creates a carbon price and so provides correct incentives. A bad education performance metric, in contrast, is as likely to provide perverse incentives as positive ones.
Dave,
That comment is true for any and all organizations employing performance metrics. Why is education the exception?
Dave, does no education performance metric count as a bad education performance metric? Because the notorious lack of transparency in schools seems to me to be the biggest culprit in creating so-called perverse incentives.
@DP: it is true that bad incentives will always lead to bad outcomes; that is not specific to education. But what stands out here is the government putting in place poor incentives when, with some time and trouble, some good incentives (through value-added measures) could be created.
@Andrew1: I am all for transparency, but I don’t quite understand how a lack of transparency would create perverse incentives. Perhaps you could elaborate.
In any case, there is plenty of information already available for parents who take the trouble to visit the school and talk to the principal and teachers. The website gave me new facts about my kids’ school (it came somewhere around 400th in the State and had better/worse exam results than “comparable” schools 500 km away about which I know nothing) but no new useful information.
Dave,
I don’t discount your point about developing sensible measures and incentives. Anecdotally, organizations have a tendency to measure what is easy rather than what is useful. (I worked with software systems for many years…)
It’s not unreasonable to criticize the govt for poor measures, but in the case of the (NSW?) teachers’ union they won’t even participate in measurement and comparison. And the law that the NSW Parliament passed to prevent public comparison by newspapers is incredibly pathetic.
@DP: I’m no supporter of the teaching unions. I’m not sure why you would think I am.
Dave,
I got off-topic. (Having my own performance metrics issues at the moment – lack of participation being a major obstacle.)