Wednesday, May 21, 2008

What Value Value Added?

There’s an interesting article in the May 7th Education Week on the “value added” model for judging teachers and schools. Under a value-added (VA) system, quality is judged by the change in scores on a consistent scale; the MAP assessment from the NWEA is a good example of what a value-added test might look like. Some pieces from the article that struck me:

“My personal opinion is that this model is promising way more than it can deliver,” (Audrey Amrein-Beardsley) said in an interview with Education Week. “The problem is that when these things are being sold to superintendents, they don’t know any better.”
I’d be curious to understand what exactly a data model like value added can really promise. The results of data analysis can be spun in a variety of ways, true, but the data itself is what it is. I’ve looked at VA as one piece of the puzzle rather than as the whole puzzle in and of itself, but maybe I need to look closer at the issue.

Later the article talks about the problems associated with using VA for programs like merit pay:

For example, results might be biased if it turns out that a school’s students are not randomly assigned to teachers—if, for instance, principals routinely give high-achieving students to the teachers who are considered the school’s best.
This is a legitimate concern. I’ve known several teachers in my short career who can do amazing things with gifted kids but can’t reach the low learners at all; for my part, I want the low kids, because their growth is the most spectacular. I think this could actually be a strength of VA: if you add 30 points to the scaled score of a low kid but only 5 to a high achiever, it’s clear where the most progress was made.
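That comparison can be sketched in a few lines. Here’s a minimal Python illustration of growth as a change in scaled score, assuming simple pre- and post-test scores on the same scale (the student names and numbers are hypothetical, chosen to mirror the 30-point vs. 5-point example above; real VA models are far more elaborate):

```python
def growth(pre: int, post: int) -> int:
    """Growth on a value-added view: change in scaled score over the year."""
    return post - pre

# Hypothetical students with (pre-test, post-test) scaled scores.
students = {
    "low_achiever": (150, 180),   # starts low, gains 30 points
    "high_achiever": (230, 235),  # starts high, gains 5 points
}

gains = {name: growth(pre, post) for name, (pre, post) in students.items()}
print(gains)  # the low achiever shows the larger gain
```

The point of the sketch is only that growth, not the final score, is the quantity being compared, so a teacher of low-scoring kids can still show the most progress.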

The most important piece of the article, and one that I’ve touched on before:

But the more sophisticated the technique, the less understandable it could become for practitioners. The question is whether the added accuracy will make it harder for teachers and administrators to buy into value-added accountability systems, several experts say.
It’s critical that teachers understand what the score on the test means, and that they know what factors could move that score up or down, especially if you intend to use the test to make judgments about the teachers or their students. Nothing breeds distrust faster than being told, “You don’t need to know the details,” because that’s where the devil usually is.

It will be interesting to watch this conversation unfold.



Anonymous said...

The other dimension of the value-added model is defining "what is of value."

Sure, reading and some math are fairly obviously of value. What is the point of social studies, and who decides what "growth" looks like? What is the point of art or physical education, and how about measuring those values? Oh, and then how about the value of field trips, or intramural sports, or guest speakers, or career-awareness efforts?

I see the merits of the ideal and I want it for my kids, but there are more brambles than golf green in the path to implementation.


9:23 AM  
