Monday, May 22, 2006

Hot Air: Appropriately Named?

Kevin Carey of Education Sector recently released a new report called “Hot Air: How States Inflate Their Educational Progress Under NCLB.” After putting the data together they’ve come up with something called “The Pangloss Index,” which combines 11 different sets of data that the states self-report to identify those states that are the most aggressive in reporting the highest numbers on factors like 4th grade test scores, 8th grade scores, and dropout rates. Their thinking is to expose states like Wisconsin, which has the best cumulative scores and is thus #1 on the index, but gets there by playing fast and loose with the data gathering required under NCLB.

The devil, as usual, is in the details, and it’s the details that render the report meaningless. Two paragraphs on page 13 are important here. The first is when they’re talking about how they figured the scores for 4th grade, 8th grade, and high school:

This amount, as well as the average proficiency rates in eighth grade and high school, is calculated by averaging separately reported reading and math proficiency rates. If a state reported a proficiency rate in one subject but not the other, the proficiency rate for the reported subject was used.

In the 4th grade, Kansas, Missouri, and Ohio reported scores in math but not reading; Kentucky and Nevada had scores in reading but not math. Kansas thus gets to roll along with the 84.4% that it reported in math, while Missouri’s lone score, also in math, is 43%. Nevada has to use its reading score, 41.5%. My guess is that Carey wanted to stick to those grades that line up with the NAEP test, but I think the paper would have benefited from taking a score from a nearby grade (3rd or 5th, for example) instead of letting a single subject’s score stand in for both.
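The averaging rule Carey describes can be sketched out in a few lines. This is purely illustrative, not Education Sector’s actual code: average the reading and math proficiency rates, and if only one subject was reported, let that single rate stand for the grade. The figures are the ones mentioned above.

```python
def grade_score(reading=None, math=None):
    """Average reading and math proficiency rates; if only one
    subject was reported, use that subject's rate by itself."""
    reported = [r for r in (reading, math) if r is not None]
    if not reported:
        return None  # state reported nothing for this grade
    return sum(reported) / len(reported)

# 4th grade figures cited in the post:
kansas = grade_score(math=84.4)     # 84.4 -- math only
missouri = grade_score(math=43.0)   # 43.0 -- math only
nevada = grade_score(reading=41.5)  # 41.5 -- reading only
```

The problem is visible right in the fallback: a state that happens to report only its stronger subject carries that full number into the index, with nothing pulling it back toward a two-subject average.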

The other paragraph of importance:

Some states did not submit data for some measures. In those cases, states were, for ranking purposes, assigned the median value of those states that did submit data.

27 states had at least one area where they didn’t report a score; Washington State, for example, didn’t report scores for 8th grade math and reading because our big state assessment has been in the 7th grade. New York leads the pack with 6 different areas where it didn’t report a score. For some states Carey was able to use a score from a different test, the way I described above, but those states that had no alternate score to rely on are being judged by the standards of what other states are doing, and that doesn’t make sense.
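The median-fill rule works out like this. Again, a hypothetical sketch rather than the report’s actual code, with made-up state values: any state missing a measure gets assigned the median of the states that did report it.

```python
from statistics import median

def fill_missing(scores):
    """Replace each missing value (None) with the median of the
    values that were actually reported."""
    reported = [v for v in scores.values() if v is not None]
    med = median(reported)
    return {state: (v if v is not None else med)
            for state, v in scores.items()}

# Hypothetical proficiency rates; State D reported nothing.
scores = {"A": 80.0, "B": 50.0, "C": 65.0, "D": None}
filled = fill_missing(scores)  # D is ranked at 65.0, the median
```

State D ends up ranked entirely on what States A, B, and C chose to report, which is the objection above: a non-reporting state is judged by everyone else’s numbers, not its own.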

In sum, the goal of what Carey has done here is worthwhile. One of the reasons we’re all in the trouble we’re in now is the games that have been played with data over the years. Carey, though, commits the same sin he’s taking the states to task for, and it lessens the overall impact of what could have been a pretty important piece of work.

1 Comments:

Blogger Mike in Texas said...

I don't really even feel the need to read the study to know it's bogus. Making up indexes to measure schools is a favorite tactic of the "reform" crowd.

9:22 AM  
