Clearly we shouldn't believe Mayor Daley when he says the Chicago Public Schools are "on the way to becoming the best urban school district in the nation." But does the unprecedented rise in test scores tell us anything at all? The various articles in the press are pretty confused on the matter, and offer all kinds of reasons for the gains without saying anything definitive.
One possibility mentioned -- and one that's supported anecdotally by a CPS teacher I know -- is that teachers are teaching to the test. But it's not clear why things should have been any different in 2006 than they were in 2005; didn't teachers know how to teach to the test then? Perhaps the effort was more systematic and coordinated; if so, that would be news.
Another problem with the results is that for the test which showed the highest gains, they changed the passing requirement:
The jump in eighth-grade math -- once the hardest test to pass -- was astronomical, from roughly 33 percent passing to 66 percent. However, that increase came after state officials lowered the passing score from the 67th to the 38th percentile.

If I understand percentiles properly, they are scaled against the raw test score based on where the test-taker would be expected to fall in a distribution (presumably of the population you're scoring against). So if you set the bar at the 67th percentile, you would expect 33% to pass, provided the distribution of your test-takers looked like the norming population; and of course this is exactly what happened. Then if you lower the bar to the 38th percentile, you should expect 62% to pass, which again is pretty close to what happened. Whether that extra four percentage points beyond what might have been expected (from 62% to 66%) is significant or otherwise explicable I can't say -- again, that might be news. My point, though, is that most of this particular increase is a statistical triviality built into the way those scores are calculated and where the bar is set.
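To make that arithmetic concrete, here's a quick sketch in Python (mine, not from any of the articles). The normal distribution and the sample sizes are arbitrary assumptions; only the percentile logic matters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend the norming population's raw scores are standard normal;
# the specific distribution is irrelevant to the percentile logic.
norming_scores = rng.normal(size=100_000)

# Test-takers drawn from the same distribution as the norming
# population -- the assumption discussed above.
test_takers = rng.normal(size=100_000)

for cutoff_percentile in (67, 38):
    # The raw score needed to sit at this percentile of the norms.
    passing_score = np.percentile(norming_scores, cutoff_percentile)
    pass_rate = np.mean(test_takers >= passing_score)
    print(f"cutoff at {cutoff_percentile}th percentile -> "
          f"{pass_rate:.0%} expected to pass")

# Prints roughly 33% and 62% -- so most of the jump from 33% to 66%
# passing follows from moving the bar, not from better performance.
```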
(I'm also aghast that the old passing standard was the 67th percentile; this means 2/3 of students were expected to fail. Maybe math skills are just so unfathomably bad that it's appropriate to fail this many students, but it really makes me wonder 1) how they arrived at that requirement and 2) how they decided to change it.)
Other possible reasons given for the increase -- more colorful testing materials, more time given to students to complete the test -- shouldn't make a difference on a test like this if it's properly scaled: if the percentile cutoffs are normed against students tested under the same friendlier conditions, the passing raw score rises right along with everyone's raw scores. Of course, it's good that they're making progress on test design.
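A minimal illustration of that scaling point, continuing the toy setup above (the flat score bonus is a made-up assumption, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
base_scores = rng.normal(size=100_000)

# Suppose extra time or friendlier materials add a flat bonus to
# everyone's raw score.
boosted_scores = base_scores + 0.5

# If the cutoff is re-normed against scores earned under the same
# conditions, the pass rate doesn't move...
cutoff = np.percentile(boosted_scores, 38)
print(f"re-normed:  {np.mean(boosted_scores >= cutoff):.0%} pass")

# ...but if a stale cutoff from the old conditions is kept, the
# easier conditions inflate the pass rate.
stale_cutoff = np.percentile(base_scores, 38)
print(f"stale norm: {np.mean(boosted_scores >= stale_cutoff):.0%} pass")
```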