Sunday, April 17, 2011

The mythology of educational numeracy

The other day MinnCan came out with a new issue brief, undoubtedly written by its contractor 50CAN*, titled Building a Barometer of Teacher Effectiveness in Minnesota (pdf).  The brief is built upon an unhealthy belief in the policy value of suspect numbers in general and student test scores in particular.  Focusing on the evaluation of teacher quality, the brief alleges that Minnesota does a poor job of it, and that what evaluations do exist don't carry real "consequences," which is apparently hurting education in the state:
Ineffective evaluations. Because teacher evaluations are rare, seldom based on how much students are learning, and almost never have consequences, they have little impact on the quality of teaching in our public schools.
Packed into this one bullet point are a number of fallacies that education deformers rely on for their attacks on public school teachers. These misdirections pale in comparison to the larger contextual omission of the report, and indeed of all of MinnCan's ministrations: focusing only on teacher quality to bridge educational gaps and improve learning is not likely to change outcomes.  Additionally, the way the deformers are approaching teacher quality will in all probability actually hurt educational outcomes.

The factors that influence the success of any particular student are well known: roughly 60-70 percent is due to family and student circumstances, and 30-40 percent is due to the school, classroom and teacher combined. Of that, roughly 20 percent is attributable to the school itself, which leaves only about 15-20 percent of student outcomes attributable to a particular classroom and teacher. Given that there are very few so-called "bad teachers," and that today even Teach For America's Wendy Kopp admits that not all people in any profession can be exceptional, there is very little fertile ground for increased educational attainment to be gained by focusing on teachers.
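For illustration, here is a rough back-of-the-envelope tally of those shares. The midpoints below are my own rounding of the ranges cited above, not figures from any particular study:

    # Approximate midpoints of the ranges cited above; illustrative
    # assumptions only, not estimates from any specific study.
    shares = {
        "family and student circumstances": 65,  # cited range: 60-70%
        "the school itself": 20,                 # roughly 20%
        "classroom and teacher": 15,             # cited range: 15-20%
    }

    for factor, pct in shares.items():
        print(f"{factor:<35} ~{pct}%")
    print(f"{'total':<35} ~{sum(shares.values())}%")  # roughly 100%

Even at the high end of the range, the slice attributable to a particular classroom and teacher is dwarfed by the out-of-school share.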

Indeed, parts of the educational and political establishment have come to this view. Studies by economists have shown that great future benefits can be attained, educationally, economically and socially, by investing in children at an early age.  Yet that's not what the education deformers and MinnCan focus on. They are fixated on squeezing blood from the rock of "teacher quality." In pursuit of these dubious goals they are more than willing to de-professionalize and demoralize the teaching corps.

The same is true for general investments in our schools. Research has shown that the school itself accounts for about 20 percent of educational outcomes. Why not invest in schools, making them welcoming environments, open early and late, and encouraging them to become hubs for their communities? But in Minnesota we're doing just the opposite: over the past two years $2 billion has been withheld from them. Maybe MinnCan will take note of the funding crisis now that even the state's charter schools are having problems.  Alas, so far not a peep from MinnCan about the decrease in funding to schools. All their eggs are in the "teacher quality" basket.

Fascination with, and abuse of, numbers for social science purposes is legendary in post-World War II America. Led by the Rand Corporation, American policy makers sought to reduce understanding of complex human behavior to numbers. According to Alex Abella, author of a book on Rand titled Soldiers of Reason, Rand would “...ruefully acknowledge the futility of trying to reduce human behavior to numbers.” Rand's discovery led to other axioms about what happens when numbers are elevated from a diagnostic tool into a policy tool. Donald Campbell coined his law in 1976:
"The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor."
You'd think this would give pause to education deformers who seek to create rating systems built on test scores that would "have consequences" in education systems. At most, student test scores might be useful in evaluating entire schools or districts. Test makers and researchers alike agree that trying to tease out individual teacher effectiveness from student test scores is folly; there are too many confounding variables. But in their zeal the deformers have no room for such contrary thoughts. Campbell could have added that such hubris tends to indicate corruption in those who propose such actions.

The MinnCan brief is chock full of proposals to create statistical indices tying student and teacher performance together, including the ultimate goal of "[u]pdat[ing] our data systems to link student performance to individual teachers and enable tracking of student learning over time." The way they put this idea makes it seem merely a matter of mathematics. If only. It would be nice, too, if we could devise a formula to convert lead into gold, which would probably be an easier project.
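For concreteness, here is a minimal, hypothetical sketch of what such a linkage looks like as data; the field names and values are invented, not taken from MinnCan or any state system. Assembling the table is the trivial part; the trouble starts when you try to attribute the score changes in it to an individual teacher rather than to everything else going on in a child's life:

    # Hypothetical longitudinal records linking students, teachers and scores.
    # All identifiers and scores below are invented for illustration.
    records = [
        # (student_id, school_year, teacher_id, reading_score, math_score)
        ("S001", "2009-10", "T17", 41, 47),
        ("S001", "2010-11", "T23", 44, 52),
        ("S002", "2009-10", "T17", 68, 71),
        ("S002", "2010-11", "T23", 60, 69),
    ]

    # "Tracking student learning over time" is then just a lookup...
    history = [r for r in records if r[0] == "S001"]
    print(history)

    # ...but deciding how much of the change from 41 to 44 belongs to
    # teacher T23, as opposed to family circumstances, school conditions
    # or measurement error, is the part no database schema can settle.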

It takes a lot of chutzpah to advocate that the state spend enormous sums of scarce education money developing unreliable data systems targeting a small percentage of school teachers who only control 15 to 20 percent of student outcomes in any case. Just how unreliable are these so-called "value-added" measures?  Even the best VAMs have a margin of error of 28 percent. That means an "average" teacher could be rated anywhere from 22 to 78 percent effective. An EPI briefing paper concluded that "value-added measures" were extremely unstable with respect to individual teachers over two or more years (a simple simulation of why this happens follows the excerpt below):
One study found that across five large urban districts, among teachers who were ranked in the top 20% of effectiveness in the first year, fewer than a third were in that top group the next year, and another third moved all the way down to the bottom 40%. Another found that teachers' effectiveness ratings in one year could only predict from 4% to 16% of the variation in such ratings in the following year. Thus, a teacher who appears to be very ineffective in one year might have a dramatically different result the following year. The same dramatic fluctuations were found for teachers ranked at the bottom in the first year of analysis. This runs counter to most people's notions that the true quality of a teacher is likely to change very little over time and raises questions about whether what is measured is largely a "teacher effect" or the effect of a wide variety of other factors.
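A back-of-the-envelope simulation shows how churn like that described in the excerpt can arise from noise alone. The sketch below assumes each teacher has a fixed "true" effect and that year-to-year differences come entirely from random error; the noise level is my own assumption, chosen to be large in keeping with the wide margins of error reported for these measures, and is not calibrated to any real data set:

    import random

    random.seed(0)
    N = 1000  # hypothetical teachers

    # Each teacher's "true" effect is fixed; only the noise changes by year.
    true_effect = [random.gauss(0, 1) for _ in range(N)]

    def noisy_year():
        # Observed score = true effect + a large error term (assumed).
        return [t + random.gauss(0, 1.5) for t in true_effect]

    def percentile_ranks(scores):
        order = sorted(range(N), key=lambda i: scores[i])
        ranks = [0.0] * N
        for position, i in enumerate(order):
            ranks[i] = 100.0 * position / N
        return ranks

    year1 = percentile_ranks(noisy_year())
    year2 = percentile_ranks(noisy_year())

    top_year1 = [i for i in range(N) if year1[i] >= 80]   # top 20% in year one
    stayed = sum(1 for i in top_year1 if year2[i] >= 80) / len(top_year1)
    sank = sum(1 for i in top_year1 if year2[i] < 40) / len(top_year1)

    print(f"still in the top 20% the next year: {stayed:.0%}")
    print(f"fell to the bottom 40% the next year: {sank:.0%}")

With nothing about the teachers changing at all, a sizable share of year-one "stars" drop out of the top group the next year simply because the measurement is noisy, which is the pattern the EPI paper reports.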
In New York City, where value-added measures have now been kept for two years, one teacher reported on his blog that his Teacher Data Report score for last year was under 10 and this year is over 95 - an 85-point swing from one year to the next. The teacher, a veteran, reports doing nothing substantially different between the two years. The New York system, supposedly one of the most advanced in existence, has a stated margin of error of 35 points, which of course is wildly exceeded in this one anecdotal case. More honest analysts describe VAM measures as "weak and error-prone." But not the geniuses at MinnCan.
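A quick check of the numbers in that anecdote shows why the stated margin of error is "wildly exceeded." Treating the error band naively as a simple plus-or-minus interval (my simplification), the two reported scores cannot be reconciled even at their extremes:

    margin = 35            # stated margin of error, in points
    year1, year2 = 10, 95  # approximate scores from the anecdote above

    gap = year2 - year1                          # an 85-point swing
    print(gap > 2 * margin)                      # True: 85 > 70
    print(year1 + margin, "<", year2 - margin)   # 45 < 60: the bands never meet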

If you throw in Campbell's Law, rampant cheating and teaching to the test, the measures become that much less reliable.  For MinnCan and the education deformers these facts are irrelevant. That is because theirs is a political crusade to privatize and commercialize public education, not to improve the learning of children. Real concern for education would include considering the possibility that your actions will degrade, not improve, education, and that there are other, more fruitful avenues, such as early childhood intervention, that hold far more promise.

Recently the Minnesota Legislature showed what happens when irresponsible organizations like MinnCan propagate false information about schools and teachers. Armed with studies from the education deformers, Minnesota Republicans proposed legislation that would have ended teacher tenure, instead basing retention and teacher salaries on value-added measures that put a 50 percent weight on student test scores. This is what MinnCan means when it talks about how numbers should "have consequences."

Far from improving students' education, MinnCan and the deformers at the legislature are instead dead-set on ruining teachers' reputations and encouraging them to quit their profession, all by using bogus numbers and studies. The only remaining question is how much of this agenda will pass, given Mark Dayton's proclivity to compromise on education issues.
*  *  *
*According to MinnCan's prospectus, it will pay 50CAN 40 percent of its budget - $1.5 million over three years.
