Reading a new study from a new progressive think tank comparing charter school accountability scores (as measured by the Department of Public Instruction’s new report cards) with regular public school accountability scores left me pondering a few questions:
- Why did the University of Illinois at Chicago professor who conducted the statistical analysis (a researcher with a long track record in diabetes research) not author the report? Why were the conclusions from the analysis left to others?
- Why did the authors make such a strong conclusion: “The data clearly show that public schools are doing a better job offsetting the effects of poverty on education than their charter school counterparts”?
- Why didn’t the authors consider that charter schools are not all created equal?
- Why didn’t the authors mention that different types of schools are measured in different ways by the DPI report cards?
Most troubling to me was the false premise on which this study is built. No responsible charter advocate is arguing that charter schools are automatically higher achieving than regular public schools. The basic premise of charter schools is that schools failing to deliver promised results lose their charter.
I could go on like this, but instead I had a little fun this morning and re-created the study, using the authors’ methodology to test another question. First, I built a dataset from the report card data and successfully re-created Table 1 on page 10 of the report, confirming that I had the data and approach right. Second, I compared student growth scores rather than overall accountability scores.
This approach has the drawback of not including high schools, which have no growth data on the DPI report card. However, I still think this is a worthy exercise; increasing test scores is an important function of schools serving low-income pupils, and it makes more sense than comparing accountability scores built on different assumptions across schools.
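Mechanically, the comparison boils down to bucketing schools by their share of economically disadvantaged pupils and averaging growth scores within each bucket. Here is a minimal sketch in Python; the records and the income cutoffs below are made up for illustration (the Forward Institute’s actual cutoffs, which I matched in my exercise, are different):

```python
from collections import defaultdict

# Hypothetical records: (sector, pct economically disadvantaged, growth score).
# These are placeholder values, not the DPI report card data.
schools = [
    ("charter", 12.0, 75.1), ("regular", 18.5, 70.2),
    ("charter", 45.0, 71.3), ("regular", 52.0, 66.0),
    ("charter", 81.0, 64.5), ("regular", 78.0, 63.9),
]

def income_bucket(pct_disadvantaged, low_cut=33.0, high_cut=66.0):
    """Classify a school as high/middle/low income by poverty share.
    The cutoffs are hypothetical stand-ins, not the report's."""
    if pct_disadvantaged < low_cut:
        return "high-income"
    if pct_disadvantaged < high_cut:
        return "middle-income"
    return "low-income"

# Collect growth scores per (income bucket, sector) cell, then average.
cells = defaultdict(list)
for sector, pct, growth in schools:
    cells[(income_bucket(pct), sector)].append(growth)

for key in sorted(cells):
    scores = cells[key]
    print(key, round(sum(scores) / len(scores), 2))
```

With real data, each cell would hold dozens of schools rather than one, and the cell means are what get compared across sectors.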
Below are the mean growth scores for charter and regular public schools serving high-income, middle-income, and low-income pupils. (Note: I used the same cutoffs as the Forward Institute report for what constitutes a high-, middle-, and low-income school.)
| Income category | Charter | Regular Public |
|-----------------|---------|----------------|
| High-income     | 74.09   | 69.44          |
| Middle-income   | 70.46   | 65.51          |
| Low-income      | 64.21   | 63.62          |
In the high- and middle-income categories, charters hold statistically significant advantages in student growth scores at the 95% confidence level. Among low-income schools, the difference between charter and regular public schools is not statistically significant.
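For readers who want to check this kind of claim themselves: a comparison of two group means with unequal variances is commonly done with Welch’s t-test, which can be sketched in plain Python. The growth scores below are invented for illustration, not the report card data:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of
    freedom, which do not assume equal group variances."""
    va, vb = variance(a), variance(b)  # sample variances (n - 1)
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb            # squared standard error of the difference
    t = (mean(a) - mean(b)) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical growth scores for two groups of schools.
charter = [76.2, 71.5, 74.8, 73.0, 75.9, 72.1]
regular = [68.4, 70.2, 69.9, 67.8, 71.0, 69.3]

t, df = welch_t(charter, regular)
print(f"t = {t:.2f}, df = {df:.1f}")
```

The resulting t statistic is compared against the critical value for the computed degrees of freedom; at the 95% confidence level, a |t| beyond roughly 2 on samples of this size indicates a significant difference.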
So what does this exercise prove? Not much. At the very least it suggests the conclusion of the Forward Institute report is absurd. Presumably if one subset of schools were “offsetting the effects of poverty on education” better than the other it would show up as a difference in growth scores.
But really, I think this exercise demonstrates how the veneer of statistics can be used to draw conclusions that are completely unjustified. Information is a powerful tool in education; we should use it to improve the quality of Wisconsin’s education system rather than misuse it to fight ideological battles.