Pro-charter think tank in New Orleans forced to retract flawed study

Last Friday, a pro-charter school think tank in New Orleans withdrew a widely publicized report on student achievement after its methodology was found to be fundamentally flawed.

The think tank, The Cowen Institute for Public Education Initiatives, was founded in 2007 at Tulane University by Scott Cowen, then the university's president. It specializes in tabulating statistics on (and promoting) charter schools in New Orleans, which became the country's first majority-charter-school district after Hurricane Katrina in 2005. Today, only 6 of the city's 82 schools are public.

The Cowen Institute, like other charter-aligned research institutions, also releases studies for public consumption meant to build support for charter schools. Such studies naturally appear authoritative because they carry the imprimatur of Tulane University, which regularly ranks among the top 50 colleges and universities in the country.

This was the intent of the study released October 1. Entitled "Beating the Odds: Academic Performance and Vulnerable Student Populations in New Orleans Public High Schools," it purported to use a "value-added model" (VAM) to assess student performance at New Orleans high schools, factoring in such variables as the percentage of students held back by the 9th grade or eligible for federally subsidized lunches. The researchers claimed to find that most high schools in New Orleans met or exceeded their predicted performance on three indices; these schools were then said to have "beaten the odds."

VAMs, supposedly used to measure the contributions of teachers to the academic performance of students, are themselves an extremely dubious and controversial methodology. They are frequently pushed by charter groups for use in high-stakes evaluations as a method of scapegoating teachers for poor student test scores. In April of this year, the American Statistical Association released a statement cautioning against excessive reliance on VAMs, noting that “VAM scores and rankings can change substantially when a different model or test is used.”

However, the Cowen Institute study actually contained no value-added modeling at all. Instead, using data from high schools throughout the state of Louisiana, researchers constructed a multiple linear regression (MLR) model to predict school performance based on four criteria. The predicted values merely trace a "line of best fit" through the data points and do nothing to assess the performance of individual schools. Moreover, by construction, such a model guarantees that some data points will fall below the fitted line and some above.

This means that the schools the researchers said had "beaten the odds" had done nothing of the kind: landing above a regression line is "beating the odds" in the same sense that correctly calling a coin flip six times out of ten is "beating the odds."
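The point can be demonstrated in a few lines. The following sketch uses entirely synthetic data (the predictor, outcome, and sample size are invented for illustration, not taken from the Cowen Institute's study): fit an ordinary least-squares line to the data, then count how many "schools" land at or above their predicted value. Roughly half always will, because that is what a line of best fit does.

```python
# Illustrative sketch with made-up data: an OLS "line of best fit"
# necessarily leaves about half the points above it.
import numpy as np

rng = np.random.default_rng(0)
n_schools = 100
poverty_rate = rng.uniform(0.2, 0.9, n_schools)               # hypothetical predictor
score = 80 - 30 * poverty_rate + rng.normal(0, 5, n_schools)  # hypothetical outcome

# Least-squares fit: predicted score as a linear function of the predictor.
slope, intercept = np.polyfit(poverty_rate, score, 1)
predicted = slope * poverty_rate + intercept

above = int(np.sum(score >= predicted))
print(f"{above} of {n_schools} schools 'beat the odds'")  # roughly half, by construction
```

With an intercept in the model, the residuals sum to zero, so a school sitting above its predicted value says nothing more than that some schools must.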

The Cowen Institute only retracted the study last Friday, nine days after it was first published. Executive Director John Ayers released a tersely worded statement admitting that the study was "flawed" and "inaccurate," while vowing that the Cowen Institute would "thoroughly examine and strengthen its internal protocols" in light of the debacle.

However, the errors in the research were so basic that it strains credulity that any professional statistician could have made them merely by accident. A far more likely explanation is that the researchers were fishing for conclusions they had reached in advance. Moreover, the two authors of the report are high-ranking members of the Cowen Institute. Patrick Sims, the study's lead author, is the Institute's senior research analyst; his colleague, Debra Vaughan, is the director of research.

The actual results for the New Orleans schools on two of the three indices were well within the statistical norm. Sixty percent of New Orleans schools performed at or above their predicted course graduation rates. Roughly equal numbers of schools performed above and below their predicted scores on the ACT Index, a common college entrance exam. The study's own appendix concedes that these figures are largely what the statistical model itself would produce.

Nevertheless, these figures had been parroted for days by the media, charter school groups and education officials, including the superintendent of the city’s Recovery School District, as proof that the majority of schools in the city, and therefore also charter schools, were lifting students out of poverty. As an article in the New Orleans Times-Picayune put it, the report “[suggested] that demographics aren’t destiny.” Friday’s retraction precipitated a humiliating climb-down. The Picayune, for its part, did not explain the basic methodological error to its readers in an article noting the retraction, writing only that VAMs were “controversial.”

The entire affair is an exposure of the charter school movement as a whole, which has always adhered to a double standard when it comes to performance measurements. For public schools, no standard is too exacting or severe; when schools are converted into charters, however, all such standards immediately evaporate.