Yo Teach…! Or how to avoid teaching like Jason

Closing the Teach For America Blogging Gap
Aug 06 2013

Asking questions when you know the answers: a case study in confirmation bias, EdNext edition

Before I discuss my critiques of market-based school reforms more comprehensively, I thought I’d write about this new article in Education Next, written by Marc Holley, evaluation unit director at the Walton Family Foundation. The purpose of the study is to measure the effect of school choice on district schools. To do this, it performs a qualitative analysis of 8,000 print/digital articles from twelve urban districts heavily exposed to school choice, classifying each article as describing either a constructive or an obstructive response to competition by the public district schools. The authors then “reviewed minutes from school board meetings, district web sites, and other district artifacts to verify if, in fact, the practices and policies described in media reports have occurred.” It’s unclear whether they counted only evidence that was corroborated by these reviews, or simply anything that wasn’t dismissed. The main results can be seen here:

While I appreciate qualitative analyses as well as assessments of charter schools that do not simply compare district and charter test scores (since underlying theory suggests charters should improve, not necessarily outperform district schools), this piece is pretty frustrating, unconvincing, and more misleading than constructive. Here are a few reasons why.

1. Notice how emulating charter practices, collaborating with charter schools, and contracting CMOs are listed as unambiguous positives. This correctly shows that schools are adopting a more competitive and growth-oriented mindset, but not that such a mindset is beneficial. Most charter schools perform about the same as district schools, and some prominent charter schools perform a lot worse. Thus, importing “charter” practices could be great or horrible. The fact that this study doesn’t differentiate undermines both its findings and its initial assumptions.

2. The study’s methodology is founded upon looking at newspapers. I don’t know if this is universally true, but a lot of schools with poor leadership, deprofessionalizing policies, or inhumanely competitive mindsets are fantastic at getting the press to gush over them. They also tend to prioritize website aesthetics and lack a critical board. This school looks great, right? It also loses a scary proportion of its successful (and unsuccessful) teachers each year (I’ll put myself in the middle) because of its embodiment of everything wrong with ed reform. Clearly, what gets published about a school does not always correlate with what actually goes on in a school. That’s why we have researchers…like Mr. Holley…to dig through the PR spin. Point three helps explain why such a toothless methodology was used.

3. I’m not one to criticize an article based on the characteristics of its author, but it’s significant that this person works for the Walton Family Foundation. Would he have a job, or get published in EdNext, if his study didn’t put a positive spin on school choice (even if it had a neutral spin)? Of course not. This explains why the methodology seems to have ensured that the results would be positive before the study had even begun.

In general, I think there is a lot to be learned by understanding how organizations respond to competition. I wrote my thesis on the topic, based on school choice in Sweden. But serious qualitative organizational research means actually observing and talking to random samples of people involved in these organizations, starting from a reasonable null hypothesis, and relying on independent researchers. Not this. We deserve better.

