MIT's New Study of Boston Charters

Some MIT and Harvard economists studied Boston and Massachusetts charter schools in 2009 and 2010.  They also studied Boston pilot schools -- "in-district" charter schools with less freedom than charters but more freedom than traditional schools. 

Boston charters did well in the 2009 study: students made striking gains on math and English tests.  Non-Boston urban charters did not do as well in the 2010 study, and neither did Massachusetts suburban charters.  Pilots did not do as well in the 2009 study.  

This led to some new questions.  Maybe the Boston charters do well on the MCAS because they "teach to the test."  Perhaps on other tests, the kids would do poorly. 

And while the 2009 study controlled for "parent effect," maybe there was something else, called "peer effect," that led to the results. 

In other words, perhaps the charter school teachers weren't doing anything better.  Maybe there were other explanations.

So the economists went back to work.  Their new study was published yesterday.  Here is the Boston Globe story.  

The biggest bounce in achievement occurred on the SAT. On average, charter school students scored 100 points higher than their peers in other public schools, according to the study, which was prepared by the School Effectiveness and Inequality Initiative at the Massachusetts Institute of Technology.

...“If there was any justification remaining for limiting growth of these successful charter schools, I don’t know what it is,” said Paul Grogan, president of the Boston Foundation. “How can we place a limit on an educational phenomenon that is delivering these results?”

Five quick reactions:

1. Test scores on the SAT did go up, so it wasn't just an MCAS effect.  The economists also ruled out peer effect as the cause.  At some point Occam's Razor will take over: the more obvious causes -- the longer hours, the higher standards, the careful team building, the enormous teacher energy invested in building relationships with students -- will be accepted, and the world will better understand why some charters do well and others are crappy.  

2. Thank you to the scholars, including our acquaintances Josh Angrist and Sarah Cohodes.  This sort of careful study takes enormous time and energy.  And thanks also to the people on the ground who helped them assemble data: Kamalkant Chavda from BPS and Carrie Conaway from DESE. 

3. I appreciate this, too:

Carol R. Johnson, Boston school superintendent, has strived to build a collaborative relationship with charter schools, and she said she intends to review the report and identify any best practices that can be adopted by the city’s school system.

Carol is a wonderful woman.  Her husband recently passed away and she will be retiring.  The openness to "what works" discussions is genuine.

4. I suspect teachers and leaders in all six charter schools studied have two reactions. 

a. We're proud of the students.  They're the ones getting up early and getting home late.  They're doing all the work. 

b. And we can do much better.  We need to find ways to improve.  In the long run, we want to be accountable for the college success, and then the "life success," of our students. 

5. It's interesting how things get framed. 

Even on the one area where charter school performance did not quite measure up, high school graduation rates, charter advocates cast the data in a positive light. At charter schools, 59 percent of students graduated within four years, 10 percentage points lower than their peers in traditional schools.

The advocates say the rate is lower because charter schools set a high standard for graduation, and it takes some students one or two additional years to earn a diploma.

They point out that the study found that 82 percent of charter school students graduate within six years, compared with 78 percent for traditional school students.

This is an area where politics trumps any possible education interest. 

a. Many high school district teachers I know are frustrated -- they feel students who don't try at all in their class nonetheless get socially promoted by administrators. 

b. District administrators feel pressure to push up the 4-year graduation rate.  So they move kids up who haven't learned. 

c. The district leaders face acute pressure from outside.  They're given hard targets to hit, or else. 

Charters don't respond to those pressures the same way.  The default is: support the teacher in making promotion decisions, in setting the bar.  Not always, but mostly. 

So what happens when the socially promoted students get their high school diploma?  They hit the real world.  Their district teachers were unable to hold them accountable.  They haven't had to show grit, and they have lower academic skills. 

Nobody wins.

Of course that's partly speculation on my part.  We'll have to see whether the Boston charters really do produce better labor market outcomes and better college graduation rates.  We'll check back with these MIT economists in a few years, and they'll be ready to answer these questions.