Measuring the Effect of P.D.

A buddy emailed me today:

X organization, a client of mine, wants to do boring professional development for its staff. I'm telling them: bad idea. Do you have any kind of data that backs up that notion?

A refresher: a few months ago I blogged about an experiment in which 195 teachers each got 68 hours of training, while a control group got about 13 hours.

Did the students of the "treated" teachers end up learning more than the students of the "untreated" teachers?


The study cost $108,000 per treated teacher, or $21 million in total.

But that was just one type of training. What about more broadly? What do we know about professional development of teachers?

Not much.

Matt Kraft is a doctoral student at Harvard. He and his colleague Shaun Doherty ran a randomized trial for our Teacher Residency last summer.

Matt sent me a paper from 2007 with this exciting title: "Reviewing the evidence on how teacher professional development affects student achievement."

Of the more than 1,300 studies identified as potentially addressing the effect of teacher professional development on student achievement in three key content areas, nine meet What Works Clearinghouse evidence standards, attesting to the paucity of rigorous studies that directly examine this link.

How much paucity? To be precise:

1,300 studies of how "professional development" helps teachers.

Only 9 of those 1,300 meet rigorous evidence standards.

And 0 of the 9 are about training for middle and high school teachers.


As a field, we just don't know much about how to help teachers teach better.