Why Journalists Should Learn to Use Calculators

Not too long ago I wrote about this study by Raj Chetty and associates that purported to indicate that through the use of value-added modeling we could determine a lot about a child’s future income based on his or her teacher. That is, teachers with high value-added scores had significant lifetime impacts on their students as measured by future income.

That’s what Chetty was telling everyone about his research findings, and the mainstream media loved it.

However, as I noted in my article urging caution on the use of value-added data to evaluate teachers, a simple calculator shows that the effects Chetty claimed weren’t that significant at all: perhaps as small as a few hundred dollars, maybe a thousand, in salary difference per year. But Chetty aggregated his findings across a class of 28 students, so when you stated the impact out loud, it sounded impressive: $250,000 to $1 million over the lifetime of the entire class.

Dividing those numbers by 28 (the number of kids in the class) and then by 40 (the working life of someone who gets a job at 22 and retires at 62) yields distinctly less impressive numbers.
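The calculator work is simple enough to sketch in a few lines. This uses the aggregate figures and class size cited above; the 40-year career span is the assumption stated in the post (working from age 22 to 62):

```python
# Chetty's headline figures: $250,000 to $1 million in lifetime
# earnings gains, aggregated over an entire class.
CLASS_SIZE = 28   # students per class, per the study
WORK_YEARS = 40   # career from age 22 to retirement at 62

for aggregate_total in (250_000, 1_000_000):
    # Undo the aggregation: per student, per working year
    per_student_per_year = aggregate_total / CLASS_SIZE / WORK_YEARS
    print(f"${aggregate_total:,} aggregate -> "
          f"${per_student_per_year:,.0f} per student per year")
```

The headline-grabbing $1 million works out to under $900 per student per year, and the low end of the range to a little over $200.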

In fact, I’d argue such small effects actually indicate that value-added data doesn’t tell us much at all about the differences among teachers.

Now, as P. L. Thomas points out, there’s actually some scholarship that indicates that the Chetty findings aren’t exactly as they’ve been reported.

Not surprisingly, Chetty responds to defend his initial claims.

But the bottom line is this: reporters who used a calculator at the outset wouldn’t have been so impressed by Chetty’s claims. And, as Thomas urges, the mainstream media should set the record straight. They should state explicitly that most of them got the Chetty results wrong: they didn’t use calculators, they simply accepted impressive-sounding research from some bright people at Harvard.

But that would mean that all those states basing teacher evaluation on value-added scores have been wasting their money. And it would mean admitting a pretty embarrassing error. While statistics can be complicated, the idea of aggregating data to make research claims sound more impressive isn’t complicated at all. A journalist hearing or reading such claims for the first time ought to at least dissect them and do some basic math.

In this case, the numbers simply don’t add up.

For more on education politics and policy, follow @TheAndySpears

 

