The Absurdity of VAM

Donna Krache offers a very brief explanation of why using value-added modeling (VAM) as the key means of determining teacher effectiveness is seriously flawed:

The VAM system in New York also presents content challenges. Since the state’s tests are given only in math and English Language Arts, and since all teachers must be evaluated in part based on their students’ performance on these tests, then social studies teachers are often evaluated, in part, on how their students do on English Language Arts tests. Science teachers are often evaluated based on their students’ scores on math tests.

I don’t know how you feel, but the idea of awarding electricians their licenses based on how their plumber colleagues did on their tests makes me want to re-wire my house.

This problem is not unique to New York. And Krache's example doesn't even mention the challenge of evaluating related arts teachers, for whom there typically aren't (and shouldn't be) standardized tests.

One Tennessee teacher I spoke with teaches higher-level math. The classes she teaches are not among those covered by the state's End of Course (EOC) tests, so one year the value-added portion of her evaluation was based on Algebra I scores. The problem is that those students wouldn't reach her classroom (if they ever did) until they were juniors or seniors. These aren't merely students she may have encountered, like the kids a social studies teacher sees who also take the English EOC; these are students she had never had in her classroom at all.

So, she was evaluated based on students she never encountered – maybe she smiled at them or saw them in the cafeteria?

Who knows whether this teacher was or is a good upper-level math teacher? By instituting a system that assigns evaluation scores like this, state policymakers are signaling that they aren't actually concerned about teacher performance or student outcomes. Local school systems that comply with rather than protest such a system are complicit in the state's misguided policy.

Who suffers? Teachers, their students, and even parents — who are told these scores really do reveal something about teacher performance.

Evaluating teachers based on students they've never taught is bad enough. But some studies suggest that value-added data is susceptible to grade-level and subject-matter bias. That means even teachers who get VAM data for students they've actually taught may suffer at the hands of a seriously flawed system.

Then there are the concerns about the reliability of value-added predictions, which one scholar has suggested are “less reliable than flipping a coin.”

Yes, current applications of VAM are absurd. So far, that hasn’t stopped policymakers from requiring them.

For more on education politics and policy, follow @TheAndySpears
