The Pitfalls of Value-Added Data

A version of this post first appeared in Tennessee Education Report.

Valerie Strauss has an interesting piece over at the Washington Post dealing with Value-Added Modeling.  More specifically, the post analyzes what can be learned from 20 years of the Tennessee Value-Added Assessment System (TVAAS) implemented as a result of the Education Improvement Act — the Act that created the Basic Education Program (Tennessee’s school funding formula, also known as BEP).

The promise of Value-Added Assessment was that we could learn a lot about which schools were working and which weren’t.  We could learn a lot about kids and how they were progressing.  We could even learn about teachers and how they were doing with all their students and with specific groups of students.  With all this information, Tennessee would intervene and take action that would move schools forward.

Unfortunately, that promise has not been delivered.  At all.

Here, I highlight the key takeaways from the Strauss piece.  Tennessee parents and policymakers should take note: TVAAS consumes tax dollars and shapes teacher evaluations, yet it doesn't work all that well.

1. Using TVAAS masked persistently low proficiency rates.

The Tennessee value-added assessment model basically identified the schools that were already meeting required annual proficiency targets, but it failed to distinguish between schools with rising or declining proficiency scores.

In short, the Sanders Model did little to address the essential unfairness perpetuated by NCLB proficiency requirements, which insisted that students further behind, and with fewer resources than those in economically privileged schools, had to work harder to reach the same proficiency point.  More importantly, there was no evidence that the Sanders version of value-added testing did anything to help, or even predict, the future outcomes for those furthest behind.


2. TVAAS is unstable and inappropriate for high-stakes decisions — like hiring and firing teachers, renewing licenses, or determining pay.

And despite the National Research Council and the National Academies’ flagging of value-added assessment as too unstable for high-stakes decisions in education …

…states like Tennessee rushed to implement a federally recommended system whereby value-added growth scores would come to dominate teacher evaluation for educators who teach tested subjects.  And contrary to the most basic notions of accountability and fairness, the two-thirds of Tennessee teachers who teach non-tested subjects are being evaluated based on school-wide scores rather than their own.

3. Continued use of TVAAS as an indicator of “success” leaves the most vulnerable students further and further behind.

In a 2009 Carnegie-funded report, Charles Barone points out that focus on value-added gains, or growth in test scores, may downplay the need for interventions to address low proficiency rates:  “Due to the projection toward proficiency being recalculated annually [in the TVAAS model], there is not necessarily a significant progression, over time toward proficiency . . . causing a delay of needed intervention at appropriate developmental times” (p. 8). So while showing academic progress, gain scores or growth scores easily mask the fact that minority and poor children are far below their well-heeled peers in becoming intellectually prepared for life and careers. And in masking the actual academic progress of the poor and minority students, the state (and the nation) is let off the hook for maintaining and supporting an adequate and equally accessible system of public education for all students. At the same time, politicians and ideologues can celebrate higher “progress rates” for poor and minority students who are, in fact, left further and further behind.

4. Tennessee has actually lost ground in terms of student achievement relative to other states since the implementation of TVAAS.

Tennessee received a D on K-12 achievement when compared to other states based on NAEP achievement levels and gains, poverty gaps, graduation rates, and Advanced Placement test scores (Quality Counts 2011, p. 46).  Educational progress made in other states on NAEP [from 1992 to 2011] lowered Tennessee’s rankings:

• from 36th/42 to 46th/52 in the nation in fourth-grade math[2]

• from 29th/42 to 42nd/52 in fourth-grade reading[3]

• from 35th/42 to 46th/52 in eighth-grade math

• from 25th/38 (1998) to 42nd/52 in eighth-grade reading.

5. TVAAS tells us almost nothing about teacher effectiveness.

While other states are making gains, Tennessee has remained stagnant or lost ground since 1992 — despite an increasingly heavy use of TVAAS data.

So, if TVAAS isn't helping kids, it must be because Tennessee hasn't been using it right, right? Wrong. While education policymakers in Tennessee continue to push the use of TVAAS for teacher evaluation, teacher pay, and teacher license renewal, there is little evidence that value-added data effectively differentiates between the most and least effective teachers.

In fact, this analysis demonstrates that the difference between a value-added identified “great” teacher and a value-added identified “average” teacher is about $300 in earnings per year per student.  So, not that much at all.  In practical terms, that's a negligible difference.  That's not to say that teachers don't impact students.  It IS to say that TVAAS data tells us very little about HOW teachers impact students.

All told, Tennessee has spent roughly $326 million on TVAAS and its attendant assessments over the past 20 years. That's about $16 million a year on a system that is not yielding much useful information. Instead, TVAAS data has been used to mask a persistent performance gap between middle- to upper-income students and their lower-income peers.  Overall student achievement in Tennessee remains stagnant (which means we're falling behind our neighboring states) while politicians and policymakers tout TVAAS-approved gains as a sure sign of progress.

Despite mounting evidence against the utility of TVAAS, Commissioner Huffman and Governor Haslam announced last week that they want to “improve” Tennessee teacher salaries along the lines of merit — and in their minds, TVAAS gains are a key determinant of teacher merit.

Perhaps 2014 will at least produce questions from the General Assembly about the state’s investment in an assessment system that has over 20 years yielded incredibly disappointing results.

For a view on how use of value-added data for teacher evaluation and employment decisions has impacted Florida, read this teacher’s take.

For more on education policy and politics, follow me @TheAndySpears


