Archive for March 29, 2012

Sucked Into Value-Addled Teacher Evaluations

It’s been a month since my last post, and I have just about had it with all the flap over our new contract — specifically, the use of “value-added” student test scores in the evaluation of teachers.

Using “value-added” to evaluate teachers is actually, in my view, value-addled evaluation of teachers: it is senseless to hold teachers alone accountable for student achievement when so many other factors impinge upon student test scores. Supposedly, using value-added data wizardry, student test scores can objectively determine the “value” of what a teacher contributes to each student’s achievement in one school year. Assessment experts say the method is highly unreliable, but that hasn’t deterred policymakers in NYC, Chicago, and many other places from jumping on the value-added teacher evaluation school bus.

I’ve twisted the term value-added teacher evaluation into value-addled because no two teachers have the same students, classes, and resources, yet test scores are treated as objective, empirical, and quantitative: because test scores are numerical data produced by standardized tests, they are assumed to be “objective.” However, test scores are not truly objective data, and so they are no more reliable in value-added teacher evaluation; instead, they are inconclusive factors used to determine a teacher’s worth.

Below I will use four major problems with value-added teacher evaluation, outlined by the mathematician John Ewing in his Washington Post article debunking the method, to show why value-added teacher evaluation is truly value-addled teacher evaluation.

1. Influences.  Value-added teacher evaluations are actually value-addled because, as mentioned above, test scores are not truly objective.  In reality, test scores are influenced by many factors other than time spent with one teacher over the course of a single school year: parental support, level of achievement coming into the classroom, and the attitudes of peers and previous teachers.  It is impossible to isolate the influence of one teacher during one year from all these variables.

2. Polls.  The poll-like nature of one to three standardized tests per school year reduces value-added to value-addled because polls are reductive: a sample of student achievement covers a minute fraction of the much larger domain of curriculum taught over the course of a school year.  Student test scores do not represent how much has been learned of the material covered in a school year, and, unless numerous tests are given throughout the year, tests begin to resemble polls — they can be misleading.

3. Intangibles.  Value-added is also value-addled due to endless intangibles which impact test scores, such as attitude, engagement, self-motivation, and individual ability to learn independently.  The impact of such uncontrollable intangibles on student test scores rises sharply when students lack parental support and live in poverty.  Furthermore, these intangibles were up-front biases of standardized tests readily acknowledged by the “father of modern standardized testing.”  (See the Ewing article cited above for citations on the research.)

4. Inflation.  Value-addled is a better term for value-added teacher evaluations because test scores can increase without student knowledge increasing.  According to Ewing, this has been well documented, but it is largely ignored by many in the education “establishment.”  How is this possible?  As every teacher is well aware, teaching to the test — narrowing the curriculum to only what will be tested — can have a profound effect on test scores.  In fact, evidence shows that tunnel-vision teaching for tests can dramatically increase test scores yet decrease student learning: standardized test scores on limited curricular material are not the same as student achievement.

For further debunking of value-addled evaluation, I encourage you to read the Washington Post piece cited above, which I made liberal use of in the four points here.  Ewing makes it clear that value-added is just another falsity used to undermine public school teachers — and thus public education — in a mindless race to hold only teachers accountable for student achievement, when in the real world outside of politics, other major factors impact student achievement as well.
