FEA President Asks Governor to Suspend VAM

Florida teachers are poised to receive this year's VAM scores. Is the state pushing too hard and too fast? FEA President Ford has asked the state to temporarily suspend VAM.


A new analysis from the Manhattan Institute (a known mouthpiece for The Foundation on Education Excellence and a strong supporter of Jeb Bush's education policies) suggests that Value-Added Measurement (VAM) can be part of a "sensible policy for retaining a teacher," but it also calls VAM "imperfect," and the group partially validates the criticisms of VAM.

VAM Impacts Evaluations
Teachers Question Use of VAM
Read Kim's story: A former Teacher of the Year



Several weeks ago, in early November, President Ford sent a letter to Governor Scott urging him to use his executive authority to suspend the law requiring districts to apply VAM scores to teacher evaluations for the 2011-2012 school year. He also asked the Governor to halt the use of VAM until there is clear, consistent research that supports its use in evaluating teacher performance.

Read President Andy Ford's Letter to Gov. Scott
Gov. Scott's Response to FEA
FLDOE Commissioner Pam Stewart's Response to FEA
President Ford's Response to Gov. Scott and FLDOE

FEA is concerned about how this year's deeply flawed VAM scores will play out in the future.

 

In three years, this data is supposed to be part of the awarding of merit pay. How will this play out when the DOE cavalierly suggests rating a "teacher's performance as 'effective' if one cannot statistically conclude whether the VAM score is above or below whatever threshold has been established in your system"?

FEA’s concerns for the 2011-12 reporting are as follows:

  • There was insufficient data.
  • There are inaccuracies in VAM scores even after reanalysis.
  • The 2011-12 process that linked teachers with their students was inadequate.
  • The timelines imposed by the VAM data cycle are unworkable and untimely.
  • Districts received final VAM reports well after the 90-day window designated in SB 736.




 


Value-Added Measurement, or VAM, is one of the most controversial topics in education. You could call it the current education reform fad. An increasing number of states are looking at using it for teacher evaluations, even though great debate remains over whether it produces data of good enough quality to have a significant impact on student achievement and teacher quality. There simply isn't enough research to support this use.

Florida tried a value-added approach to merit pay for schools in the 1980s. District statisticians found that their value-added models identified different schools as meritorious depending on which factors they controlled for, and they realized there was no right way to decide what to control for. Still, a great deal of emphasis is being placed on VAM.

The FL-DOE was supposed to set score categories this year to help districts with assessments, but because an administrative judge's ruling nullified the complex formula the FL-DOE proposed for calculating the VAM score, each district had to decide on its own which teachers are highly effective, effective, needs improvement, or unsatisfactory. The state turned over preliminary VAM data to the districts last month. District administrators were to review the information, make corrections, and return the final data to the FL-DOE.

Value-added analysis provides an estimate of how well a teacher is doing at increasing student performance on standardized tests. Or, more accurately, it measures how well a teacher teaches to the test.

Policymakers say it provides an idea of how well a student will do on future tests by looking at past test scores (baseline) and comparing them to current scores, measuring student academic growth.

Academic growth = current/recent test performance − baseline (past test performance)
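The growth calculation described above can be sketched in a few lines. This is only an illustration of the basic idea; the scores are invented, and Florida's actual VAM formula is far more complex, controlling for many additional factors.

```python
# Minimal sketch of the "growth" idea behind value-added models.
# Scores below are hypothetical; real VAM formulas adjust for
# many covariates beyond a simple baseline subtraction.

def academic_growth(current_score: float, baseline_score: float) -> float:
    """Growth = current test performance minus past (baseline) performance."""
    return current_score - baseline_score

# Example: a student scored 210 last year and 232 this year.
print(academic_growth(232, 210))  # prints 22
```

A negative result would mean the student scored lower than their baseline, which the model would count against the teacher regardless of outside circumstances.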

Yet, the U.S. Department of Education has estimated that VAM will be wrong 25% of the time.

In a report for the Annenberg Institute for School Reform, researchers point out that "because value-added is statistically estimated, it is subject to uncertainty, or a 'margin of error.'" Teacher ratings can vary wildly from year to year based on the student population, their prior test scores, school environment and conditions, and many other factors.
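The "margin of error" point can be made concrete with a toy calculation. Assume, purely for illustration, that a teacher's VAM score is the average growth of their students, with an uncertainty given by the standard error of that average; the growth numbers below are invented.

```python
import math
import statistics

# Hypothetical growth scores for one teacher's class of ten students.
growth = [5, -2, 8, 3, 0, 6, -1, 4, 7, 2]

mean = statistics.mean(growth)
# Standard error of the mean: sample std dev / sqrt(n).
se = statistics.stdev(growth) / math.sqrt(len(growth))

# Roughly 95% confidence interval (±1.96 standard errors).
low, high = mean - 1.96 * se, mean + 1.96 * se
print(f"estimated effect: {mean:.1f}, 95% CI: ({low:.1f}, {high:.1f})")
# If this interval straddles the district's cut score, the rating is
# statistically inconclusive -- the situation the DOE guidance describes.
```

With only a classroom-sized sample, the interval is wide, which is exactly why single-year VAM scores can swing from one rating category to another.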

Recently, the Ohio Department of Education began encouraging its schools to dial back the emphasis on value-added scores. Instead, schools are urged to use other ways of incorporating student performance into teacher evaluations. District leaders want less than 50 percent of a teacher's evaluation to be based on a single value-added score. The Ohio Department of Education recommends starting at 10 percent of a teacher's evaluation and working VAM upward over time. Administrators say it is difficult to fully measure a teacher's impact.
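Ohio's recommendation amounts to a weighted composite in which VAM starts small. A sketch of that arithmetic follows; the component names, scores, and the non-VAM weights are all hypothetical, chosen only to show how a 10 percent VAM weight limits its influence.

```python
# Hypothetical weighted evaluation composite, following the idea of
# starting VAM at a 10% weight. All components and scores (0-100 scale)
# are invented for illustration.

weights = {
    "vam_score": 0.10,                     # value-added kept small at first
    "observations": 0.50,                  # classroom observations
    "student_learning_objectives": 0.25,
    "student_surveys": 0.15,
}
scores = {
    "vam_score": 48,
    "observations": 85,
    "student_learning_objectives": 78,
    "student_surveys": 90,
}

# Weights must account for 100% of the evaluation.
assert abs(sum(weights.values()) - 1.0) < 1e-9

composite = sum(weights[k] * scores[k] for k in weights)
print(composite)  # prints 80.3
```

Here a weak VAM score of 48 moves the composite only a few points, whereas at a 50 percent weight the same score would drag the whole evaluation down.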

Why? Because standardized tests aren’t always the most effective assessment of student skill and learning. Using standardized tests as a measurement tool doesn’t take into account the learning and growth that have been made in areas outside the scope of the test, not only academically, but socially and behaviorally as well. Often, teachers and ESPs have a huge impact on students in these areas. Sometimes that impact can make a big difference in the classroom.


Value-added could be a promising tool, but it must be further refined and deployed with extreme caution. There's too much missing from the equation. It cannot assess what really inspired a student to achieve. It cannot assess the lesson plans or learning strategies the teacher employed to make the difference for their students.

Value-Added Measurement will not evaluate:

  • A teacher's experience in the classroom (current books to share, kids doing homework, and being in the school building with peers);
  • The grade level, subject or ability of the students the teacher previously taught;
  • The remedial work the teacher must perform to prepare students for the curriculum. War and Peace may be on the reading list, but if your 6th graders are reading at a 3rd-grade level, it may be difficult to achieve the appropriate outcome;
  • Whether the teacher who taught the student helped them improve their writing or reading skills;
  • The type of classroom environment created, where accelerated students work with students who need help and vice versa;
  • Which students ate breakfast and had a healthy meal the night before the test, or which students slept at home and rested well;
  • How the teacher responded when they received a call in the middle of the night from a student who got kicked out of his home and had nowhere to go;
  • The number of students who get extra help from the teacher or request letters of recommendation because they feel they did their best work in that teacher's class;
  • How a teacher would teach if they didn't have to teach to the test;
  • The quality of the lesson plan and the rigorous work (not on the test) that got their students really engaged and interested in the topic;
  • How well subject or grade-level teachers work together, coach one another and collaborate on site-related concerns;
  • Or all of the material the student knows that simply isn't on the test.

 

FEA has been a strong supporter of high standards for teaching and learning and is pleased with the reports of thoughtful discussions about instructional practice occurring between teachers and their administrators. But we have continually raised concerns about the inclusion of Value Added Model (VAM) scores as a significant part of every teacher’s evaluation and career prospects.

We all need evaluations, teachers included, because they provide a guide for setting personal achievement goals. But a proper evaluation includes a broad mix of assessment tools and measurements, including but not limited to: observations of classroom practice, portfolios of teachers' work, student learning objectives, test scores, homework, and surveys of students. If we are serious about education reform, quality teaching and advancing student achievement, then we need to mix it up and add more resources to the education reform toolkit.


Related Articles:

 

  1. How to Measure Teacher Effectiveness:
    http://www.rand.org/education/projects/measuring-teacher-effectiveness/multiple-choices.html
  2. Value-added Teacher Evaluation Goes on Trial — literally:
    http://www.washingtonpost.com/blogs/answer-sheet/post/value-added-teacher-evaluation-goes-on-trial--literally/2012/01/31/gIQAntC5iQ_blog.html
  3. Do Different Value-Added Models Tell Us the Same Things?:
    http://www.carnegieknowledgenetwork.org/briefs/value-added/different-growth-models/
  4. Problems with the Use of Student Test Scores to Evaluate Teachers:
    http://www.epi.org/publication/bp278/
  5. The Value Added Model:
    http://feaweb.org/value-added-model-or-measurement
  6. Not by “Value-Added” Alone:
    http://www.hepg.org/blog/39
  7. Princeton Study Takes Aim at 'Value-Added' Measure:
    http://blogs.edweek.org/edweek/inside-school-research/2009/06/princeton_study_takes_aim_at_v.html
  8. A Report from the National Academies of Science points out VAM scores have yet to be scientifically validated:
    www7.nationalacademies.org/bota/VAM_Measurement_Issues_Kolen.pdf
  9. Using Value-Added Measures to Evaluate Teachers:
    http://www.ascd.org/publications/educational_leadership/may10/vol67/num08/Using_Value-Added_Measures_to_Evaluate_Teachers.aspx
  10. Value Added and Other Measures:
    http://www.carnegieknowledgenetwork.org/briefs/value-added/value-added-other-measures/

 

 
