
Wasted Money & Time

 

The Brookings Institution estimates that the state of Florida spent $54.3 million per year on testing in 2012 and 2013, at a rate of $41 per student. That figure does not include the cost of lost instruction time.

 

A national study conducted by the American Federation of Teachers (AFT) found that "the time students spend taking tests ranged from 20 to 50 hours per year in heavily tested grades. In addition, students can spend 60 to more than 110 hours per year in test prep in high-stakes testing grades. Including the cost of lost instructional time, the estimated annual testing cost ranged from $700 to more than $1,000 per pupil." According to a recent New York Times article, Florida has distinguished itself by spending an average of 60-80 days of the 180-day school year on testing and test prep. Since that is well above the 110-plus hours per year estimated in the AFT study, one can assume that, including lost instructional time, Florida spends at the top end of the estimated range of $700 to $1,000 per pupil, and possibly more. In 2011, there were 2,668,156 students in Florida's K-12 public schools. Applying the AFT figures to that enrollment yields an estimated $2.6 billion per year, including the cost of lost instructional and prep time.
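As a rough check on that figure, here is the back-of-the-envelope arithmetic, a sketch only, using the AFT per-pupil range and the 2011 enrollment quoted above:

```python
# Back-of-the-envelope estimate of Florida's annual testing cost,
# using the AFT per-pupil range and the 2011 enrollment cited above.
students = 2_668_156           # Florida K-12 public school enrollment, 2011
cost_per_pupil = (700, 1_000)  # AFT estimate: annual testing cost per pupil ($)

low, high = (students * c for c in cost_per_pupil)
print(f"Estimated annual testing cost: ${low / 1e9:.1f} to ${high / 1e9:.1f} billion")
# -> Estimated annual testing cost: $1.9 to $2.7 billion
```

Because Florida's testing and test-prep time sits at the top of the AFT range, the high end of that estimate, roughly $2.6-2.7 billion, is the figure used here.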

 

That is a ridiculous amount of time and money to spend on assessment. Put another way, roughly 12.5% of the state's $20.7 billion education budget for 2015 will be spent on assessment rather than education.
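The budget share follows directly from the numbers above; this is just the implied arithmetic, using the $2.6 billion estimate and the $20.7 billion budget figure (the slight difference from 12.5% is rounding):

```python
# Share of Florida's 2015 education budget implied by the estimate above.
testing_cost = 2.6e9   # estimated annual testing cost ($)
budget = 20.7e9        # Florida's 2015 education budget ($)

print(f"Testing's share of the budget: {testing_cost / budget:.1%}")
# -> Testing's share of the budget: 12.6%
```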

 

Included in that estimate is the $220 million, six-year contract the Florida DOE awarded to the American Institutes for Research (AIR) to develop our Florida Standards Assessment (FSA) exams. On top of that, Florida has paid the state of Utah $5.4 million for the use of its test in 2014-15. Simply put, many people stand to receive a lot of money from Florida's testing spending.

 

But what benefit have all this spending and testing given us? According to ProCon.org, "[The use of standardized tests] skyrocketed after 2002's No Child Left Behind Act (NCLB) mandated annual testing in all 50 states. US students slipped from 18th in the world in math in 2000 to 31st place in 2009, with a similar decline in science and no change in reading. Failures in the education system have been blamed on rising poverty levels, teacher quality, tenure policies, and increasingly on the pervasive use of standardized tests."

 

So, all of the money and resources spent on testing are supposed to increase accountability among schools and teachers, which is the argued path to better education. That argument rests on several assumptions:

  • It assumes that the test results accurately measure what a child knows. The tricks and traps designed into the tests belie that notion.

  • It further assumes that children will fail only if they don't meet the core standards, despite the fact that the testing corporations can tell you beforehand, through statistical and probability models, how many students will fail. (See psychometric data analysis.)

  • It further assumes that this kind of motivation can be translated into classroom practice that educates children and incentivizes teachers. Typically, the incentive would be some sort of merit-pay system, but that is not offered in Florida, and studies indicate that merit-pay systems do not work. Research conducted at Vanderbilt University found that teachers offered bonuses for improving student test results achieved no more improvement than the control group (Link). Similarly, Texas's TEEG study found that "the attitudes and behaviors of school personnel, school environment, and teacher turnover were certainly affected by these factors [merit-based incentives]. However, evidence suggests that there is no strong, systematic treatment effect of TEEG on student achievement gains. Nor are there consistent associations between TEEG plan design features and student achievement gains" (Link). In other words, teachers are not able to raise test performance to earn the carrot of merit pay, and research shows they are likewise unable to raise it to avoid the stick of job termination (Link). Teacher performance, then, does not correlate with test performance; the strongest predictor of test performance continues to be socioeconomic status.
