In a recent front-page, above-the-fold article, the Daytona Beach News-Journal reported that students are skipping the writing portion of the Florida Standards Assessment (FSA) at three times the rate of the other two portions. The percentage of students who skipped the writing portion of the FSA contributed to four Volusia County high schools receiving “Incomplete” grades.
Why? Why would so many students completely skip the writing portion of the FSA? Volusia County School Board member Linda Cuthbert, a former high school English teacher, gives a reason that has solid evidentiary support. It was refreshing to read.
But first, a little background. The Common Core standards were written in 2009 by a group of D.C.-based organizations (the National Governors Association, the Council of Chief State School Officers, and Achieve). They were released in 2010 and aggressively promoted with the goal of creating nationwide education standards. States were encouraged to adopt Common Core and to measure compliance using the Partnership for Assessment of Readiness for College and Careers (PARCC) test. Most states adopted Common Core. Florida did not. Well, at least not in name. Florida developed its own standards, called (drum roll) the Florida Standards, which are modeled on the Common Core standards but with some changes. In order to measure compliance with the Florida Standards, the state licensed a test called SAGE from Utah and rebranded it the FSA. The FSA replaced the Florida Comprehensive Assessment Test (FCAT) and was used for the first time in 2014.
The FSA has three portions – writing, reading, and mathematics. So why are students skipping the writing portion? Here is Cuthbert:
Students are not allowed to use any creative thinking of their own. Therefore, students can find them uninteresting or completely irrelevant to their lives, causing them to have no desire to answer this portion of the test.
BINGO! Students skip the test because it is “uninteresting”, “completely irrelevant”, and because they have “no desire” to do it. Judging from what we’ve seen of these tests, this sentiment is understandable. Although this may seem like common sense, or a case of “too bad for those kids”, it’s actually a critically important issue considering how much depends upon the assumption that the results of these tests are valid.
Perhaps Cuthbert is familiar with the work of education researcher Graham Nuthall, who wrote, at the culmination of his 40-year career, that test results are “primarily the result of the students’ motivations and cultural background, and only secondarily about what the student knows or can do.” In other words, Nuthall’s research confirmed Cuthbert’s intuition.
Nuthall’s research reveals that tests have little personal meaning for a significant number of students. He found that the most important factor in testing wasn’t ability; it was student motivation and whether or not the student shared the tester’s values about the test.
Many students lack motivation to take the test precisely because they do not see the value or relevance of the test in the first place. Therefore, these tests measure student motivation, and the degree to which the student sees value in the test, more than they measure student ability. In other words, the tests we rely on so heavily produce invalid results.
Let that sink in. These tests are invalid because they measure motivation and values, not what a student knows or can do. The evidence keeps piling up that it is time for a paradigm shift in education. Not new standards or different tests but a fundamentally different approach to education. Until that happens we can expect to see more and more young people decide that the whole alienating process is completely irrelevant to their lives. And that’s a test we do not have to face if we rethink education today.