Although I enjoyed many of the readings throughout this unit, I was drawn to two in particular: George Hillocks Jr.’s “How State Assessments Lead to Vacuous Thinking and Writing” and Tim McGee’s chapter “Taking a Spin on the Intelligent Essay Assessor.” What stood out to me in these readings is that they go well beyond merely demonstrating the negative effects of standardized tests and/or AES systems on student writing; they take it a step further and explore how these assessments actually influence students’ thinking (although this is more implicit in McGee’s piece).
In my Teaching College English class at UNC Charlotte, I remember Dr. Tony Scott asking us early in the semester why we liked writing–I gave what many thought was an intriguing answer. I mentioned how I was fascinated with writing’s connection to thought: how it helps us explore our ideas, how through it we learn to communicate our thoughts in a comprehensible fashion, and so on. I concluded with an assertion I still heartily believe in (albeit with exceptions): not all great thinkers are great writers, but all great writers are great thinkers. If you think about it through a certain lens, it makes complete sense. We oftentimes want to break writing down into standard components such as organization, syntax, tone, etc. However, usually when we think something is poorly written, we also object to the line(s) of thought contained within.
That being said, I had never really cultivated in my mind any direct connection between standardized testing and the influence it has on students’ actual thought processes. Yet these two pieces definitely demonstrate the perverse influence these kinds of assessments and/or software have on students’ critical thinking skills. McGee details how the Intelligent Essay Assessor, which is supposed to read for content, is actually anything but intelligent. Even though he went in with some hopes, they were soon dashed by the machine’s inability to actually read for content. He inverted the order of one essay so that it made no logical sense, and he provided factually incorrect information in a history essay while including some keywords. Each time, the computer provided a similar score–in one instance, the score was the same because a one-point reduction in content was balanced by a one-point increase in mechanics for the gibberish!
Hillocks Jr.’s research was even more alarming, however. Perhaps this was because he used the example essays that had served for grade norming and as student models. One exemplary essay provided sparse (if any) support for its claims; sure, it was developed, but the logic of the argument really didn’t hold. The example of the passing exam in Texas directly contradicted itself (sure, Hillocks Jr. takes a slight stab at Bush, but–c’mon–there was a gold mine of humor there he should have taken advantage of!). Think about that–you can directly contradict yourself in an essay and be considered a proficient writer in the state of Texas.
Overall, I started to contemplate how standardized testing and AES may have an even more drastic influence on student agency than I had ever considered. Not only is the writing regimented, but the actual manner of critical thinking is stunted as well. Reading Hillocks Jr.’s article came at the right time for me–my last batch of articles for my class had been less than stellar. After reading his article, I quickly realized that much of what he was pointing out matched the same issues my students were having: an abundance of facts that really didn’t serve as evidence, just padding; claims that directly contradicted previous assertions; not really thinking through an issue but just babbling. At first, I was frustrated by my students’ articles, yet after reading Hillocks Jr.’s piece, I started to realize–this is what they were taught to do. In all honesty, I did make some pedagogical mistakes that I believe also influenced the poor results; however, everything he discussed in his article appeared in my students’ writing.
This demonstrated, for me at least, that the influence of poorly designed standardized writing tests extends far beyond the students’ writing–it corrupts their thinking as well! While this is disheartening, I did see some light at the end of the tunnel.
Primarily, I saw an opportunity for writing assessment scholars to gain support from other disciplines (The Fellowship of Non-Vacuous Thinking, so to speak). If we could demonstrate how these learned behaviors on standardized writing tests are “transferring” over to classes outside of English and composition, we could make an excellent case for a unified front. Surely scholars in other fields care about student writing; but if we could also demonstrate to them how these tests stunt students’ critical thinking skills, we might gain more traction. It might be wishful thinking, but I like to believe it is possible.