An article in “The New York Times” points out some interesting things about SAT Writing scoring. Did you know that computers can score essays on standardized tests like the SAT as well as people can? Does that sound ridiculous to you? It may sound ridiculous, but that doesn’t mean it isn’t true. In fact, according to “The New York Times” article on standardized test scoring, “Computer scoring produced ‘virtually identical levels of accuracy, with the software in some cases proving to be more reliable,’ according to a University of Akron news release.” Yes, you read that right: more reliable.
The e-Rater, the robo-reader developed by ETS, can grade 16,000 essays in 20 seconds. How incredible is that? Quite a bit faster than the average human, wouldn’t you say? What’s funny is that the e-Rater can’t measure whether the test-taker is writing the truth. For instance, if you write that the Civil War preceded the American Revolution, you’re not going to lose credit on the e-Rater’s scan. It seems the e-Rater is no Watson (IBM’s supercomputer that can answer questions like a human and even appeared on “Jeopardy”).
And what else? Well, the preceding sentence in this very paragraph would lose marks from the e-Rater, because the computer doesn’t like it when you start a sentence with “and.” Or “or.” The e-Rater also doesn’t like short sentences or fragments. What it does like are sentences that start with words like “however” or “moreover.” So, basically, the e-Rater isn’t very creative (as you could guess). While English teachers may tell students never to start a sentence with “and,” doing so can often be quite powerful and effective. But don’t do it on the SAT Writing section. And don’t start a sentence with “but.” Or “and.” Ahh.
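To see why a scorer like that rewards style over substance, here’s a toy sketch of surface-feature scoring. To be clear: this is not ETS’s actual algorithm, just a hypothetical illustration of how the kinds of heuristics described above (penalizing “and”/“or” openers and fragments, rewarding transition words) can score an essay without ever checking whether a word of it is true.

```python
import re

# Toy surface-feature essay scorer -- a hypothetical sketch, NOT the real
# e-Rater. It only looks at sentence style, never at factual content.

CONJUNCTION_STARTS = ("And", "Or", "But")   # openers the article says get penalized
TRANSITIONS = ("However", "Moreover")       # openers the article says get rewarded

def toy_score(essay: str) -> int:
    score = 0
    # Naive sentence split on end punctuation; real systems tokenize properly.
    sentences = [s.strip() for s in re.split(r"[.!?]+", essay) if s.strip()]
    for s in sentences:
        if s.startswith(CONJUNCTION_STARTS):
            score -= 1   # starting with "and"/"or"/"but" loses a point
        if s.startswith(TRANSITIONS):
            score += 1   # "however"/"moreover" openers gain a point
        if len(s.split()) < 4:
            score -= 1   # very short sentences and fragments lose a point
    return score

# A factually backwards claim still scores fine -- the scorer never checks truth.
print(toy_score("However, the Civil War preceded the American Revolution."))  # 1
print(toy_score("And short. Or this."))  # -4
```

Note that the first essay is historically wrong yet outscores the second, which is exactly the blind spot the article describes.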
While you’re here, check out this post on SAT Score Choice.