Arizona public school students took standardized tests back in April. In early September, the results were published.
The big takeaway is that, even though most students failed the tests, the proficiency rate increased slightly from the prior school year.
The real takeaway is that we know basically nothing from these results. The state used a brand-new test this past year, so the slight increase might simply mean the new test is 3% easier. In prior years Arizona used AZMERIT; the newer exam for grades 3-8 is the Arizona Academic Standards Assessment, or AASA.
I don’t think the general public understands how goofy these tests can be.
Here is the first question of a sample test for 8th grade English, listed on the Arizona Department of Education website.
When a person reads text in the real world, the cognitive skill of identifying how an individual phrase affects the meaning of the passage is irrelevant. When reading for information, the important skill is to understand the meaning of the text and be able to respond to it usefully. It’s sometimes important to consider how a specific part of a text influences the whole, but this question is clumsy and confusing.
While “D” might be the correct answer, it is not clearly the best answer. Answer “B” means nearly the same thing. In the text, Robotina the robot is described as being able to perform “many types of work.” The two phrases cited in the question are plausibly being used to provide examples of the robot’s versatility and intelligence.
Answer “A” is also a good answer given that the entire text is about how robots are interacting with humans. In fact, the full sentence of paragraph 6 reads:
Part of the reason for this seems to be that people not only trust Robotina's impeccable ability to crunch numbers, they also believe the robot trusts and understands them.
In this context, you could argue that the author is emphasizing the interaction more so than the raw performance of the robot in terms of the “meaning of the passage.”
Answer “C” isn’t obviously wrong, either.
So it’s entirely possible that a student could read the text, understand its meaning, and be able to use it in any real-world setting, yet still miss this particular question. The following questions are similarly awkward, asking students to identify the “central idea” of the text and how certain paragraphs “contribute to the passage as a whole.”
How does a teacher train students to answer these questions correctly?
I have no idea.
What happens, probably, in schools where students struggle with these tests, is that teachers spend a lot of time drilling students with multiple-choice questions about the central idea of random texts. Which means less time reading and absorbing more substantive (and interesting) texts that would improve vocabulary.
Matt Yglesias recently wrote a piece on his Substack saying that schools should try to teach kids the basics.
I agree with him that schools should stick to the basics rather than try to instill the “correct” interpretations of current events. However, Yglesias seems to distinguish between teaching the “core skill” of reading and the optional inclusion of “works about history,” which might be used to increase student engagement. He wishes to emphasize “learning to read history books” rather than stressing about which titles should be taught.
I don’t think you can learn “core skills” in isolation from a healthy diet of substantive reading. And if you’re going to provide a healthy diet of reading, you’re going to have to decide what texts to read, which can be tricky in these divisive times.
Arizona’s English standards seek to describe every kind of reading skill. The result is a discombobulated test that is damaging to the goal of effective teaching and learning.
Once you get the basics of reading down, the next goal should be to read a variety of topics. There was an interesting study done in the 1980s about reading comprehension. One group included strong readers with low knowledge of baseball. A second group included weak readers with high knowledge of baseball. Both groups read a passage describing an inning of baseball. Who scored better on the reading comprehension test? The weak readers who knew baseball.
This is a very simple study. It doesn’t solve all of our problems. But it highlights the absurdity in trying to teach the “skill” of micro-analyzing text without spending time enriching students with knowledge about the world around them.
If the ESA referendum gets on the ballot, we’re going to spend the next two years arguing about school vouchers. If it doesn’t get on the ballot and universal vouchers become state policy, we’re going to spend the next two years debating the effects of universal school vouchers.
While this is happening, don’t forget about public school teachers across the state of Arizona who are spending countless hours, in classrooms and in staff meetings, trying to train students to pass these goofy multiple-choice tests.
Review Questions
What is the central idea of “Proficiency and its discontents”?
Identify two supporting details of the central idea of this text.
How does the paragraph “I have no idea” affect the meaning of the passage as a whole?
If you follow the link and read the text, none of the responses really make sense. This is a disservice to students and teachers!