
Readability Level of Standardized Test Items & Student Performance

by Anthony Owens

When standardized test items are difficult to comprehend, students may miss them even when they have mastered the content being assessed. The impact of test-item readability on test performance is a complex issue because readability itself is complex. Applying a basic understanding of readability to common features of test items helps clarify some of the ways readability affects performance.

Readability Problems Facing Test Takers

Readability is an interactive process between student and text, says reading expert John J. Pikulski of the University of Delaware. With test items in particular, student-centered problems include the inability to apply reading comprehension strategies and a lack of interest in the topics the items address. Text-centered problems include unclear wording and test item bias. Generally speaking, the higher the readability level of a test item, the harder it is for students to understand and hence to answer correctly.

Reading Comprehension Skills and Readability

Many students lack the ability to apply reading comprehension strategies, such as using context clues to define unknown words encountered in test items. These students miss items they might otherwise answer correctly. Of particular concern are undefined words in test item stems, words on which students are not even being tested. Such words may significantly increase the text load on students, according to a 2007 study by the National Center on Educational Outcomes (NCEO).

Reader Interest and Readability

The more interested students are in the topic of a text, the lower that text's effective readability level is for them. Unfortunately, standardized test items often address topics arguably of little interest to many students. For example, one Florida state reading test contained items addressing topics such as "glacial erosion theory, hiking narration, fishing, mammal biology, and even a cell phone user manual," according to a 2012 white paper by the Central Florida School Board Coalition (CFSBC) posted on Valerie Strauss's Washington Post education blog.

Clarity of Wording and Readability

Unclear phrasing raises the readability level of test items, says the NCEO. Unless an item is testing vocabulary skills, it should use the most basic and familiar wording possible. According to the NCEO, "The author's conclusion was a somber reminder of what?" might be rephrased as "The ending of the book was a sad reminder of what?" The use of "not" and "except" within test items also raises the readability level because negatives unnecessarily increase the cognitive demand on test takers.
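
The effect of such rewording can be made concrete with a readability formula. The Python sketch below scores the NCEO's two phrasings with the Flesch-Kincaid grade-level formula, one widely used readability measure; it is offered only as an illustration, not as the measure the NCEO itself applied, and its vowel-group syllable counter is a rough heuristic rather than a dictionary-accurate count.

    import re

    def count_syllables(word):
        """Approximate syllables as vowel groups; a rough, common heuristic."""
        count = len(re.findall(r"[aeiouy]+", word.lower()))
        if word.lower().endswith("e") and count > 1:
            count -= 1  # a trailing silent 'e' usually adds no syllable
        return max(count, 1)

    def flesch_kincaid_grade(text):
        """Flesch-Kincaid grade = 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59."""
        sentences = max(len(re.findall(r"[.!?]+", text)), 1)
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

    original = "The author's conclusion was a somber reminder of what?"
    revised = "The ending of the book was a sad reminder of what?"
    print(f"original: grade {flesch_kincaid_grade(original):.1f}")  # about 7.6
    print(f"revised:  grade {flesch_kincaid_grade(revised):.1f}")   # about 3.7

Under this formula, the revised wording scores roughly four grade levels lower than the original, which is exactly the kind of drop in readability level the NCEO's guidance aims for.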

Test Item Bias and Readability

Test item bias also raises readability levels. Biased items are insensitive to geographic and socioeconomic differences among test takers, says the CFSBC. For example, the CFSBC reported finding an item on a Florida state test about the echidna, an egg-laying mammal indigenous to Australia. A student living in New York would likely have no prior knowledge of this animal, nor would many be likely to take an interest in it. That lack of familiarity and interest raises the item's effective readability level.

About the Author

Anthony Owens taught language arts at secondary institutions from 1999 to 2004 and has been teaching writing and literature at post-secondary institutions since 2010. He holds a Master of Education in instruction and learning from the University of Pittsburgh and a Master of Fine Arts in literature and creative writing from Bennington College in Vermont.
