Colloquial Language Is More Familiar, but Can Be More Easily Misinterpreted When Reporting Test Results
With the proliferation of large-scale state summative assessments and published results, one could say that testing has gone mainstream.
Mass distribution of testing and accountability results to non-technical audiences through, for example, websites like Niche, which is dedicated to providing information on schools to parents, has rapidly changed the manner in which test results are communicated. Out are technical terms like normal curve equivalent; in are more mainstream terms like career- and college-ready. In a sense, large-scale summative assessments today must aspire to be more than just reliable and valid; they must also be useful, particularly to lay audiences.
The Benefits and Downfalls of Removing the Technical from Testing
The move toward less technical terms that are more familiar to the intended users of test results is laudable. Removing technical communication barriers is essential if parents are expected to interpret and use results of state summative assessments. However, the introduction of lay terminology comes with its own set of problems.
Familiar Language Isn’t Always Accurate
When more colloquial terminology is used, it may be familiar, but it can also be ill-defined. We recognize that technical terms make communicating with non-technical audiences more difficult because they often have to be explained and are unlikely ever to be fully understood by the vast majority of such users.
Non-technical terms have the (assumed) advantage of not having to be explained. An implicit underlying assumption is that a non-technical term means the same thing to all users. This assumption has been shown to be false. In a study that examined how people interpret probabilistic words, researchers found wide variability in how lay audiences interpreted the probability of events labeled as, for example, “usually” or “often” occurring. Clearly, non-technical language has a set of problems all its own.
A case in point is the term “on track,” which we’ve seen used hundreds of times over the last decade and which appeared in a recent article in The 74 entitled “90% of Parents Think Their Kids Are on Track in Math & Reading. The Real Number? Just 1 in 3, Survey Shows.” Digging into the article, details are provided to substantiate the title:
“‘The education community continues to use a language that parents don’t speak,’ said Bibb Hubbard, Learning Heroes founder and president. This communication gap creates a significant disconnect in how parents think their children are doing in school versus reality. In its second national survey, Learning Heroes found that 9 in 10 parents think their children are performing at or above grade level in math and reading — but results from the National Assessment of Educational Progress, known as the Nation’s Report Card, shows that only 1 in 3 U.S. eighth-graders are proficient in math and reading.”
The term “on track” in the title is implicitly defined in the article to mean proficient in math and reading on NAEP. The careful reader will certainly find much more to quibble with in the paragraph quoted above; for our purposes here, however, we wish to dig into just what the term “on track” means.
The term “on track” is probably considered non-technical and is familiar to most adults. For example, on a road trip to California one could ask, “Are we on track to get to San Francisco by Thursday?” Or, during the recent bull market in stocks, headlines often declare, “Companies are on track for record earnings.” In each case, “on track” refers to progress toward a destination. In neither case does “on track” mean that the destination has been reached. That is, just because one is “on track” to get to San Francisco by Thursday doesn’t mean one can quit driving. Being “on track” presumes continuing progress toward the destination, either at some predetermined rate or at some “typical” rate of progress.
Most states using the term “on track” in their state summative assessment results use it in statements like “on track to career and college readiness.” In doing so, they substitute a student’s current status (as in the article mentioned above) for a scenario in which sufficient future progress is an integral part of reaching the destination toward which the student is “on track.” As a field, however, we rarely, if ever, investigate whether typical progress from different points along the status continuum leads to career and college readiness later in students’ lives. Yet this type of due diligence seems necessary to support the use of a term like “on track” in this context.
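To make the status-versus-trajectory distinction concrete, here is a minimal sketch of what an “on track” determination could look like if it actually incorporated growth. Every number, cut score, and growth rate below is a hypothetical illustration, not any state’s actual model or any real assessment scale.

```python
# Toy "on track" projection: all scores, cut scores, and growth rates
# below are hypothetical illustrations, not any state's actual model.

def is_on_track(current_score: float,
                current_grade: int,
                target_score: float,
                target_grade: int,
                typical_annual_growth: float) -> bool:
    """Project the current score forward at a typical annual growth
    rate and ask whether the projection reaches the target score by
    the target grade."""
    years_remaining = target_grade - current_grade
    projected = current_score + typical_annual_growth * years_remaining
    return projected >= target_score

# A student below a hypothetical grade-8 "proficient" status cut can
# still be on track if typical growth closes the gap by grade 12:
print(is_on_track(640, 8, 700, 12, 20))   # projects 720 >= 700 -> True

# Conversely, a student closer to the cut but with little time left
# may not be on track, despite a higher current status:
print(is_on_track(660, 11, 700, 12, 20))  # projects 680 < 700 -> False
```

The point of the sketch is simply that an “on track” judgment depends jointly on current status, time remaining, and an assumed rate of progress; reporting status alone, as in the article, answers a different question.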
We applaud the push to better communicate state summative assessment results to lay audiences. There are numerous insights based upon the attainment and growth of students that can be understood by non-technical audiences like parents. Doing so, however, requires much greater care than communicating results in technical terms does.
Particular care must be taken to make sure the use of the terms coincides with what audiences already understand the terms to mean. Absent such care, we’re just replacing techno-jargon with techno-babble.