Data vs. opinions: The New Media Consortium Horizon K-12 Technology Report

This is the second post looking at recently released reports related to K-12 technology. The first one reviewed Education Week’s Technology Counts 2015. “Without data, you’re just another person with an opinion” is a common aphorism. Not that there’s anything wrong with opinions. The problems arise when ideas are presented in a way that doesn’t explain that they are opinions with only limited grounding in facts.

That observation relates to the NMC Horizon Report: 2015 K-12 Edition, an annual report in which opinions and data are often conflated. The Horizon report findings are mostly based on asking a large panel of experts what they believe are the emerging technologies in K-12 education along different time horizons (one year or less, 2-3 years, and 4-5 years). There is nothing inherently wrong with this approach, and the result is a worthwhile thought experiment. The problem is that the large majority of media reports would lead the casual reader to think that the trends identified are based on data, because most reports don’t mention how the Horizon report determines its trends. There is a critical difference between reporting that “a group of experts think these trends might happen” and describing trends as if they are based on numbers, leaving out any mention of the methodology and thereby suggesting that the findings are on far more solid ground than they really are.

The results of this method, and the way the report is widely cited, are problematic because some of the report’s predictions can lead to unrealistic expectations. Educators and policymakers who read that game-based learning was three years away in 2012 might wonder why their states or schools are so far behind, given that it’s now three years later and they aren’t operating classrooms full of students happily playing games to learn math and reading. (The link in the preceding sentence is particularly useful for reviewing some previous Horizon predictions, which will allow you to assess whether those predictions have been accurate.) Readers who see headlines suggesting that makerspaces are the next big thing in 2015 might wonder why they see no sign of makerspaces being created in their neighborhood school.

Despite these concerns, this year’s Horizon report is valuable, particularly in its analysis of issues related to technology in education—as opposed to the predictions around emerging technologies that are far more widely reported. Compared to previous years, the report devotes far more pages to analyzing key issues and relatively fewer to its technology predictions. Its explorations of trends and challenges make up more than half of the report, and the way it categorizes challenges as “solvable,” “difficult,” or “wicked” is particularly useful. For example, it explores the challenge of “Rethinking the Roles of Teachers” (page 28), which it characterizes as one that “we understand but for which solutions are elusive.” Along these lines, “Scaling Teaching Innovations” (page 30) is a wicked challenge that is “complex to even define, much less address.” Some of these observations echo points made in Education Week’s Technology Counts issue that we discussed in a previous post.

These insights are discussed at length, and they are where the report’s real value lies: in its nuanced discussion of issues, not in the misleading “time to adoption” trends that most news articles focus on. If you want to get the value from this report, you have to read it yourself rather than rely on what most reviews are saying about it.