Access, activity, and outcomes in digital learning
As we near completion of Keeping Pace 2014, we’ve been thinking quite a bit about the different lenses through which to view the digital learning landscape. One word that best describes the landscape is uneven. Three key elements of digital learning that may be described as uneven are access, activity, and outcomes. Not only are those elements inconsistent across states, but the level of information we have about them is itself uneven.
Student access to digital learning varies based on the state and local school district in which each student resides. Students in Florida have access to a wide range of supplemental online courses and fully online schools. Students in other states with well-supported state virtual schools (e.g., North Carolina, Idaho, Alabama), or with robust course choice programs, have access to online courses. Students in any of the 30 states that allow statewide online schools have access to those schools. Students who don’t live in those states, but are fortunate to live in a school district that offers online courses (as many do) or an online school (fewer), have access to those opportunities as well. For students interested in using digital content and tools from a physical classroom, whether they can do so depends on their school, or on whether they have access to an alternative education program, another school in their district, or perhaps a charter school they can move to. In short, student access to digital learning varies by ZIP code, and this creates uneven opportunities.
If access is the first lens, then activity is a logical next step. Fully online schools and supplemental online course providers often report (or the state reports) the number of students who attend the school at any time in the year, or who start an online course. These numbers give us a sense of the level of activity, but they don’t account for the different rates at which students leave online schools or courses without completing them. States that report the number of online course completions (as opposed to the number of student enrollments) provide a better measure of activity, particularly if they report course completion rates as well. States and schools that report student mobility rates in online schools paint a more accurate picture of how many students attended the online school for the full year and either graduated or progressed to the next grade level. For digital content and tools used in physical classrooms, levels of activity are difficult to obtain and rarely available publicly.
The final area that we are watching in Keeping Pace is student outcomes. Determining outcomes is a major undertaking, whether based on a longitudinal research study of a small number of schools or on data mining across a state. The synopsis of outcomes studies is, as it has been for years, that some implementations of online schools, online courses, and digital content show success, and others don’t. But for a large portion of digital learning, outcomes are unknown or not publicly available.
Student access to digital learning has been, should be, and will continue to be a major area of research and reporting. Activity and outcomes are the next two areas that deserve equal attention.