Summary data on K-12 online and blended students are estimates
This is the time of year when we have completed the first round of data-gathering for Keeping Pace, and we’re well into analyzing the numbers of online and blended schools, and the numbers of students taking online and blended courses. It is also the time of year when we are reminded that we really don’t know exact numbers, and that even data sets we consider good have fairly large margins of error.
I’ll use three examples to illustrate:
For one profile, of a state that we had believed had good enrollment information, we went back and forth at length trying to get the state’s enrollment numbers for online schools and courses in SY 2012-13. The department of education finally sent us what we expected to be an accurate spreadsheet with quite a bit of information about students, courses, enrollments, and course completions.
Then we realized that the numbers showed more online course completions than online course enrollments.
We are certainly advocates for the potential of online schools and courses. But even in our wildest optimism we don’t expect that students can complete online courses without starting them.
Of course we don’t think that’s what the state believes. But this example shows that most state data systems aren’t set up to track the numbers of students in online courses, blended courses, blended schools, and other emerging school types and modes of instruction. When someone at an education agency tries to pull the data together (often going above and beyond, at the pleading of KP researchers), mistakes are sometimes made.
A second example: our California profile includes two sets of numbers for online students. One is self-reported data gathered by the department of education; the other comes from the California Learning Resources Network. The two are not wildly different from one another, but the difference is not trivial. Both show large increases (71% in one case) that we attribute partly to growth and partly to better data gathering. Unfortunately, we don’t have a good estimate of how much is real growth, which makes year-to-year comparisons difficult.
A third example: a state has given us a count of unique online students, with the caveat that the state has no confidence in the number. The reason is that it’s not clear the schools being queried understand the definition of “online” being used. This situation is rather common, because much of the data collected are self-reported by schools and programs.
We’re not complaining about the lack of good data. It is the current state of the field, much improved over just a few years ago, and in fact a significant part of the value that KP provides is in digging into and interpreting these numbers in the best way possible. The situation is getting better, and many smart and dedicated people are working to improve it.
But it is worth remembering that all of these numbers, whether from KP or other sources, come with non-trivial margins of error and caveats that are rarely included when the numbers are reported in the media. If anyone gives you a number they appear certain about, it’s worth digging into their methods. Reporting the best data that we have is important, but it should be accompanied by an appropriate dose of humility about what we don’t know.