Education Data: Slow Expectations

by Jeff Camp | June 18, 2019 | 1 Comment

Creeping Toward the Present

America began assembling national data about education over 150 years ago. The US National Center for Education Statistics (NCES) reckons its birth year as 1867, the same year an entrepreneur patented the pedal-driven bicycle.

In the old days, essential facts about schools had to be tallied by hand. Lacking standardized accounting systems, surveys were taken school by school, district by district and state by state. It took time to collect, write, deliver, compile, check, interpret, compose, edit, finalize, confirm, print and distribute the data. Delays and errors were inevitable. After all, reports were drafted by chalk and quill, typeset by hand and delivered by stagecoach, or perhaps occasionally by bicycle.

Obviously, we don't live in that world anymore... right?

Slow Expectations

Alas, fast computers are no match for slow expectations. Most businesses today operate in the present, but government organizations largely have yet to change their standards of timeliness. School districts still don't have consistent data systems. As a result, even basic statistics about the education system are still gathered as surveys, which still take years to be compiled and compared.

Fast computers are no match for slow expectations

EdWeek, a subscription journal, is America's most prominent national publication for education leaders, researchers and policymakers. Each year, it releases Quality Counts, a widely quoted report that evaluates and compares the strength of public education systems in each state. On June 4, 2019, announcing Quality Counts 2019, the editors promoted the report as based on "the latest federal data on K-12 spending nationally and how each state is doing in raising and distributing those funds."

The 2019 report relies on the latest NCES data, which is from… wait for it… 2015. It's a time capsule from four years ago. To be fair, EdWeek never claims that Quality Counts is fresh, timely, or even accurate — only that it is based on the latest nationally comparable data available.

It's kind of like awarding high school diplomas on the basis of freshman-year grades.

EdWeek isn't alone, of course. Education systems across the globe have slow expectations. At their freshest, international comparisons (such as PISA and TIMSS) rely on data even more out of date than NCES.

So What?

Speed and accuracy tend to reinforce one another.

What's the rush, some might protest? Schools change slowly. Does it matter whether counties, states and nations have timely data about their school systems? Isn't it better to get facts right than to get them quickly?

This is a false choice. In the age of big data and digital communication, speed and accuracy tend to reinforce one another. For example, Amazon's business model has been built on the capacity to track details accurately in huge volume. By comparison, the data demands of the education system are adorable.

Still, the question is worth a straight answer. When a system cellars data for a few years before uncorking it, what's the harm?

Among the many problems with navigating through the rear-view mirror, the most potent might be plain old confusion. Is education spending rising or falling? Are schools in California funded above the national average or below it? How different are class sizes in Los Angeles from those in Houston or New York? What about teacher salaries? Are pension costs displacing classroom spending in other states to the same degree as they are in California? How do California's investments in school-based health services compare to those in other states?

These are all important questions for policymakers and constituents, but the data necessary to answer them coherently arrive without urgency.

Confusion

Schools are funded by taxes, which are an expression of public trust. Confusion corrodes trust.

The stock market will not expand forever. When the slump comes, education funding will slump with it. The "latest" official figures will become genuinely confusing: the reported numbers, lagging years behind, will keep rising as if nothing had happened.

Slow expectations for education data undermine real conversations between constituents and their representatives. Most elected officials serve two-year terms, far shorter than the delay built into state data systems. This leaves candidates unable to campaign on hard evidence of results, especially at the state level.

Confusion corrodes trust.

California has lagged other states in education funding since the 1970s, when Proposition 13 and other initiatives flipped the system from relying on local political will to relying on statewide political will. In 2019, the California School Boards Association (CSBA) began a campaign for Full and Fair Funding, calling for the California legislature to raise school funding to the national average by 2020 and to the average of the top 10 states by 2025. Slow data is an Achilles heel for this campaign: it is impossible to know until years after the fact what level of funding would actually meet those benchmarks.

How to Speed Up

The good news is that data collection systems don't have to be so slow. It's a choice.

Speeding up will require widespread adoption of shared digital reporting standards — a tech echo of the adoption of the Common Core. The idea is not new. The US Department of Education's nascent Common Education Data Standards (CEDS) project sketches a rough schema for education data. If completed and used, a schema like this could serve as a digital Rosetta stone, making collection and interpretation of education data faster, less laborious, and more meaningful.
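As a sketch of what a shared schema buys you, consider a minimal record format in Python. The field names below are purely illustrative, not actual CEDS element names; the point is that once every district reports in one agreed shape, a comparison becomes a one-line computation instead of a multi-year survey reconciliation.

```python
from dataclasses import dataclass

# Hypothetical record format, loosely inspired by the idea of a shared
# schema like CEDS. The field names are made up for illustration.
@dataclass
class DistrictFinanceRecord:
    district_id: str
    fiscal_year: int
    enrollment: int
    total_spending_usd: float

    def per_pupil_spending(self) -> float:
        # Comparable across districts because every field means the
        # same thing everywhere.
        return self.total_spending_usd / self.enrollment

# Two districts reporting in the same format are instantly comparable.
a = DistrictFinanceRecord("CA-0001", 2019, 10_000, 120_000_000.0)
b = DistrictFinanceRecord("TX-0001", 2019, 8_000, 88_000_000.0)
```

The schema does the standardizing; the arithmetic is trivial once the inputs agree on names, units and meaning.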

Finding the political will to get it done will require a little money and some leadership. California's data systems for education are in dire need of improvement, as we explain in Ed100 Lesson 9.5, but Proposition 4 (passed in 1979) has made it very expensive for the state to set mandatory reporting standards.

Predicting a Shortcut

School districts are already in the prediction business

Even without a crisis, there could still be room for better, faster state-level data with a different approach to the problem. At present, state-level data about the conditions of public education are collected more or less survey-style. Like vote counts immediately following an election, the results are vulnerable to missing ballots or bad counts. The Department of Education has routinely omitted survey data until it is confident in the numbers (at which point the findings are thoroughly stale).

There's another approach: statistical prediction. School districts are already in the prediction business. They have to estimate future enrollment, for example, in order to prepare the right number of classrooms, hire teachers, and build schools where they are needed. They predict the future supply and demand for teachers using workforce data, housing data and tea leaves. Especially at the state level, education statistics tend to change quite gradually, usually driven more by economic and demographic forces than by policy choices. (The EdSource States in Motion project illustrates this slow change graphically.)
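To make "statistical prediction" concrete, here is a deliberately naive forecast: a least-squares trend line extrapolated one year ahead. The enrollment numbers are made up, and real district projections fold in births, housing starts and migration, but because state-level statistics change gradually, even mechanics this simple get surprisingly far.

```python
def forecast_next(values):
    """Fit a least-squares line to a series and extrapolate one step.

    A stand-in for the enrollment projections districts already make;
    real models use far richer inputs.
    """
    n = len(values)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(values) / n
    slope = (
        sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values))
        / sum((x - x_mean) ** 2 for x in xs)
    )
    intercept = y_mean - slope * x_mean
    return intercept + slope * n  # predicted value at the next step

# Five years of made-up district enrollment counts
enrollment = [10_000, 10_200, 10_400, 10_600, 10_800]
projected = forecast_next(enrollment)  # → 11000.0 for this linear series
```

A forecast like this could be published immediately and revised as actual counts arrive, rather than withheld for years until the counts are final.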

The End is Near

The conditions for change may be approaching. When the long bull market comes to an end, school funding will swoon.

In the Great Recession, federal funds helped fill the gap, using the crisis as an opportunity to nudge states into action on the long-discussed work of adopting updated content standards. In the next crisis, adoption of faster and better data standards might be a condition for federal aid. For example, states might be asked to develop and adopt a standards-based public data schema such as CEDS and commit to semi-annual delivery of data, including both forecasts and revisions.

Education is one of many critical factors that can affect kids' future. Health, housing, parenting, safety, relationships, transportation, discrimination and other factors weave together in ways that remain poorly documented and weakly coordinated. If the education system had a decent, timely data system, it would be easier to imagine coordination between school systems and other services for families and kids. Slow expectations don't fit with that picture.

Questions & Comments


norburypta June 26, 2019 at 1:39 pm
You have smushed national and state data delays. Better to deal with one or the other. Data of the right kind is most valuable at the site and District level. Other data, for the most part is useful at the state. National...well?

Jeff Camp July 6, 2019 at 5:41 pm
Actually I smushed international in there, too. I agree that local data is the most important most of the time, and the timeliness of that data has improved markedly in most districts. But for policy questions like "how much should we be spending" or "how big is a big class size" it makes sense for data to be comparable and timely. With reasonable data standards and expectations about timeliness this could be solved.