California's School Dashboard Has Broken Gauges

by Steve Rees | June 6, 2020 | 4 Comments

Vital Misinformation

In 2017, the California Department of Education introduced a mechanism to interpret schools’ and districts’ vital signs — the California School Dashboard. Unfortunately, the Dashboard is broken.

Guest Opinion
Steve Rees

Although the Dashboard starts with facts, it often interprets them using twisted logic and funky math. It often flags districts for underperformance they don't deserve, and all too often it fails to notice districts with important problems that deserve attention. This matters because flawed data can lead good people to make bad plans.

Dashboard Confessional

How do I know this? For over 20 years, I’ve led a company that’s helped over 240 California districts understand their schools’ vital signs and explain them to others in their school accountability report cards. As a veteran of the accountability wars, I know that the cost of tarnished reputations is serious collateral damage.

I’m not the only critic of this Dashboard. Some are scholars like Morgan Polikoff (USC Rossier School), who co-authored a review of the Dashboard which was published by PACE in September 2018 as part of the “Getting Down To Facts” research. (Click here to listen to his video summary of its key points.)

Another researcher, Paul Warren, who retired recently from the Public Policy Institute of California, criticized the Dashboard’s illogic in a report published June 2018. Warren’s comments have special oomph because he led the Assessment and Accountability Division of the California Department of Education (CDE) from 1999 through 2003. (Click here to read my blog post about this report.)

Other critics include the 125 districts and 9 county offices of education that have quietly demoted the Dashboard’s place in their plans, and instead turned to the CORE Data Collaborative for higher quality evidence. Our own client districts have also relied on stronger evidence we built for them, comparing their vital signs to those of districts whose students are very much like their own. One client, Morgan Hill USD, won an award for excellence from the California School Boards Association for a plan that effectively disregarded the official Dashboard’s diagnosis, relying instead on their own evidence base.

Combining Status and Change is a Logic Error

Two examples of the Dashboard’s half-dozen flaws are perhaps most troubling. As Ed100 explains in Lesson 9.7, California’s Dashboard combines test scores (status) with a comparison to prior year’s scores (change) and assigns a district a single color (performance) based on both values.

This invites trouble because status and change have no relation to each other. Instead of having two perfectly clear and simple measures, the Dashboard blurs them together, obscuring the meaning of both.
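To see the information loss concretely, consider a toy sketch. The cut points and the blending rule below are invented for illustration, not the Dashboard's actual thresholds; the point is only that any scheme which folds status and change into one color can give opposite trajectories the same label.

```python
# Hypothetical illustration of why blending status and change loses
# information. The bins, weights, and cut points here are invented for
# this sketch; they are NOT the Dashboard's actual thresholds.

def dashboard_color(status, change):
    """Collapse a (status, change) pair into one color bin.
    status = distance from standard (points); change = difference
    from the prior year (points)."""
    score = status + 20 * (change > 0) - 20 * (change < 0)  # blend the two
    if score >= 10:
        return "blue"
    elif score >= -10:
        return "green"
    elif score >= -40:
        return "yellow"
    else:
        return "orange"

# District A: above standard, but declining fast.
# District B: below standard, but improving steadily.
a = dashboard_color(status=15, change=-8)
b = dashboard_color(status=-15, change=+6)
print(a, b)  # both come out "green" despite opposite trajectories
```

Report the two numbers separately, and the difference between the districts is obvious; blend them, and it vanishes.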

Minding the gaps?

The intended mission of the Dashboard is to draw attention to where it is most needed — especially when it comes to patterns of unequal results, usually called gaps, among groups of students. Unfortunately, the way that the Dashboard has been designed doesn't deliver on this mission. Instead of helping to mind the gaps, it obscures them.

As designed, the Dashboard flags a group of students as needing attention when its performance is two “colors” below the “all students” category. Let's set aside the loss of evidence that results when scalar measures like numbers are reduced to color-coded range-and-frequency bins. The more astonishing error is including a student subgroup in the larger “all students” group you're comparing it to. Why not directly compare the two subgroups whose differences are of interest? (The rest of the world follows this convention.) No surprise, that's exactly what the CAASPP reporting site shows when you click on this link for Performance Trend Reports and then select “ethnicity.”
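The arithmetic behind this objection is easy to verify with invented numbers (the scores and group sizes below are hypothetical, not real CAASPP data): because the subgroup's own scores are part of the “all students” average, comparing a subgroup to “all students” always understates its gap relative to comparing it to everyone else, and the understatement grows with the subgroup's size.

```python
# Hypothetical numbers showing how comparing a subgroup to "all students"
# (which contains it) understates the gap relative to comparing it to
# everyone else. Scores and counts are invented for illustration.

def gaps(n_sub, mean_sub, n_rest, mean_rest):
    """Return (gap vs. all students, gap vs. the complement group)."""
    mean_all = (n_sub * mean_sub + n_rest * mean_rest) / (n_sub + n_rest)
    return mean_all - mean_sub, mean_rest - mean_sub

# A subgroup of 400 students scoring 30 points below the other 600.
diluted, true_gap = gaps(n_sub=400, mean_sub=2400, n_rest=600, mean_rest=2430)
print(diluted, true_gap)  # 18.0 vs 30.0: the diluted baseline hides 40% of the gap
```

The larger the subgroup, the more it drags the “all students” mean toward itself, and the smaller its measured gap appears — exactly backwards from what an equity metric should do.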

As a result, the state Department of Education and county offices of education often focus their scarce resources on the wrong schools. Districts with egregious gaps in graduation rates and test scores among numerically large subgroups (e.g., boys and girls, ethnic subgroups) are not getting dinged by the Dashboard. Conversely, many districts are getting flagged for gaps affecting students with disabilities and English learners, owing to the small size of those subgroups combined with their understandably lower scores.

The Highway to Help?

The California Department of Education might begin solving its Dashboard dilemmas by calling on Sean Reardon and his team at the Stanford Education Data Archive and the Stanford Center for Education Policy Analysis. How lucky that within 2.5 hours of Sacramento, they can find the social scientists who are leaders in the measurement of inequality in education.

Steve Rees, for more than 20 years, has been helping leaders make better sense of their districts’ and schools’ vital signs. He is founder of School Wise Press and leads their K12 Measures team, which is helping district and school planning teams make smarter use of their numbers.

Questions & Comments


Anna Meza June 8, 2020 at 10:45 pm
From the article regarding the Dashboard flaws, I agree with the statement that colors are crude barriers that block the ability to make sense of vital signs. Numbers are more reliable. As the saying goes, "Numbers tell a story."
jroubanis June 8, 2020 at 5:34 pm
In a heated time of American history, when we are struggling to rectify systematic injustices, it seems very appropriate that we bring attention to our broken gauge that possibly OBSCURES learning gaps between different populations. What strikes me is that the districts who would benefit most from accurate data readings are least likely to be able to afford developing their own assessment.
What are some ways that status and change can be aggregated on the Dashboard?
Steve Rees June 16, 2020 at 5:54 pm
Well, the only way is to follow the conventions of the sciences, business, and sports: report status separately from change. A baseball team can be leading its division by 7 games but be on a three-game losing streak. Do you think those two factors can be cobbled together? Neither do I.
©2003-2024 Jeff Camp