Are we mismeasuring our schools?
February 13, 2023
Student test results are often misunderstood — by those inside and outside the world of education. That’s according to Steve Rees and Jill Wynns, authors of the new book, “Mismeasuring Schools’ Vital Signs.”
Veterans of the school accountability wars, Rees (founder of School Wise Press) and Wynns (a former San Francisco USD school board leader) draw on their experiences to show how to avoid costly mistakes with student data.
As school leaders continue to analyze their CAASPP scores, we asked Rees to share his thoughts on how school leaders should be interpreting the data.
California school leaders just got their most comprehensive look at student academic performance since the start of the pandemic. How should they be looking at these test scores? First, I suggest that when leaders measure progress, they look at the same kids over time: graduating-class cohorts. With evidence organized this way, you'll be able to see learning rates rather than score levels.
Second, take the longest view possible. Look back to the start of CAASPP in 2015 for those cohorts where it’s feasible. The more evidence you have in hand, the stronger your confidence can be in the quality of that evidence.
Third, use the scale score metric rather than the cruder "percent meeting standard." Scale scores give you usable precision, along with a known margin of imprecision.
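The cohort approach above can be sketched in a few lines. This is a minimal illustration with invented numbers, not real district data or an official CAASPP tool: it tracks one graduating class's mean math scale score across years and reports growth, which is the "learning rate" view rather than the "score level" view.

```python
# Hypothetical illustration: follow one graduating-class cohort's mean
# CAASPP math scale scores across years and compute year-over-year growth.
# All scores below are invented for illustration only.

cohort_scores = {
    # year -> mean math scale score for the class of 2028 (same students)
    2021: 2450,
    2022: 2481,
    2023: 2509,
}

years = sorted(cohort_scores)

# Growth between consecutive years: (from_year, to_year, points gained)
growth = [
    (y0, y1, cohort_scores[y1] - cohort_scores[y0])
    for y0, y1 in zip(years, years[1:])
]

for y0, y1, delta in growth:
    print(f"{y0}->{y1}: {delta:+d} scale-score points")
```

Organized this way, a flat or shrinking delta stands out immediately, even when the score level itself looks respectable.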
What kinds of questions do you think school leaders should be asking while looking at this data? Let's start with the questions CAASPP was not designed to answer. For example, CAASPP data is not well designed to answer instructional questions about individual students and the standards they struggle with. It was not designed to provide instructional guidance to teachers. Nor was it designed to measure student achievement more than 1.5 grade levels above or below the grade level tested.
CAASPP data, however, is designed to answer questions like these:
  • Is our middle school math program enabling our students to continue mastering grade-level skills at the same pace they attained in elementary school?
  • Do our RFEP (reclassified fluent English proficient) students’ scores in math and ELA match those of their English-only peers?
  • Are students from some of our elementary schools, as a whole, doing better than others in math when they get to middle school? If so, what makes their instructional approach more effective?
What are some common mistakes school leaders make when measuring performance and interpreting their data? The most common mistake is to rely on the Dashboard for interpretation of CAASPP results. Although the CAASPP itself is a solid assessment, the Dashboard’s interpretation of CAASPP is riddled with errors. Among the ones that I found most damaging to data quality:
  • Melding status and change (two distinct, meaningful factors) into one measure that is useless;
  • Using colors rather than numbers, which clusters results into range-and-frequency barrels without expressing their scalar differences;
  • Mismeasuring gaps by comparing subgroups to the “all student” group to which they belong;
  • Perpetuating a logic error when evaluating emerging bilingual students (ELs).
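The third error in the list above, comparing a subgroup to an "all students" average that includes the subgroup itself, can be made concrete with a toy calculation. The numbers here are invented for illustration: because the subgroup's own scores pull down the all-students average, the measured gap shrinks relative to a comparison against non-members.

```python
# Hypothetical illustration of gap mismeasurement: comparing a subgroup
# to the "all students" mean (which contains that subgroup) dilutes the
# gap versus comparing to students outside the subgroup.
# Scores are invented scale-score means, not real data.

subgroup = [2400] * 30   # 30 students in the subgroup
others = [2500] * 70     # 70 students not in the subgroup

def mean(xs):
    return sum(xs) / len(xs)

all_students = subgroup + others

gap_vs_all = mean(all_students) - mean(subgroup)    # diluted gap
gap_vs_others = mean(others) - mean(subgroup)       # undiluted gap

print(gap_vs_all)     # 70.0
print(gap_vs_others)  # 100.0
```

The distortion grows with the subgroup's share of enrollment: the larger the subgroup, the more the "all students" comparison understates the gap.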
The absence of a growth measure in the Dashboard leaves districts to figure it out on their own. But there’s some help from the Smarter Balanced Reporting site, which makes growth visible for more or less the same kids over three years.
What made you decide to write this book? We were eager to share stories. Jill, as a 24-year veteran of the San Francisco USD school board and a past president of CSBA, had a wealth of stories of mismeasurement to share. I have been leading a small firm, School Wise Press and its K12 Measures team, for 23 years, and over that time served more than 240 districts. The stories I’ve gathered, both from experience and from extensive reading of research (I’m an active member of the American Educational Research Association), left me equally eager to share them.
But we’re also both optimists. We believe that site and district leaders can be persuaded to help the profession of education management join the social sciences, where empirical methods prevail and where knowledge is built from raw data, guided by the disciplines of the sciences. Other professions have left weak traditions behind, among them baseball, psychology and medicine. We believe education management can take the same step into the modern era.
Do you have an example of a school that took a more careful look at its data and was able to improve student outcomes? Sure. Morgan Hill USD in Santa Clara County had been firmly committed to its reading curriculum, the Fountas & Pinnell program, and some of its instructional leaders had been trained in Lucy Calkins’ “Units of Study.” But after confronting hard evidence of the rate at which their early elementary students were not learning to read, and after seeing evidence of better results in districts with students much like their own, they started talking. That dialogue included considering teaching methods that emphasized decoding fundamentals. This was a civilized debate, not the dogmatic argument that often ensues. The district also elevated its commitment to a higher quality dyslexia screener, “Multitudes,” being developed at UC San Francisco by a multi-university team.
What are three things leaders should start doing to more accurately measure their school’s vital signs? First, use comparative methods: identify just five districts whose students most closely resemble your own, then ask how your district or school is doing compared to those peers.
Second, stop relying on the Dashboard. Use the freedom you have in California to exercise local control when you measure your vital signs, but do it smart and do it right.
Third, make fewer mistakes. This requires recognizing mistakes when you see them. Our book, “Mismeasuring Schools’ Vital Signs,” will help you spot them. This is decidedly not a “best practices” approach. This is a “learn-from-your-mistakes” approach.
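The first recommendation above, finding a handful of peer districts that resemble your own, can be sketched as a simple nearest-neighbor selection. Everything here is hypothetical: the district names, the demographic features chosen, and the numbers are invented, and a real analysis would normalize features so no single one dominates the distance.

```python
# Hypothetical sketch of comparative methods: rank districts by how
# closely their demographic profile matches ours (Euclidean distance),
# then keep the five nearest. Names and figures are invented;
# feature normalization is omitted for brevity.

districts = {
    # name: (pct_low_income, pct_english_learners, enrollment_thousands)
    "Ours": (0.55, 0.20, 8.0),
    "Alpha": (0.52, 0.22, 7.5),
    "Beta": (0.30, 0.05, 20.0),
    "Gamma": (0.58, 0.18, 9.0),
    "Delta": (0.10, 0.02, 3.0),
    "Epsilon": (0.50, 0.25, 8.5),
    "Zeta": (0.80, 0.40, 30.0),
}

def distance(a, b):
    # Plain Euclidean distance between two feature tuples
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

ours = districts["Ours"]
peers = sorted(
    (name for name in districts if name != "Ours"),
    key=lambda name: distance(districts[name], ours),
)[:5]
print(peers)
```

With a peer group in hand, the comparison questions from earlier (growth rates, subgroup gaps) become "compared to districts like ours" rather than "compared to the state."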
© 2023 Association of California School Administrators