Is Accuracy Academic?

Lindalyn Kakadelis

ABCs accountability results for the 2005-06 school year are just in … sort of. While official data on public school performance in North Carolina was released late yesterday, it included only reading scores. We’ll have to bide our time until October to find out how our students are doing in math.

In the interim, it’s helpful to brush up on recent changes to the state’s accountability program. These are definitely not the same old ABCs. The tests have undergone yet more revisions – surprising no one. In fact, since its inception in 1996, the ABCs program has been consistent in only one area: its inconsistency.

Unfortunately, constant change poses a bit of a problem for those of us bogged down by an interest in accuracy. Tracking improvement over time with a system that shifts like the sands is a virtual impossibility, a fact acknowledged by the NC Department of Public Instruction: in a recent presentation by DPI’s Accountability Division, staff members admitted that old statistical links were “tenuous.” Any chart showing vast achievement gains over the years is necessarily suspect, leaving the general public in the dark about whether any progress has been made.

And when it comes to navigating through the current raft of results, there’s not much of a road map out there for parents. While it’s true that DPI’s website provides users with copious amounts of information, this material is dense and difficult to decipher. As a result, the media – spoon-fed the official “spin” from DPI – generally gets the last word.

To help you sift through the current round of testing results, I have highlighted some important changes to the ABCs program (next week’s journal will dissect the scores themselves).

This year, modifications to growth formulas are taking place at every grade level. Particularly noteworthy is the fact that the 2005 formulas (PDF) change how growth gains are determined. Educators now evaluate each child’s achievement progress rather than overall achievement gains in a school. The formulas examine two years of prior student performance in a specific subject to predict current performance. In order to earn the designation “high growth,” schools must demonstrate that at least 60 percent of students are making expected growth. This means that one child’s top-notch score can no longer compensate for another child’s low score.
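
To see how this per-student approach works in practice, here is a minimal sketch in Python. It is purely illustrative and not DPI’s actual formula: the prediction rule (averaging the two prior years’ scores) and the sample numbers are assumptions of mine; only the 60 percent threshold and the use of two years of prior data come from the description above.

```python
# Simplified illustration only -- not DPI's actual growth formula.
# Assumed rule: a student's "expected" current score is the average of the
# two prior years' scores in the same subject; a student meets growth if the
# current score is at least that prediction, and a school earns "high growth"
# if at least 60 percent of its students meet growth.

def predicted_score(prior_1, prior_2):
    """Hypothetical prediction: the average of two prior years' scores."""
    return (prior_1 + prior_2) / 2

def school_growth_label(students):
    """students: list of (prior_year_1, prior_year_2, current_year) scores."""
    met = sum(1 for p1, p2, current in students
              if current >= predicted_score(p1, p2))
    return "high growth" if met / len(students) >= 0.60 else "not high growth"

# Hypothetical school of four students: three meet their predicted scores
# (75 percent), so the school clears the 60 percent bar.
example = [(350, 355, 360), (340, 342, 341), (360, 358, 365), (330, 335, 320)]
print(school_growth_label(example))  # -> high growth
```

Notice that the label depends on the share of students meeting their own predicted scores, not on a school-wide average, which is why one child’s top-notch score can no longer mask another child’s low one.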

Also, as noted earlier, this year’s data is unusual in that it excludes math scores (PDF), which are not expected until later this fall. DPI is waiting to see how students performed on the new math test before setting achievement levels, a process that invites bias. Maybe state education officials have 2001’s math fiasco seared into their minds: that year, they discovered (thanks to poor judgment and a flawed system of field-testing) that the new math tests were absurdly easy to pass. At least that particular problem will not recur.

When evaluating scores, here’s a final caveat: North Carolina develops all of its performance tests “in house.” While most states contract with professional testing companies (which already have achievement tests in development) and add a few state-specific questions to satisfy No Child Left Behind requirements, DPI acts as its own testing company. Education officials employed by the state design the questions, field-test them, determine achievement levels, and shun peer comparisons with children outside North Carolina. This provides parents and teachers with precious little diagnostic feedback on students’ abilities or specific skills.

Where do we go from here? At a minimum, we ought to replace state tests with independent, field-tested, national examinations of student performance, such as the Iowa Test of Basic Skills (ITBS) or another credible, nationally known achievement test. The ultimate authority to change our testing tool rests with our elected representatives in the North Carolina General Assembly – good news, since it means that parents (and voters) have a lot of power.

But the first step on the road to change is getting informed. Next week, I’ll provide my analysis of ABCs test scores and outline what they mean for education in our state. Stay tuned.

Lindalyn Kakadelis is the Director of the North Carolina Education Alliance, a network that provides information on K-12 education in North Carolina to the public.