The Norman Transcript

Education

October 6, 2012

A-F grading system: questions and answers

(Continued)

NORMAN —

The SDE’s formula does not accurately measure academic growth in students.

What is the effect of this formula for a high-performing district?

Perhaps the best way to demonstrate the effect is to give an example based on the preliminary grades the SDE has computed for NPS schools: One NPS school deemed High Performing by federal criteria, where 90 percent or more of its 300 tested students scored Proficient or Advanced on state exams, will likely receive a B because of the formula’s heavy weighting of the scores of 24 students who present unique learning challenges, and because of its flawed method for measuring those 24 students’ academic improvement.

Shouldn’t schools be accountable for students with challenges?

Absolutely, and they have been, especially since the No Child Left Behind law. However, because the SDE’s new formula doesn’t accurately reflect the academic gains students are making, the High Performing NPS school in our example will likely receive a B even though its lower-performing students are improving academically.

How do the grades not accurately reflect students’ academic growth?

The SDE’s formula for calculating “state average growth” sets the benchmark a school must meet to earn credit for the academic gains of its lower-performing students. However, this average isn’t a true average, because the SDE chose to compute it using only the test scores of students across the state who posted gains from one year’s subject test to the next, e.g. students who did better on the 4th grade reading test than they did on the 3rd grade reading test.

Why is this a problem? Most students perform at grade level and score about the same on state subject tests from one grade to the next, because the tests are designed to measure grade-level mastery of the content. Yet in computing average academic growth for its formula, the SDE excluded the scores of students who performed the same from one year’s subject tests to the next, as well as the scores of students who did not perform as well the following year. This inflated the “average” and resulted in most schools across the state receiving lower grades even when their students were improving academically.
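The inflation described above is a simple selection effect. As an illustrative sketch, using made-up score changes rather than actual SDE data, averaging only the students who gained produces a much higher benchmark than averaging every student:

```python
# Illustrative sketch with hypothetical score changes (not actual SDE data).
# Each value is one student's change in score from one year's test to the next;
# zeros are students who scored the same, negatives scored lower.
score_changes = [6, 0, 0, -3, 8, 0, -2, 4, 0, -1]

# A true average considers every student's change.
true_average = sum(score_changes) / len(score_changes)

# The approach described in the article keeps only students who gained,
# dropping those who scored the same or lower.
gains_only = [c for c in score_changes if c > 0]
inflated_average = sum(gains_only) / len(gains_only)

print(true_average)      # 1.2  (modest growth across all students)
print(inflated_average)  # 6.0  (benchmark inflated by excluding non-gainers)
```

Against the inflated benchmark, a school whose students all grew modestly would still appear to fall short.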
