New York State Of Mind

Posted by Matt Di Carlo on August 13, 2013

Last week, the results of New York’s new Common Core-aligned assessments were national news. For months, officials throughout the state, including in New York City, have been preparing the public for the release of these data.

Their basic message was that the standards, and thus the tests based upon them, are more difficult, and that they represent an attempt to truly gauge whether students are prepared for college and the labor market. The inevitable consequence of raising standards, officials have been explaining, is that fewer students will be “proficient” than in previous years (which was, of course, the case). This does not mean that students are performing worse, only that they are being held to higher expectations, and that the skills and knowledge being assessed require a new, more expansive curriculum. Interpretation of the new results versus those from previous years must therefore be extremely cautious, and educators, parents and the public should not jump to conclusions about what they mean.

For the most part, the main points of this public information campaign are correct. It would, however, be wonderful if similar caution were evident in the roll-out of testing results in past (and, more importantly, future) years.

The fact is that New York officials, city officials in particular, routinely interpret year-to-year changes in proficiency rates in an inappropriate manner. This includes failing to account for the fact that proficiency rates are a distorted means of expressing test scores; that rates and scores often move in different directions; that changes in the sample of students who take the test, along with other forms of imprecision, are often the primary cause of changes in rates and scores; and that increases in measured performance cannot simply be attributed to specific policies.
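
As a purely hypothetical sketch (made-up numbers, not actual New York results), the short Python example below illustrates the second of these points: a proficiency rate, the share of students at or above a fixed cut score, can fall even while the average score rises, because the rate ignores movement that happens away from the cutoff.

```python
# Hypothetical illustration only: proficiency rate vs. average score.
# The cut score and the score lists below are invented for this example.

CUT_SCORE = 300

year_1 = [280, 295, 301, 302, 340]   # made-up scale scores
year_2 = [290, 299, 299, 320, 345]   # most of the same students improved

def summarize(scores):
    """Return the mean score and the share of scores at or above the cut."""
    mean = sum(scores) / len(scores)
    rate = sum(s >= CUT_SCORE for s in scores) / len(scores)
    return mean, rate

for label, scores in [("Year 1", year_1), ("Year 2", year_2)]:
    mean, rate = summarize(scores)
    print(f"{label}: mean = {mean:.1f}, proficiency rate = {rate:.0%}")

# Output:
# Year 1: mean = 303.6, proficiency rate = 60%
# Year 2: mean = 310.6, proficiency rate = 40%
```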

Granted, these types of errors are not at all limited to New York; they occur in virtually every state. Moreover, the new standards are a particularly drastic change, and the test results are in need of unusually careful interpretation vis-à-vis those from previous years. It’s one thing to fail to acknowledge that shifts in the composition of the test-taking sample can influence results. It is a rather more serious misinterpretation to proclaim that there’s been a massive decline in performance due to this year’s lower proficiency rates, which are mostly the result of new tests and standards.

That said, the presentation and interpretation of testing data require serious caution every year, not just when officials are afraid of a public backlash against the results. Hopefully, this whole affair will inspire a reevaluation of how test results are presented to the public and the news media.

- Matt Di Carlo


