When U.S. News & World Report released its 2012 “Best High Schools” ranking, a predictable controversy ensued. Schools that didn’t make the list felt the methodology was unfair. Others, like education blogger Emily Richmond, argued that the ranking was based on overly narrow criteria.
Then particular schools’ high rankings were called into question, sometimes by the schools themselves, when it became clear that those rankings were based on faulty data. After a handful of such errors were uncovered, the National Center for Education Statistics (NCES), which houses the Common Core of Data (CCD) database, investigated further. As of June 2012, U.S. News found that a total of 17 nationally ranked gold, silver, or bronze medal high schools (a small fraction of one percent of those ranked) had incorrect data in the CCD. These schools were promptly “unranked.”
Errors most commonly involved schools’ 12th-grade enrollment numbers and their percentages of economically disadvantaged students.
On the positive side, the ranking methodology had the admirable goal of recognizing high schools that “serve all of their students well, not just those who are college-bound, and produce measurable academic outcomes to show they are successfully educating students across a range of performance indicators.” Schools were first analyzed in terms of how well students performed on state assessments, taking into account the test scores of disadvantaged students. High schools meeting these criteria were then eligible to be ranked nationally in terms of college readiness, defined as student success in Advanced Placement (AP) or International Baccalaureate (IB) programs.
Many, however, would agree that no ranking system is perfect. In a blog post for The Atlantic, Richmond asked how well students at ranked schools fared after graduation: “What percentage of the students at the top-ranking high schools go on to post-secondary success, be it college or the workforce? How many of them require remedial classes as college freshmen? How many years did it take them to earn their degree? These are the sorts of questions educators and policymakers need to ask as part of the broader debate over the future of K-12 education at the local, state and national level.”
Italo, a reader reacting to a U.S. News story about the rankings, shared a similar opinion. “The lists fail to account for parental influence, teacher quality and student improvement. These lists…stigmatize students from schools not on these lists when it comes to getting jobs post graduation.”
The U.S. News ranking is not going away anytime soon. A 2013 ranking is planned, and NCES said it and its state partners are implementing additional quality controls to improve the accuracy of the school information in its database.
For the 2012 ranking, U.S. News analyzed 21,776 public high schools in 49 states and the District of Columbia. This was the total number of public high schools that had 12th-grade enrollment and sufficient data, primarily from the 2009-2010 school year, available for analysis.
Be sure to share your opinions about the ranking on the EducationWorld community.