Editorial: College rankings are misleading. So why do we still use them?
Over the past few weeks, many high school seniors have been opening emails telling them whether they got into the colleges of their choice. Even as they do, criticism of the published college rankings that may have guided their preferences is cropping up once again.
A math professor at Columbia University is challenging the data that the Ivy League school reported to U.S. News & World Report, which earned it the No. 2 ranking this year. The University of Southern California, which seems almost incapable of staying out of trouble for more than a few months at a time, pulled its Rossier School of Education out of the rankings this year after discovering a “history of inaccuracies” going back at least five years in the data the school had reported.
A couple of weeks ago, in what must be the granddaddy of fake-data scandals, the ousted dean of Temple University’s business school received a 14-month sentence after he was convicted in federal court of sending bogus information to U.S. News & World Report to boost the school’s prestige. Claremont McKenna College, The George Washington University and many other schools have tweaked data to inflate their rankings.
But the ultimate issue with the rankings doesn’t lie with the cheaters. The problem is the rankings themselves. They can be a counterproductive way for families to pick schools — for example, a much less expensive school might offer an equal or better education than a more highly ranked but costlier one.
The most selective schools — Princeton, MIT and so forth — don’t need rankings to boost their reputation or applicant pool. And the differences between a school that might be 70th on the list and one that might be 90th are unlikely to have much of an effect on a student’s post-graduate prospects or college experience.
Few college applicants are probably aware that the single biggest factor U.S. News uses to rank schools is their reputation among officials at other colleges, who might or might not have deep knowledge of the schools they are judging. That factor alone accounts for 20% of the score.
The second biggest factor is six-year graduation rates. But since low-income students are far less likely to graduate within that time period — or ever — than middle-class students, this is more an indication of student affluence than academic excellence. In fact, it can have the perverse effect of discouraging colleges from accepting more low-income students, lest it worsen their graduation rates.
An extensive Gallup Poll found in 2017 that alumni who attended prestigious schools are only slightly happier with their choice of college than those who attended schools lower on the list. The biggest factor in graduates’ satisfaction with their college was whether they had ended up in debt, though U.S. News gives student debt only a 5% weight in the rankings.
U.S. News has made some positive changes in recent years. It dropped student acceptance rate as one of the criteria; because lower acceptance rates equaled higher rankings, that measure had encouraged colleges to market heavily to students who had almost no chance of admission, simply to collect more applications to reject. The rankings also started including the percentage of Pell Grant recipients who graduate within six years, a meaningful statistic indicating whether colleges are helping low-income students complete their education.
But many other factors used to rank the schools have little bearing on a student’s experience. The rankings use alumni donations as a proxy for graduates’ happiness with their alma mater. That’s a pretty meager way to measure satisfaction.
What most high school students and parents need to know is whether a college offers a rich choice of courses with good instructors; whether graduates will leave with a load of debt; whether students will feel comfortable and engaged on campus; and whether they’ll be prepared for a fulfilling career.
College administrators bemoan the rankings, but they continue to participate. They should stop going along with the charade and insist on being partners in drawing up more valid ways to evaluate higher education. What should matter most is how satisfied students and alumni are with their choice.

Using data from the 2017 poll, Jonathan Rothwell, an economist at Gallup, devised an alumni-satisfaction ranking, though he published only the 25 schools with the highest satisfaction marks. Many of them were among the top-rated in any published ranking, but there were some surprises, including the University of La Verne and Azusa Pacific in Southern California. Rothwell’s study also found that the price of a college didn’t necessarily correlate with how happy alumni were with it.
If colleges and ranking organizations joined forces, though, they could create a uniform polling process for students and alumni that, combined with other factors, would be far more useful and a better reflection of colleges’ worth.
A new approach could also address specific questions students might find useful: Which schools are more arts-oriented? Which ones specialize in experiential learning? Which ones have lots of extracurriculars, or a friendly, accepting campus environment?
Despite years of criticism, U.S. News and other college rankings publications aren’t going to give up on one of their popular and profitable annual features. It’s up to colleges to stand up and refuse to go along with rankings that fall short, and collaborate on a method that gives students worthwhile information to navigate the bewildering task of picking a college.