Do College Rankings Matter? Yes, Despite Flaws.
- A new book on college rankings comes on the heels of several rankings-related scandals.
- The author was president of Reed College, which has boycotted U.S. News rankings for years.
- Rankings methodologies are inconsistent and flawed, he says, and college leaders routinely “game the system.”
- Colleges have attempted to collectively boycott the rankings, but to no avail.
A series of recent scandals has once again thrust college rankings into the national spotlight.
These latest incidents reconfirm that rankings — particularly those published by U.S. News & World Report — have become the tail wagging the dog, causing colleges to adjust their admissions practices, manipulate statistics, and even resort to felonious activities to improve or protect their numerical standings.
In his timely new book, “Breaking Ranks: How the Rankings Industry Rules Higher Education and What to Do About It,” Colin Diver, former president of Reed College and dean of the University of Pennsylvania Law School, offers a unique perspective on this pervasive phenomenon.
Diver spoke with BestColleges about his experience with college rankings and his predictions for their future.
Recent Rankings Scandals Rocked Higher Ed
Four universities, three of them located within a hundred miles of one another, have recently come under fire, accused of, or admitting to, fudging numbers tied to college rankings.
In April, news broke that Rutgers University had falsified numbers related to its MBA program. The school is accused of using an employment agency to land graduates jobs at the university itself, placements U.S. News doesn’t count as “real” employment outcomes. Rutgers students have filed a class-action suit against the university, claiming they enrolled under false pretenses.
A month earlier, the University of Southern California withdrew its education school from the U.S. News rankings after it discovered a “history of inaccuracies,” including potentially inflated Graduate Record Examinations (GRE) scores.
Also in March, the former dean of the business school at Temple University was sentenced to 14 months in prison for submitting false data to U.S. News.
Moshe Porat, who was dean at Temple from 1996 to 2018, was found guilty of wire fraud after it was discovered he provided incorrect statistics about the percentage of Temple MBA students taking the Graduate Management Admission Test (GMAT). Inflated stats helped Temple rise to No. 1 among online MBA programs.
And in February, Michael Thaddeus, a Columbia University mathematics professor, dropped a bombshell — on the university’s website, no less — alleging the Ivy League school has repeatedly sent bogus stats to U.S. News. This pattern of “discrepancies,” Thaddeus notes, involves items such as full-time faculty percentages, instructional spending, graduation rates, and student-faculty ratios.
Thaddeus suggests these erroneous stats help explain Columbia’s “dizzying ascent” from No. 18 in 1988 to No. 2 in the most recent U.S. News rankings — a rise, he claims, that rests on a “web of illusions.”
Author and journalist Malcolm Gladwell, a noted critic of college rankings, naturally had a field day with these allegations.
Referencing the much-ballyhooed “Varsity Blues” admissions scandal, Gladwell mused in his online newsletter that “at least one of the schools that parents think are worth cheating to get into, is cheating in order to be on the list of schools worth cheating to get into. My heart hurts.”
‘Massaging Data’ and Rigging Reputational Surveys
These scandals come as no surprise to Diver, who’s kept a keen eye trained on the rankings industry for a quarter-century. Rankings, he said, are “symptomatic of some of the commercialization, competition, and corruption of higher education.”
While the examples at Temple, Rutgers, and (maybe) Columbia may be egregious, colleges routinely engage in more subtle forms of cheating, or “gaming the system,” Diver noted.
“It’s what I call ‘massaging data,’” he said. “It’s not necessarily outright lying. It’s interpreting the somewhat ambiguous instructions on how to count things in a way that’s favorable to your position.”
Diver admits he engaged in mild forms of rankings chicanery while dean at Penn Law. When completing the reputational survey, he assigned a close competitor, one universally considered among the nation’s top 10 law schools, to the bottom quintile.
“I’m sure a lot of deans and college presidents … ding their close peers,” he said.
Such gamesmanship highlights just one of the survey tool’s limitations. A more fundamental problem is that the college leaders being polled often aren’t informed enough to make value judgments about the schools they’re asked to rank.
“By and large, the college presidents and deans who are asked to rank hundreds of peer schools know very little about the vast majority of them, except for their ranking and for their reputation, which is largely built on the ranking,” Diver said. “So it becomes kind of a self-perpetuating echo chamber.”
A shared skepticism among college leaders could explain why the rate of survey completion has dwindled over time. The submission rate used to hover around 60%, Diver said, and is now around 33%.
“That probably suggests something about a loss of interest or faith in that particular method of evaluation,” he said.
Additional Limitations of Rankings
Flawed surveys aren’t the only limitation of college rankings, says Diver. In a Chronicle of Higher Education essay adapted from his new book, Diver criticizes what he dubs the “rankocracy” — publications responsible for proffering annual rankings — for using faulty and inconsistent methodologies.
Publications can’t agree on the criteria used to evaluate colleges, Diver notes, and the relative numerical weights assigned to these variables seem arbitrary. The former U.S. News managing editor even admitted that the “weight used to combine the various measures into an overall ranking lacks any defensible empirical or theoretical basis.”
What’s more, these variables often overlap and reinforce one another. Diver offers the example of SAT scores and graduation rates, which almost perfectly correlate. This “multicollinearity” skews the results because some stats, such as the SAT, have an “outsized influence” on other variables.
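The arithmetic behind that critique is easy to illustrate. Below is a minimal sketch in Python using made-up schools, weights, and noise levels (nothing here reflects U.S. News’ actual inputs or formula): when two weighted measures both track the same underlying factor, the composite effectively hands that factor the sum of their weights.

```python
import random

random.seed(42)

# Hypothetical composite with nominal weights: SAT 30%, graduation
# rate 30%, per-student spending 40%. Illustrative only; these are
# not U.S. News' actual inputs or weights.
schools = []
for name in ["A", "B", "C", "D", "E"]:
    wealth = random.uniform(0.0, 1.0)        # latent student-wealth factor
    sat = wealth + random.gauss(0.0, 0.02)   # tracks wealth almost exactly
    grad = wealth + random.gauss(0.0, 0.02)  # ditto: near-perfect correlation
    spend = random.uniform(0.0, 1.0)         # independent measure
    score = 0.3 * sat + 0.3 * grad + 0.4 * spend
    schools.append((name, score, wealth, spend))

# Because sat and grad move together, the composite behaves like
# 0.6 * wealth + 0.4 * spend: the shared factor gets double its
# nominal 30% weight -- the "outsized influence" Diver describes.
for name, score, wealth, spend in sorted(schools, key=lambda s: -s[1]):
    print(f"{name}: score={score:.2f} wealth={wealth:.2f} spend={spend:.2f}")
```

Under these assumptions, the rank order is driven far more by the shared latent factor than the nominal weights suggest, which is precisely the distortion multicollinearity introduces.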
“Many of the factors so carefully measured and prominently featured by the magazine,” Diver writes, “are just window dressing.”
Then there’s the continual tinkering with the formulas, which ostensibly reflects growing methodological sophistication and a commitment to incorporating expert feedback. But Diver believes it’s instead an attempt to rearrange the rank order and “generate enough drama to keep readers coming back year after year.”
He also contends it’s a way to discourage cheating.
“U.S. News has been engaged in an ongoing Whac-a-Mole exercise with institutions bent on gaming their system,” he writes. “Find a loophole, close it. Find another loophole, close that one. Ad infinitum.”
His chief complaint, though, is that rankings, particularly U.S. News’ formula, essentially measure the wealth of institutions and the students they enroll. It’s no surprise the top 10 national universities are also 10 of the richest schools in America.
“The problem is that virtually all the things they use are in effect highly correlated with wealth and privilege and the privilege of being white, rich, and having highly educated parents,” Diver said.
Even a factor such as graduate indebtedness, which accounts for 5% of U.S. News’ formula, skews in favor of colleges enrolling wealthy students.
“[U.S. News] actually makes it easier for the richest schools to score well on debt because they admit mostly extremely rich kids who don’t need debt,” Diver said.
Failed Attempts at Boycotting U.S. News Rankings
Despite these limitations, college administrators pursue ever-higher rankings like chefs chasing coveted Michelin stars — even if it means compromising their institutional integrity.
Over time, colleges have increased marketing efforts to attract applicants and become more selective. They’ve ramped up early admissions numbers to generate better yield rates. And they’ve given more weight to SAT and ACT scores so their averages rise.
While dean at Penn Law, Diver acceded to the ranking gods by assigning more weight to the Law School Admission Test (LSAT) once U.S. News began ranking law schools in 1989. Peer schools similarly adjusted their admissions strategies.
“I think, frankly, that was educationally dubious,” Diver said, “but it wasn’t dishonest.”
When Diver departed Penn Law for the presidency of Reed College, he left behind the “tyranny of college rankings,” he wrote in his Chronicle essay. He no longer needed to “worry about some news magazine telling me what to do.”
That’s because Reed College had opted seven years earlier to stop submitting data to U.S. News, effectively boycotting the rankings. In response, U.S. News punitively assigned the college lower scores in some categories, and Reed dropped in the rankings.
Still, Reed stuck to its guns. Today, Reed is ranked No. 62 among national liberal arts colleges. In 2019, one independent analysis concluded that based on U.S. News’ criteria, Reed should’ve been ranked No. 38 that year, but instead it landed at No. 90.
Reed surely wasn’t alone in its disdain for college rankings.
In 2007, the Annapolis Group, a collection of national liberal arts colleges, including Reed, issued a statement in which members committed to not mentioning U.S. News rankings in their marketing materials as a way to calm the “frenzy” around college admissions. They did not, however, commit to withholding data or refusing to complete the reputational survey.
That same year, another sizable group of institutions, coalescing under the banner of the Education Conservancy, issued a similar statement, and its signatories did agree not to complete the reputational survey.
“Reputation can be another word for gossip,” said Trinity Washington University President Patricia McGuire at the time. “We are saying that we will not engage in slandering each other’s institutions or inflating each other.”
Then-U.S. News editor Brian Kelly responded, “If liberal arts college presidents don’t participate, we’ll find other people to survey.”
Predicting the Future of College Rankings
That sentiment sums up why Diver believes rankings are here to stay despite the controversies, higher education’s misgivings, and calls for their demise.
“U.S. News is going to go ahead and rank you no matter what you do,” he said, adding that the 2007 boycott efforts had no “significant lasting impact.”
As long as there’s an admissions frenzy to feed and a voracious audience of consumers eagerly awaiting each year’s freshly reshuffled list, college rankings will persist.
“Rankings … are the handmaiden to the hyper-competitive nature of higher education in the United States and to the hyper-status orientation of elite higher education in the United States,” Diver said, “and as long as those underlying conditions exist, I suspect there will be rankings, and U.S. News will be sitting at the top of the rankings pile.”