A number of North Carolina schools figure in this year’s edition of U.S. News & World Report’s “America’s Best Colleges” rankings: Duke was tied for fifth, Wake Forest 27th, UNC-Chapel Hill 29th, and N.C. State 86th.
UNC’s ranking was unchanged from last year. Unable to celebrate any upward movement, the school released a statement cheering its “improvement” in one of the factors in the U.S. News calculation: faculty resources. That’s like a basketball coach whose team has had a so-so season issuing a triumphant statement extolling the squad’s improvement in free-throw shooting percentage.
University administrators usually cannot resist commenting on the U.S. News rankings, which are widely read as signifying something important about the quality of education a school offers. If you look closely at the way the magazine calculates its rankings, however, you might well conclude that the whole enterprise is a waste of time.
In preparing its rankings, U.S. News relies on six factors. Four of them are input measurements: financial resources, alumni giving, faculty resources and student quality. One is an output measurement (student retention and graduation rates), and one amounts to a subjective guess (academic reputation). But not one of these factors purports to measure the thing that academic quality is centrally about — learning.
Let’s look at each in turn.
Here’s how the magazine calculates academic reputation, which accounts for a full 25 percent of the entire ranking. U.S. News sends a survey form to the three top academic officials at every college and university in the country. Those individuals are asked to rank, on a five-point scale, the academic reputations of other schools in their category. That is, the heads of liberal-arts colleges rank other liberal-arts colleges, the heads of research universities rank other research universities, and so on. But how much concrete knowledge (if any) are these college officials likely to have about the academic climate at other schools? The numbers that emerge from these reputational assessments may amount to little more than guesswork.
What about student quality (15 percent of the total score)? The way the magazine performs its calculation, schools get more points the more selective they are. And schools that enroll brighter students also score better. But just because a college enrolls a lot of bright students doesn’t necessarily mean they will be well-taught. Students at some of the most elite schools admit that some of their professors have low academic standards and that they can get Bs with little effort. There’s no automatic connection between a typical student’s academic capability and how much he actually learns.
Faculty resources are another criterion. Twenty percent of the overall score is based on a number of faculty-related matters, including compensation, percentage of faculty members with a terminal degree, the percentage of full-time faculty, and the student-to-faculty ratio. But once again, there isn’t necessarily any connection between those factors and how well courses are taught — or how much students learn. Consider compensation, for example. A school with a lot of highly paid “superstar” professors looks great to U.S. News, but we shouldn’t assume that just because they’re paid a lot, they’re correspondingly effective in their teaching. In fact, students often complain that the superstars tend to neglect their teaching obligations in favor of writing and consulting.
Next, U.S. News awards points for high student retention and graduation rates. But the fact that a lot of students stay in school and graduate doesn’t necessarily mean they’re learning anything while they’re there. One college, for example, might have generally tough academic standards and therefore have a higher percentage of students who decide to leave, either to enter the job market or to switch to an easier institution. Another school might keep most of its students enrolled through a combination of grade inflation, an undemanding curriculum and professors who do more entertaining than teaching.
Financial resources and alumni giving are likewise poor gauges of educational quality. A school with a big endowment can afford to spend lavishly, but does that guarantee that its students learn more than their counterparts at an institution that has to pinch pennies? And what’s the rationale for linking alumni giving and academic climate?
Taken together, these criteria add up to a lot of smoke and mirrors. And that’s why I maintain that college rankings don’t mean anything.
[George Leef is the director of the Pope Center for Higher Education Learning Policy.]