Stanford Today, May/June 1997 · Features
Stanford Takes on U.S. News Rankings

Can a College Education Really Be Reduced to Numbers?
Stanford challenges the newsweekly for hitting a little below the belt.

By Elaine Ray

From Stanford's palm-studded campus to the cherry-blossomed thoroughfares of Washington, D.C., a clash of values is testing the wills of scholars and publishers. On one side are President Gerhard Casper and a national corps of student activists. On the other is the editorial staff of U.S. News and World Report, which publishes its "America's Best Colleges" issue from the capital every fall.

U.S. News editors insist that their college guide provides a service to parents and prospective students who want to invest their higher education dollars wisely. The fact that the guide is the weekly newsmagazine's hottest-selling issue demonstrates readers' confidence in their product, editors assert. But critics argue that U.S. News' college rankings should be taken about as seriously as a beauty contest or Sports Illustrated's swimsuit issue. They say U.S. News does consumers a disservice by assigning numerical value to things that cannot be quantified.

"It's a fundamentally ridiculous concept to say that you can take a series of numbers, run them through an algorithm and that algorithm will tell you what makes the best college and what makes the second-best college. That's an absurd notion," says Nick Thompson, vice president of the Associated Students of Stanford University (ASSU) and the national coordinator of the Forget U.S. News Coalition (FUNC), a national alliance of students. The ASSU was one of several student government organizations across the country that passed resolutions condemning U.S. News' formulas and asking their college administrations to withhold data requested by the magazine. FUNC members met with U.S. News editors in December.

It's not that Stanford fares poorly in the U.S. News rankings game. In the past four years the university has never fallen below sixth place, its current standing among the nation's colleges and universities. In the magazine's 1997 guide to graduate schools, a separate issue published in March, U.S. News ranked Stanford's business school number one; its engineering and education schools were each ranked second best; and the law school was ranked number three. The medical school ranked tenth.

But according to President Gerhard Casper, Stanford's status gives him the credibility to speak out on the rankings. "I hope I have the standing to persuade you that much about these rankings, particularly their specious formulas and spurious precision, is utterly misleading," Casper wrote in an unpublished letter to the magazine's editor, James Fallows, last September.

Casper says his letter began circulating quietly through the nation's ivory towers, where he believes many college presidents and administrators agree with him. "I had expressed views [many presidents and deans] had held for a long time, but they had just never bothered to express," Casper told Stanford Today. "There are college presidents who utterly dislike what U.S. News does but are worried about picking a public fight," says Casper, who met with Fallows in Washington in early December.

In mid-April, Stanford decided to throw another punch. This year the university will continue to submit objective data to U.S. News, but will withhold subjective reputational votes.

Stanford recently established a site on the World Wide Web that will offer data directly to students and families. "These data, many of them identical to those requested by U.S. News, are available immediately and free of charge, without students' having to wait to buy a copy of U.S. News," Casper wrote in his statement. The president also invited other interested colleges and universities to join him in setting up a new system.

TO OBTAIN DATA FOR ITS 1996 guide, U.S. News sent out a "reputational" survey, asking 4,200 college presidents, deans and admissions directors to rank all institutions in their category. Stanford administrators, for instance, were asked to rank all national universities, assigning them to one of four tiers. This reputational survey accounted for 25 percent of an institution's score.

In addition to the reputational survey, the magazine sent questionnaires seeking data on an institution's students, faculty and financial resources to 1,422 accredited four-year schools. The editors then used that data to measure what they described as "other attributes of academic quality." Each of these attributes (selectivity, faculty resources, financial resources, retention, alumni giving and value added) was assigned a percentage of the overall score as well. The institution with a score of 100 was ranked number one, and so on.
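The mechanics of such a weighted composite can be sketched in a few lines. In this illustration, the 25 percent reputational weight and the 5 percent value-added weight come from the article; the remaining weights and the sample schools are hypothetical, since U.S. News' exact figures are not given here.

```python
# Sketch of a weighted composite ranking of the kind the article describes.
# Only the "reputation" (25%) and "value_added" (5%) weights are from the
# article; the rest are hypothetical placeholders that sum to 1.0.
WEIGHTS = {
    "reputation": 0.25,          # stated in the article
    "selectivity": 0.15,         # hypothetical
    "faculty_resources": 0.20,   # hypothetical
    "financial_resources": 0.10, # hypothetical
    "retention": 0.20,           # hypothetical
    "alumni_giving": 0.05,       # hypothetical
    "value_added": 0.05,         # stated in the article
}

def composite_score(attributes):
    """Combine normalized attribute scores (each 0-100) into one number."""
    return sum(WEIGHTS[name] * score for name, score in attributes.items())

# Hypothetical sample data for two anonymous schools.
schools = {
    "School A": {"reputation": 98, "selectivity": 95, "faculty_resources": 90,
                 "financial_resources": 92, "retention": 96,
                 "alumni_giving": 70, "value_added": 60},
    "School B": {"reputation": 92, "selectivity": 97, "faculty_resources": 88,
                 "financial_resources": 85, "retention": 94,
                 "alumni_giving": 90, "value_added": 80},
}

# Sort by composite score, highest first; the top school is "number one."
ranked = sorted(schools, key=lambda s: composite_score(schools[s]), reverse=True)
```

The critics' point is visible even in this toy version: the ordering depends entirely on weights chosen by the editors, and small changes to those weights can reorder the schools.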

"Value added," a category introduced last year, was by far the most controversial. Asserting that "researchers have long sought ways to measure the educational value added by individual colleges," U.S. News editors devised this category based on the difference between a school's predicted graduation rate and its actual graduation rate. Value added accounted for 5 percent of an institution's score.

"Research shows that a student with a high SAT or ACT score is more likely to graduate; thus a school enrolling a freshman class with a high test score average has a better chance of seeing a large percent of students graduate," the editors explained in the September 1996 guide. Also included in this measure was the relationship between graduation rates and the amount a school spent on each student's education. "Taking all this into account, U.S. News then calculated which schools produced higher, or lower, than expected graduation rates and from this derived the 'value added' in the ranking tables," the magazine continued.

Casper characterizes the value added category as "wholly frivolous and nonsensical." In his letter to Fallows, he offered the California Institute of Technology as an example of an institution that is punished for offering a rigorous curriculum. "Caltech is crucified for having a predicted graduation rate of 99 percent and an actual graduation rate of 85 percent. Did it ever occur to the people who created this 'measure' that many students do not graduate from Caltech precisely because they find [the institution] too rigorous and demanding (that is, adding too much value) for them?" Casper wrote.
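The arithmetic behind Casper's complaint is simple: value added, as the article describes it, is just the gap between actual and predicted graduation rates. The prediction model below is a hypothetical stand-in, since U.S. News did not publish its exact formula; the Caltech figures are the ones Casper cites.

```python
# "Value added" as the article describes it: actual graduation rate minus
# predicted graduation rate. The linear prediction model is hypothetical;
# the article says only that test scores and per-student spending feed it.
def predicted_graduation_rate(avg_sat, spending_per_student):
    """Hypothetical model: higher test scores and spending predict a
    higher graduation rate, capped at 100 percent."""
    rate = 20 + 0.045 * avg_sat + 0.0004 * spending_per_student
    return min(rate, 100.0)

def value_added(actual_rate, predicted_rate):
    """Positive when a school graduates more students than predicted,
    negative when it graduates fewer."""
    return actual_rate - predicted_rate

# Caltech, per Casper's letter: predicted 99 percent, actual 85 percent.
caltech_gap = value_added(actual_rate=85.0, predicted_rate=99.0)
print(caltech_gap)  # prints -14.0: scored as a penalty, whatever the cause
```

As the sketch shows, the measure cannot distinguish a school that fails its students from one whose curriculum is simply hard, which is exactly Casper's objection.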

Complicating matters are questions about the veracity of the information used in these formulas. A Wall Street Journal article published in 1995 found numerous discrepancies between the information colleges provided to publications such as U.S. News and the information those same institutions provided to credit rating agencies. Providing false information to these agencies is a violation of the law. However, providing inaccurate data to a college guide is not. Many administrators admitted providing questionable information to college guides in an effort to ensure the highest possible ranking.

Even colleges with the most honorable intentions may have difficulty providing information consistent with that of comparable schools. "I think what U.S. News does creates an aura of certainty that is just not justified by the data it collects and presents," Casper says.

In 1995 the nation's leading college guide publishers, including U.S. News, Peterson's, the College Board and Wintergreen/Orchard House, began collaborating on a common set of questions. According to Al Sanoff, managing editor of the U.S. News college guide, some of those questions were included in questionnaires the magazine sent out for its fall 1997 guide. The next step, he says, is for university administrators to put their heads together. "Part of the problem is that perhaps higher education needs to come up with a standardized definition of how it collects data," Sanoff says.

In the preamble to its 1996 rankings, U.S. News editors wrote that they had tightened the magazine's software to help alert them to data that didn't jibe with information submitted in previous years. In addition, the data collected from institutions are now cross-checked against information schools submit to such agencies as Moody's Investors Service or the National Collegiate Athletic Association. Sanoff adds that each year the magazine takes pains to eliminate ambiguities by reworking its definitions and taking into account criticisms from university administrators and others.

However, Robert Bass, chairman of Stanford's Board of Trustees, says the magazine's penchant for tweaking its categories is purely a marketing ploy. "The fact that U.S. News changes the criteria on an annual basis simply highlights the folly of their ranking system," Bass says. "U.S. News makes a lot of money putting out this guide, and a guide that is the same as last year's is not going to sell. So U.S. News has a crass commercial incentive to invent changes and pseudo-measures of the colleges."

Sanoff calls that a "damned-if-you-do-damned-if-you-don't criticism. I think the argument that we're somehow purposefully playing with the numbers to hold people's interest is just not true. If we didn't make the changes, I guess people would say we are pigheaded and rigid."

If the rankings are irrelevant, as Bass describes them, why raise such a fuss? Thompson says institutions like Stanford are so concerned with the numerical rankings that they often set their priorities in the interest of improving them. He cited as an example the trustees' focus on fundraising after U.S. News ranked the university 77th in the "alumni giving" category in the magazine's 1993 issue. As Stanford's rate of alumni giving has improved (from 18 percent in 1993 to 31 percent in 1996), so has Stanford's U.S. News rank in that category. In the 1996 issue of the magazine, Stanford ranked 26th in alumni giving. Stanley O. Ikenberry of the American Council on Education in Washington says that though institutions play down the importance of rankings, colleges and universities sometimes use them to shape internal policy decisions.

"When institutions start focusing on the rankings themselves rather than focusing on their own mission and character, they have a negative impact," says Ikenberry, who adds that in a society given to rating everything from automobiles to dentists, college rankings have become an "unavoidable aspect" of our cultural landscape. But the rankings, "distorted criteria" to begin with, do a lot of damage to institutions and society, he says.

"It really is a gross over-simplification and a distortion for all of this great diversity of institutions to be homogenized into a particular set of rankings. The options available to students are much richer than the rankings would suggest," he says.

Mariama White-Hammond, a Boston native who will come to Stanford as a freshman in the fall, says she began paying attention to U.S. News' rankings when she was a high school junior but they played a minor role in her decision to come here. "[U.S. News editors] don't take into account what the difference is between a big school and a small school for the individual. They don't rank things like family atmosphere or support systems for students. Often they don't rank things like the arts programs or even how easy it is to schedule your day. Those were all things that I wanted to look into that weren't on their scale," she says.

White-Hammond adds, however, that many of her contemporaries did not look much deeper than an institution's numerical ranking. "A lot of people went for the Ivy-type name. They often didn't know much about their school, but just that it was considered a top school."

Sanoff argues that many of the magazine's critics underestimate the intelligence and abilities of prospective students and their parents in choosing a university and using the rankings. "Implicit in much of this debate is that somehow students and families are mindlessly choosing colleges on the basis of our rankings system," he says. "People are far more intelligent and sophisticated. They have the wit to use the rankings in tandem with other information and experience."

James Montoya, dean of admission and financial aid, says it is difficult to measure the actual impact of rankings on the university's admissions. Although Stanford dropped from number four in 1995 to number six last fall, U.S. News' rankings do not appear to have dampened the enthusiasm or interest of prospective students at the university. "We do know that our application numbers are up 2.6 percent this year, when our ranking dropped to number six, while [the applications at] Yale, Princeton, Harvard and Duke, ranked one through four in 1996, are down 2 to 10 percent," Montoya says.

Nevertheless, critics such as Casper and FUNC insist that U.S. News could better serve its readers by doing away with its numerical rankings and simply providing the data, listing institutions in alphabetical order. If the magazine must rank schools, Thompson says, it should do so by categories such as class size or per-student spending. Casper points out that information such as the number of faculty who are elected to national academies should be included in the U.S. News questionnaire.

Whether the magazine will accommodate any of these concerns or recommendations remains to be seen. Neither Casper nor FUNC has yet heard from the magazine's editors regarding their concerns.

Bass voices similar ambivalence about a boycott. "Most of the data are, in fact, available in the public domain one place or another," he says. "I think that a boycott by the major universities would make a statement, but it is unclear whether or not it would change the U.S. News college guide."

Thompson is convinced it would: "If we all act together, the rankings are going to crumble. We firmly believe we're right, and we firmly believe James Fallows knows we're right." ST