I was touched by the depth of feeling these messages contained and the enthusiasm for the College. This is a very good place, and I share the pride that students, alumni/ae, and parents feel when they think about Colby. It is nice to be included on "best of the best" lists. But it is important not to let those lists dazzle us.
Many of the ways Colby is rated by outside publications are impenetrably arbitrary. We don't know how we earned the new Ivy designation, for example, and if it turns out to be an annual event, we are just as likely to be left off next year as to be included. We know from the "best and worst" lists included in The Best 361 Colleges, published by The Princeton Review, that a college can be at the top of a superlative list one year and missing the next, yet the campus got no less attractive, the food no better or worse, the professors no less engaged in their teaching.
Even rankings that purport to be based on objective criteria, such as those in the annual U.S. News & World Report America's Best Colleges guide, are subject to manipulation, both by colleges and by editors who know that rankings must change from year to year in order to sell magazines.
Director of Institutional Research and Assessment Mark Freeman responds to many requests for information from guidebook publishers each year. He also files information about the College with a variety of clearinghouses, including the College Board, the Integrated Postsecondary Education Data System (an arm of the National Center for Education Statistics), and the NCAA. Most guidebooks combine statistical portraits of the College (the size of the student body, mean SAT scores, comprehensive fee, faculty-student ratios, etc.) with deeply subjective material that is compiled by editors who never set foot on our campus.
Hence, Colby is said to be located in an "urban" setting while, on the same page of the same guide, Thomas College is "rural." Colby overlooks "hundreds of acres" of pristine Maine woods (I can see the lights in the Home Depot and Wal-Mart parking lots from my office window). The "average" Colby student is "from within 20 minutes of Boston" or from one of more than 40 states and 60 foreign countries, wears J. Crew and preppy fleece or is a bearded, beaded hippie, is a liberal or a conservative, is happy, is unhappy. . . . You get the idea. There are many college guides for niche-market students that celebrate our commitment to diversity and one that attacks us for being "addicted to the shibboleths of multiculturalism" and for "sacrificing traditional education to feed [our] flesh-eating sacred cows." I'm not sure what any prospective student is supposed to make of that.
The 800-pound gorilla, of course, is U.S. News, part of whose assessment of colleges and universities is based on statistics gathered from the clearinghouses I mentioned above and from an elaborate survey that institutions fill out and submit to the editors. Twenty-five percent of a college or university's score in the U.S. News rankings is based on a survey that is distributed to presidents, deans of faculty, and deans of admissions and asks them, essentially, "Which colleges are the best in the country?" A variety of other measures influence a college's final score, including SAT scores, graduation and retention of students, acceptance rates, class size, and student-faculty ratio.
Virtually all of the indicators in the U.S. News methodology are resource-driven, and colleges that both have more money and spend more money tend to capture the top places on the list. Economies such as those Colby has practiced for years (keeping our staff very small relative to peers', for example) are penalized. The editors have consistently refused to compare apples to apples on such measures as the cost of living in central Maine vs. western Massachusetts or suburban Los Angeles and other regional differences.
Schools that, like Colby, continue to require the SAT also have noted for years that comparing their scores to those from schools that are SAT-optional is another oranges-to-apples situation that distorts the U.S. News rankings and serves readers poorly.
Some colleges decline to participate actively in the rankings. As a group, the NESCAC schools decided a few years ago not to cooperate with an effort to rate athletics programs, and the effort was stillborn. But, tempted as I am sometimes to opt out, I see at least two reasons not to. Doing so alone might prove confusing to prospective students and therefore disadvantageous to Colby. And a protest by one institution, even one as good as Colby, would not be likely to alter in any way the behavior of U.S. News.
But we can and should keep the rankings in perspective by recalling several things. First and foremost, Colby's own market survey research, as well as independent national studies, finds consistently that rankings and ratings lie pretty far down the list of things that influence the college application and decision-making process. Academic quality and reputation, the campus visit, and the breadth and excellence of the educational program are far more important. As it turns out, prospective students are considerably more savvy as consumers than we sometimes imagine, and we must listen to them when they tell us what drives their decision making.
Second, college guides and rankings are essentially commercial enterprises, at least as concerned with their own bottom lines as with deepening public understanding of higher education and the college admission process. In the absence of better means of comparing institutions, these enterprises are perhaps inevitable. But we should not dignify them by acting as though they have the public interest solely in mind.
I understand the challenges that college-bound students, especially high-achieving students, and their families face in trying to narrow their choices. The students are inundated with mail from colleges and universities throughout their high school careers. They have told us in focus groups that any serious effort to wade through the material would take all of their waking hours. Ratings and rankings seem to provide focus and clarity in the midst of this flood of information, but they can be highly misleading and superficial.
So what is to be done? College administrations and governing boards need to stay focused on the true metrics of quality and on understanding and evaluating both ends of the educational enterprise: what we provide students by way of excellent faculty and facilities and programs, and what we and the students can say and demonstrate about the value of the enterprise when it is over.
We need to stop reinforcing the grip of commercial ratings and rankings systems by acting in ways that appear to validate them. This is hard to do in the very competitive world of college admissions, but it is an important measure of our dedication and seriousness. A great institution, secure in itself and energized by the work that goes on within its walls, doesn't have to live and die by the one-liner.