Second Opinion

The trouble with university rankings (1)

05:13 AM June 16, 2023

If university rankings accurately measure the value and worth of academic institutions, then the criteria for those rankings are insightful and revelatory both of what people are looking for in universities and of what universities are looking for in themselves; in other words, their raison d’être.

QS Quacquarelli Symonds reports using the following criteria for the 2023 edition of its World University Rankings. In that edition, the top five universities are Massachusetts Institute of Technology (MIT), Cambridge, Stanford, Oxford, and Harvard, while the Philippine universities that made it to the list are the University of the Philippines (UP) (412), Ateneo (651-700), and La Salle and the University of Santo Tomas (UST) (both 801-1,000).

  • Academic reputation – 40 percent
  • Employer reputation – 10 percent
  • Faculty-student ratio – 20 percent
  • Citations per faculty – 20 percent
  • International faculty ratio – 5 percent
  • International student ratio – 5 percent

For its part, Times Higher Education has the following criteria for its own 2023 edition, which had the same top five universities, only in a different order: Oxford, Harvard, Cambridge, Stanford, and MIT. The four local universities that received a ranking were Ateneo (351-400), UP (801-1,000), La Salle (1,201-1,500), and Mapúa (1,501+).

  • Teaching (the learning environment): 30 percent
  • Research (volume, income, and reputation): 30 percent
  • Citations (research influence): 30 percent
  • International outlook (staff, students, research): 7.5 percent
  • Industry income (knowledge transfer): 2.5 percent

What is clear from all of these metrics is that research is by far the dominant measure of a university. For QS, for instance, “academic reputation” is about how a particular university is regarded by “global academics” in terms of “world-class research,” which is defined as “not only … the quality of the research, but the strength of the university in communicating that research, and the strength of the impact the research makes across the world.” Together with “citations per faculty,” research is directly involved in 60 percent of the criteria.


For Times Higher Education, research likewise accounts for 60 percent if you add “research” and “citations,” with both “international outlook” and “industry income” also indirectly related to it.

On paper, this looks great; who wouldn’t want to have—or to enroll in—a university that produces “world-class research” and “world-class researchers”?

The problem with these rankings, however, is that they lionize research over the other equally important roles of universities, including teaching and mentoring, as well as their vital functions in civil society and nation-building. A local university will get points for partnering with a European university (international outlook, check!), but it will not when it reaches out to state universities that stand to benefit the most from its expertise. It will get points for tailoring courses for foreign students (international student ratio, check!), but not for designing courses for barangay health workers, forest guards, paralegal volunteers, and so on.

Just as importantly, those rankings valorize certain kinds of research in certain kinds of publications over other scholarly output. Articles that get published in international journals, particularly those that are “ISI/Scopus-indexed,” are celebrated; the higher the “impact factor,” the better. Well and good. But as my colleague Jayeel Cornelio, Ateneo’s associate dean for research, recently pointed out, many of these top journals charge exorbitant open access fees or publication charges, making this kind of “world-class research” prohibitively expensive for local scholars. And even if they manage to publish, many Filipinos won’t have access to their work because readers, too, have to pay.

It gets more problematic when you consider what’s left out. Columns like this, for instance, are not counted, not even those from distinguished scholars like Randy David, Cielito Habito, and Michael Tan; nor are books by Resil Mojares, Caroline Hau, Ambeth Ocampo—just to name a few. Neither are clinical practice guidelines written for Filipino doctors by Filipino doctors. And what of documentaries, pamphlets, and other material written for national and local audiences? They, too, will be counted far less, if at all, in those university rankings.

If academia were unaffected by the rankings, there would be no problem. The trouble is when universities, scholars, and students realign their priorities and preferences to conform to them. In my next column, I will sketch some of the impacts of these rankings on universities as well as on their faculty and students, drawing from my insights as a Philippine-based academic as well as from the mounting global criticism of those rankings.


—————-

[email protected]


TAGS: Second Opinion, university rankings

