The fetish for global rankings
In their heart of hearts, Filipinos actually rank universities according to the prowess of their basketball teams, and not just the teams’ performance but with special attention to the design of the teams’ uniforms. Sadly, that doesn’t even include all the other wonderful sports teams, but just the one sport that corners the bulk of every school’s athletic budget.
That is why, in a sense, the global ranking of universities is the serious Filipino’s revenge against the barbarians. It is his way of saying to his heathen brethren that there’s actually another way of looking at universities. That in schools, there’s something else going on and it’s called learning. It’s done by people called teachers, who work together with typically younger people called students. Global rankings are the last desperate gasp of the dying Filipino thinker before he drowns in the sea of ignorance.
And that is why the reading and writing Filipino public understandably panics each time we rank low in these ratings. We realize how little we have invested in the next generation of Filipinos. But I would hold back on the agonizing and instead count our strengths. It’s actually amazing how our universities have thrived despite our ever-precarious economy and perennially corrupted politics.
The most recent 2011-2012 survey by the Quacquarelli Symonds (QS) group is just one of the many university ranking systems administered in the world. Each ranking system scores the schools on a different set of criteria.
For example, the QS criteria rely on the following: peer review (40 percent), review by graduate recruiters (10 percent), number of citations per staff member (20 percent), staff-to-student ratio (20 percent), percentage of overseas staff (5 percent), percentage of overseas students (5 percent).
The first two criteria are actually very subjective. I am not saying that subjectivity itself makes them unreliable, but let us not read too much into “peer review” and “review by graduate recruiters.” The first refers to how fellow academics regard the standing of local schools. Given the large pool of comparators spread all over the world, the best way for a school to project its academic presence is through international publications, which can actually be measured by the third criterion—“number of citations per staff member”—which is based on publications by the school’s faculty. Yet peer review, which is highly subjective, gets a heavy weight of 40 percent, while cited publications, which are less so, get half that weight at 20 percent.
Even worse is “review by graduate recruiters,” which refers to the impression held by non-academics in the employment market. It is based on the marketable skills taught by the schools and desired by employers. The irony is that the loftier the academic mission of the school, the less marketable its skills training. Philosophy, the sciences and the humanities are the indispensable core of a university. (Princeton University doesn’t have a law school. Does that make it any less a university?) Hereabouts, where do you think job recruiters rush to? To the philosophy department or to business administration?
I still recall a decade ago when Filipinos went into a fit after the now defunct Asiaweek magazine’s survey of Asian universities gave local universities low rankings. Yet looking at their criteria, they included the size of the school’s endowment and the salaries of its faculty. Such criteria obviously disfavored Philippine schools which funded themselves on the basis of either annual government handouts or student tuition. Moreover, faculty salaries are low due to constraints, both economic (not enough money) and cultural (not enough appreciation for teachers).
There’s actually another test, cited even by the New York Times, that, had it been applied to the Philippines, would have given us more credit. The Shanghai Jiao Tong University (SJTU) Ranking System was developed on the principle that a great university is a great research university. It relies on academic awards and honors received by faculty, and citations of faculty writing in prestigious publications.
Contrast that to the QS study, which requires data about the number of students, faculty, etc. It is unclear how the QS study acquired its data. Just to take one item, some professors teach at various universities in Manila. To which school did QS attribute their publications? On the other hand, the SJTU ranking relies on faculty productivity that is available online from independent sources.
The chancellor of UP Diliman is Caesar Saloma, himself an internationally published physicist. He has closely monitored these university ranking surveys, including the QS study, to keep Diliman on its toes and improve as an academic institution. Diliman, as the flagship campus of the national university, is duty-bound to support fields that are otherwise left unstudied, a case of the Filipino nation living an “unexamined life.” For instance, how would the surveys appreciate the fact that UP has a pioneering institute of archaeology? And even more recently, a graduate program in medical anthropology?
At the same time, the challenge to the Commission on Higher Education is to keep watch over the proliferation of schools calling themselves “universities.” There are several push factors. One, the politicians, who wish to spiffy up the names of their local colleges. Two, the bureaucracy, which gives incentives for mere teaching colleges to call themselves universities. And three, the burgeoning market of OFW-funded students willing and able to enroll, a case of supply rising up to meet demand. The irony is that power and markets are the antithesis of academia, and it shows how little we appreciate the life of the mind.
* * *
Comments to email@example.com