Rating university quality
The release of the 8th Quacquarelli-Symonds Annual World University Rankings (QS WUR) earlier this week earned front-page coverage in a major daily. The headline announced that no Philippine university made it to the top 300 higher education institutions in the world.
Probably even more distressing to their alumni, the Philippine universities on the list all lost ground compared to their ranking last year. UP dropped from 314th to 332nd; Ateneo from 307th to 360th; La Salle from the 451st–500th bracket to the 551st–600th bracket; UST slipped from the top 600 list to the top 700 list.
QS WUR claims to be the most widely followed system comparing international higher education institutions. Its audience includes academic leaders evaluating universities with which to partner, employers concerned with the competencies of potential recruits, and parents deciding on schools for their children.
Half of the scores come from surveys of academic leaders and employers, with this year’s rankings reflecting the views of over 33,000 academics in 141 countries and over 16,000 employers from 130 countries. The other half considers objective data such as faculty strength, student admission standards, and research output.
Critics of Philippine higher education seized upon the QS ranking results as proof of the government’s inadequate support for the system. Policymakers and the public should know about these efforts to benchmark Philippine universities against the best in the world. We can learn lessons from the schools that make it to the various lists and the factors they focus on to improve quality.
Resources, as critics maintained, do make a difference. Cambridge dislodged Harvard from the top slot last year and retained it this year. Staff reduction at Harvard led to a higher student-faculty ratio and a lower score. A drop in staffing levels similarly caused the University of California, Berkeley to fall out of the top 20.
So does a strategy for using available resources effectively. The German government’s Excellence Initiative channeled additional funds to a few universities and succeeded in moving them up the rankings. China had earlier implemented a similar strategy: investing in developing selected flagship institutions into comprehensive research universities that could compete with the elite global universities.
Resource constraints compel the government to choose among equally important priorities. With the mass demand for higher education credentials, our politicians chose what was politically expedient. They could have supported the private school system, enabling it to admit more students, and freed more funds to strengthen existing private schools. They decided instead to use public money to open more state universities whose quality they could not sustain. Apart from UP Diliman, no other state university made it to the QS list.
Rankings can also mislead and must be seen in proper context. QS WUR acclaims the top 500 universities from among 17,000 higher education institutions around the world. It focuses on a select circle of comparable and competitive World-Class Universities and clarifies what must be done by those aspiring to join this league. Institutions aiming to serve a national or regional market need not submit to the QS criteria.
In the QS system, for instance, a university wins points for raising international student enrollment. The top 50 and the top 100 universities increased overall enrollment by less than 1 percent last year. But the top 100 universities increased the number of foreign students by 3.8 percent, and the top 50 universities by 6.4 percent.
How many of the country’s nearly 2,000 higher education institutions really want to compete on this front? If they did, the quality of their academic programs would only be one factor among many that potential foreign students would consider. Like foreign tourists, they would also look into the country’s attractiveness, its infrastructure, and its ability to ensure their security.
Society expects the university to meet multiple objectives: ensure the transmission of national culture and values; provide the opportunity for students to develop their different talents; support the country’s need for economic development; promote civic virtues and political cohesion; and, increasingly, prepare graduates for gainful employment. Growing global linkages complicate and extend these objectives.
Ranking agencies cannot determine what the mission or the priorities of the university should be, and they do not claim this right. Neither can they impose a single, immutable quality standard for all universities. Given different objectives, the universities themselves must define quality in terms of fitness for purpose and evaluate their performance according to the degree that they achieve their selected goals.
Despite the reservations about the reliability or the relevance of their results, the ranking agencies serve a purpose. Their value will increase as they develop the benchmarks and metrics for evaluating different kinds of universities. We do need some external body to keep our institutions honest.
Edilberto C. de Jesus is president of the Asian Institute of Management.