WE WISH to clarify certain issues raised in Dr. Flor Lacanilao’s “Democratic governance impedes academic reform” (Inquirer, 03/14/2011). Lacanilao forcefully argued for using the number of publications by UP faculty members in ISI journals as the single most “objective” measure of disciplinal competence and eligibility for academic leadership.
The Institute for Scientific Information (ISI), founded in 1960, publishes indices of what it identifies as the journals with the highest citation rates. From the very beginning, misgivings have been expressed about the implications of publishing mainly or exclusively in so-called ISI journals, owing to the negative effects this may have on efforts to develop refereed or peer-reviewed publications at the national or regional level. (See W. Gibbs, “Lost Science in the Third World,” Scientific American, August 1995.) It has also been observed that the ISI’s emphasis on journal publication fails to account for the greater importance of book publications in the social sciences and humanities. Furthermore, English-language journals from the USA and UK dominate the lists, while other important languages of scientific research are not adequately represented. Despite these reservations, it can be granted that in some fields, particularly in the natural sciences, significant weight may be given to publication in such journals. However, more harm than good would probably result if a uniform weight for ISI publications were imposed indiscriminately on all disciplines in the natural sciences, social sciences and humanities, without any consideration of the context, needs and internal dynamism of the disciplines in question. It would be best if the weights given to publications in ISI journals were arrived at consensually by the practitioners in the relevant disciplines themselves.
In the meantime, the rising skepticism toward the simple measure of ISI publications is a sobering development. The largest scientific research organization in Europe, the Deutsche Forschungsgemeinschaft (German Research Foundation), issued new guidelines last year emphasizing that the most important factor in evaluating applicants for grants is the individual quality of their research work, not the number of their (ISI) publications. The European Association of Science Editors (EASE) has also recently issued guidelines cautioning against the abuse of such measures, and has rejected the notion that the quality or impact of an article can be directly inferred from the journal in which it appears. Though publication in ISI journals may reflect a certain common denominator of “quality,” whether such articles are actually of “superior” quality is another question entirely. The Russian mathematician Grigori Perelman, for example, chose to publish his proof of the Poincaré conjecture in 2002 and 2003 on the open-access, non-refereed repository arXiv. The most reliable way of determining the quality of scientific work therefore remains the old-fashioned method of reading it. It might thus be more judicious to take a soft approach in weighing ISI journal publications in relation to issues like tenure, promotion, renewal and hiring. The weight given to them might vary among disciplines and even within disciplines, but there should always be a calibrating mechanism in place. A cautious and critical attitude toward the ISI measures therefore has a valid basis, and is not tantamount to “coddling mediocrity” or “incompetence,” as Lacanilao seems to believe. Neither is it a mere defense of the “old ways” of doing things or an embrace of “blind nationalism.”
Lacanilao also proposes in his paper to make the number of ISI publications the supreme determinant in the “selection” of academic leadership. One could imagine a database being kept in which the number of ISI publications of each faculty member would be recorded. Leaders at all levels of University administration could then periodically be drawn, automatically, from among the most highly published individuals. Lacanilao adds that this leadership would achieve better results by outrightly excluding the “lesser” or “poorly” published members of the academe from any consultative process and by brooking no opposition from them. But how is this to be achieved in practice? What changes would have to be made to the UP Faculty Manual and to the UP Charter to attain this state of affairs? What could bring about the absolute hegemony of the ISI paradigm and the willing self-subordination of the majority of UP academics to the ISI elite? How exactly are these “reforms,” aimed at divesting the majority of their democratic rights, to take place?
The UP has always prided itself on academic freedom, on its rich traditions of critical dissent, and on possessing a culture in which arrogant assertions of inherent privilege and hierarchy, including outright feudal oppression of members of the community, are generally frowned upon. Many generations of UP academics have worked hard to disentangle the University from its colonial origins by advocating and developing forms of democratic governance. Moreover, the first impulses of modern science can be shown to have been inextricably bound up with the ideals of human equality and the birth of mass democracy. What, then, could explain this strange phenomenon in which some scientists like Lacanilao have come to reject the values of democratic governance? Whence comes his intemperate belief in elevating what is ultimately a rough and simple research tool such as the ISI index to the status of final arbiter of the life of the mind?
Ramon Guillermo is associate professor at the Department of Filipino and Philippine Literature, University of the Philippines-Diliman. He received his Ph.D. in Southeast Asian Studies from the University of Hamburg, Germany. Email: rgguillermo@up.edu.ph.