This project employs bibliometric methods to analyze and rank the scientific paper performance of the top 500 universities in the world. The selection of universities for inclusion in this project was based on information obtained from the Essential Science Indicators (ESI). Of the more than 4,000 research institutions listed in ESI, this project first selected the top 700 institutions by number of published journal articles and number of citations. Non-university institutions were then removed from the list, and the project staff supplemented the remaining universities by comparing them with those included in other ranking programs such as THES, ARWU, and US News. This process resulted in a pool of 725 universities for this project. Data used to assess the performance of the universities were drawn from ISI's ESI, from Web of Science (WOS), which includes SCI and SSCI, and from Journal Citation Reports (JCR).
The concept of authority control was employed to retrieve data indexed under different forms of a university's name in the aforementioned databases, i.e., the official name, abbreviations, and other possible forms of the name. This project also accounted for mergers and splits of universities (or of different campuses within a university system) and included publications by a university's affiliated institutions, such as research centers and university hospitals. This effort ensured the accuracy of each university's number of published journal articles and of the subsequent citation statistics.
The 2009 performance measures are composed of eight indicators. Together, the indicators represent three criteria of scientific paper performance: research productivity, research impact, and research excellence. Table 1 lists the indicators and their respective weightings.
Table 1: The Criteria, Indicators, and Their Respective Weightings Used for the Overall Performance-Based Ranking

2009 Overall Performance Indicators

Research productivity
- Number of articles of the last 11 years (1998-2008)
- Number of articles of the current year (2008)

Research impact
- Number of citations of the last 11 years (1998-2008)
- Number of citations of the last 2 years (2007-2008)
- Average number of citations of the last 11 years (1998-2008)

Research excellence
- H-index of the last 2 years (2007-2008)
- Number of Highly Cited Papers (1998-2008)
- Number of articles of the current year in high-impact journals (2008)
The number of articles published in peer-reviewed academic journals is frequently used to indicate the productivity of a research institution. To objectively represent a university's ongoing and current research productivity, this project employs two indicators: the number of articles in the last eleven years (1998-2008) and the number of articles in the current year (2008).
“Number of articles in the last 11 years” draws data from ESI, which includes 1998-2008 statistics of articles published in journals indexed by SCI and SSCI. “Number of articles in the current year” relies on the 2008 data obtained from SCI and SSCI, which were searched between January 1 and January 31, 2009.
The number of citations of a particular academic article within a specific time frame is a commonly accepted indicator of that article's impact. This project considers both the long-term and short-term impact of research and seeks to provide a fairer representation of a university's research impact regardless of its size and number of faculty. Thus, this project measures research impact by three indicators: the number of citations in the last eleven years, the number of citations in the last two years, and the average number of citations in the last eleven years.
“Number of citations in the last 11 years” draws 1998-2008 citation statistics from ESI. “Number of citations in the last 2 years” draws 2007-2008 citation statistics from SCI and SSCI at WOS, which include citation statistics updated to the dates of retrieval. “Average number of citations in the last 11 years” is the number of citations in the last eleven years divided by the number of articles in the last eleven years.
This project assesses each university's research excellence by the following indicators: the h-index of the last two years, the number of Highly Cited Papers from ESI, and the number of articles in high-impact journals in the current year (Hi-Impact journal articles). “H-index of the last 2 years” measures both the quantity and quality of a university's research, using the 2007-2008 data from SCI and SSCI. A university has index h if h of its Np papers in the last two years have at least h citations each and the other (Np − h) papers have no more than h citations each.
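The h-index definition above can be computed directly from a list of per-paper citation counts. The sketch below is illustrative only; the citation counts are hypothetical, not data from this project.

```python
def h_index(citation_counts):
    """Compute the h-index: the largest h such that h papers
    have at least h citations each."""
    # Sort citation counts in descending order, then find the last
    # position where the count is still >= its 1-based rank.
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical example: five papers with these 2007-2008 citation counts.
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

Sorting first makes the scan a single pass: the index equals the deepest rank at which a paper still has at least as many citations as its rank.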
“Number of Highly Cited Papers” utilizes data from ESI, which include statistics on “Highly Cited Papers” from 1998 to 2008. ESI defines Highly Cited Papers as SCI/SSCI-indexed papers that are cited the most (in the top 1% of the total papers indexed in the same year) within the last 11 years.
“Number of articles of the current year in high-impact journals” employs data from JCR, which supplies the impact factor of each journal in its subject field. The impact factor of a journal for a given year is the number of citations that year to papers published in the journal during the previous two years, divided by the number of papers the journal published in those two years. A higher impact factor means a journal's articles are more frequently cited by other journals, suggesting higher scholarly value. This project defines high-impact journals as journals whose impact factors rank in the top 5% of all journals within a specific subject category. With high-impact journal lists derived from JCR, this project is able to count each university's articles published in high-impact journals by subject.
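The impact-factor formula described above is a simple ratio; the sketch below illustrates it with hypothetical numbers (the journal and its counts are invented for illustration, not taken from JCR).

```python
def impact_factor(citations_to_prev_two_years, papers_prev_two_years):
    """JCR-style impact factor for a given year: citations that year to
    papers the journal published in the previous two years, divided by
    the number of papers it published in those two years."""
    return citations_to_prev_two_years / papers_prev_two_years

# Hypothetical journal: 600 citations in 2008 to its 2006-2007 papers,
# of which there were 200.
print(impact_factor(600, 200))  # → 3.0
```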
The procedures for data processing are as follows. First, the project staff conducted authority control on the various forms of a university's name and inspected all the SCI/SSCI bibliographic records in which the address field contained one of those forms. An accurate count of the total articles from a university was obtained after removing duplicate records retrieved under different forms of that university's name. Second, using SCI/SSCI, this project obtained the total number of citations by summing the citations of each article from that university, from the article's inclusion in SCI/SSCI to the date of our retrieval.
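The authority-control and de-duplication step can be sketched as follows. This is a minimal illustration under assumed data structures: the name-variant list, the record IDs, and the `count_unique_articles` helper are hypothetical, not the project's actual tooling.

```python
# Map every known name variant to one canonical university name
# (hypothetical variant list for illustration).
NAME_VARIANTS = {
    "univ wisconsin": "University of Wisconsin-Madison",
    "univ wisconsin madison": "University of Wisconsin-Madison",
    "uw madison": "University of Wisconsin-Madison",
}

def count_unique_articles(records):
    """records: iterable of (record_id, address_field) tuples.
    Records retrieved under different name forms collapse on record ID."""
    seen = set()
    for record_id, address in records:
        if address.lower() in NAME_VARIANTS:
            seen.add(record_id)
    return len(seen)

# Hypothetical records: the first two are the same article retrieved
# under two different address forms.
records = [
    ("WOS:001", "Univ Wisconsin"),
    ("WOS:001", "Univ Wisconsin Madison"),
    ("WOS:002", "UW Madison"),
]
print(count_unique_articles(records))  # → 2
```

Keying on a unique record identifier is what makes the duplicate removal reliable regardless of how many name forms matched.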
Some university systems have several campuses, and a few campuses within a particular system are commonly perceived as individual institutions even though they are indexed in ESI only under the university system's name. For example, University of Wisconsin-Madison and University of Wisconsin-Milwaukee are not differentiated in ESI (both are indexed under “Univ Wisconsin”), although they are often perceived as two separate universities. This project corrected this limitation by manually searching SCI/SSCI to identify the actual number of articles, and citations of those articles, produced by each individual campus. Likewise, this project employed the same manual searching procedures to ensure that the measurement of Highly Cited Papers fairly represented the research performance of each individual campus.
Score Calculation and Sorting
Based on the measurement procedures, this project calculated a university's score for each of the eight indicators. For each indicator, the university with the highest number received the maximum score (100); every other university's number was divided by the highest number and scaled proportionally to obtain its score. For example, if University A had the highest number M for Indicator X, it received 100 for that indicator, while University B with a number of N received (N/M × 100). Finally, the project calculated each university's final score by applying the indicator weightings presented in Table 1 and sorted the universities by their final scores. Universities with the same score were sorted alphabetically. It should be noted that many universities obtained similar scores, and slight differences in final scores must be interpreted carefully: a marginally higher score does not necessarily indicate superiority in scientific research, since the two universities may be in very close proximity in the ranking.