Academic Ranking of World Universities
The Academic Ranking of World Universities (ARWU), compiled by Shanghai Jiao Tong University and now maintained by the Shanghai Ranking Consultancy, has provided annual global rankings of universities since 2003, making it the earliest of its kind. Its initial purpose was to measure the gap between Chinese and "world class" universities. ARWU rankings have been cited by The Economist magazine,[1] and the ranking has been lauded for being "consistent and transparent".[2] The education ministers of France, Norway and Denmark have traveled to Shanghai Jiao Tong University to discuss ways of improving their universities' rankings.[3]
Methodology
ARWU does not rely on surveys or school submissions. Among other criteria, ARWU counts the number of articles published in Nature or Science and the number of Nobel Prize winners and Fields Medalists (mathematics).[4]
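As a rough illustration of how such count-based indicators can be turned into a single score, each raw count can be scaled against the best-performing institution and combined with fixed weights. The indicator names and weights below are assumptions for illustration only, not ARWU's published methodology:

WEIGHTS = {"nature_science_articles": 0.4,   # assumed weight, for illustration
           "nobel_fields_laureates": 0.6}    # assumed weight, for illustration

def scaled_score(raw, best):
    """Scale a raw count so that the best-performing institution gets 100."""
    return 100.0 * raw / best if best else 0.0

def composite(indicators, best_by_indicator):
    """Weighted sum of scaled indicator scores for one institution."""
    return sum(WEIGHTS[k] * scaled_score(v, best_by_indicator[k])
               for k, v in indicators.items())

best = {"nature_science_articles": 120, "nobel_fields_laureates": 30}
print(round(composite({"nature_science_articles": 80,
                       "nobel_fields_laureates": 12}, best), 1))  # -> 50.7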
One of the primary criticisms of ARWU's methodology is that it is biased towards the natural sciences and English-language science journals over other subjects.[4] In addition, a 2007 paper in the peer-reviewed journal Scientometrics suggested that the results could not be independently reproduced.[5]
G-factor
G-factor ranks university and college web presence by counting only the links from other university websites, using Google search engine data. G-factor is an indicator of the popularity or importance of each university's website from the combined perspectives of other institutions. It claims to be an objective peer review of a university through its website: in social network theory terminology, G-factor measures the centrality of each university's website in the network of university websites.[6]
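The counting behind this idea is straightforward. A minimal sketch, with invented link data standing in for Google search results, computes the in-degree centrality of each site while excluding self-links:

inbound_links = {
    # target site: set of university sites linking to it (invented data)
    "uni-a.edu": {"uni-b.edu", "uni-c.edu", "uni-d.edu"},
    "uni-b.edu": {"uni-a.edu", "uni-b.edu"},   # contains a self-link, excluded below
    "uni-c.edu": {"uni-a.edu"},
}

def g_factor_scores(links):
    """In-degree centrality: count links from *other* universities only."""
    return {site: len(sources - {site}) for site, sources in links.items()}

for site, score in sorted(g_factor_scores(inbound_links).items(),
                          key=lambda kv: kv[1], reverse=True):
    print(site, score)   # uni-a.edu 3, uni-b.edu 1, uni-c.edu 1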
Global University Ranking
Global University Ranking measures over 400 universities and is compiled by RatER, an autonomous, non-commercial Russian rating agency supported by Russia's academic society.[7][8] The methodology pools universities from the ARWU, HEEACT, Times-QS and Webometrics rankings; a pool of experts formed by project officials and managers determines the rating scales for indicators in seven areas: academic performance, research performance, faculty expertise, resource availability, socially significant activities of graduates, international activities, and international opinion. Each expert independently evaluates these performance indicators for candidate universities, and the rating is the average of the expert evaluations.[9] This ranking raised questions when it placed Moscow State University in fifth place, ahead of Harvard and Cambridge.[10]
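The final averaging step can be sketched as follows; the area names and expert scores below are invented placeholders for illustration:

AREAS = ["academic", "research", "faculty", "resources",
         "graduates", "international_activity", "international_opinion"]

expert_scores = [   # one dict per expert, for a single candidate university
    {"academic": 8.5, "research": 9.0, "faculty": 8.0, "resources": 7.5,
     "graduates": 8.0, "international_activity": 7.0,
     "international_opinion": 8.5},
    {"academic": 9.0, "research": 8.5, "faculty": 8.5, "resources": 8.0,
     "graduates": 7.5, "international_activity": 7.5,
     "international_opinion": 8.0},
]

def university_rating(scores):
    """Average each expert's mean score, then average across the experts."""
    per_expert = [sum(s[a] for a in AREAS) / len(AREAS) for s in scores]
    return sum(per_expert) / len(per_expert)

print(round(university_rating(expert_scores), 2))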
HEEACT Ranking of Scientific Papers
The Performance Ranking of Scientific Papers for World Universities is produced by the Higher Education Evaluation and Accreditation Council of Taiwan (HEEACT).[11] The ranking is designed to assess research universities, with indicators that measure both long-term and short-term research performance.
The project employs bibliometrics to analyze and rank the performance of the top 500 universities overall and the top 300 universities in each of six fields. HEEACT also provides rankings of the top 300 universities across ten science and technology subjects.[12] The ranking includes eight indicators: articles published over the prior 11 years, citations of those articles, "current" articles, current citations, average citations, H-index, number of "highly cited papers", and articles in high-impact journals. Together they represent three criteria of scientific paper performance: research productivity, research impact, and research excellence.
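Of the eight indicators, the H-index is the most algorithmic; a minimal sketch (with invented citation counts) shows how it is computed:

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([25, 8, 5, 3, 3, 1]))   # -> 3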
The 2007 ranking methodology was alleged to favor universities with medical schools, and in response HEEACT added assessment criteria.[13] The six field-based rankings follow the Web of Science (WOS) subject categorization: Agriculture & Environment Sciences (AGE), Clinical Medicine (MED), Engineering, Computing & Technology (ENG), Life Sciences (LIFE), Natural Sciences (SCI) and Social Sciences (SOC). The ten subjects are Physics, Chemistry, Mathematics, Geosciences, Electrical Engineering, Computer Science, Mechanical Engineering, Chemical Engineering (including Energy & Fuels), Materials Sciences, and Civil Engineering (including Environmental Engineering).[12]
High Impact Universities: Research Performance Index
The High Impact Universities Research Performance Index (RPI) is a 2010 Australian initiative[14] that studies university research performance. The pilot project involved a trial of over 1,000 universities or institutions and 5,000 constituent faculties (in various disciplines) worldwide; the top 500 results for universities and faculties were reported on the project website.[14] The project promotes simplicity, transparency and fairness. The assessment analyzes research performance as measured by publications and citations, with publication and citation data drawn from Scopus. The project uses standard bibliometric indicators, namely the 10-year g-index and h-index. RPI weights contributions from the five faculties equally: the five faculty scores are normalized to place them on a common scale, and the normalized scores are then averaged to arrive at the final RPI.
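A sketch of this recipe follows, with a g-index as the per-faculty score. The normalization rule (dividing by the top score within each faculty) is an assumption, since the source does not spell it out, and the data is invented:

def g_index(citations):
    """Largest g such that the g most-cited papers have >= g*g citations in total."""
    total, g = 0, 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        total += c
        if total >= i * i:
            g = i
    return g

def rpi(faculty_scores):
    """faculty_scores: {faculty: {university: raw score}} -> {university: RPI}.
    Assumed normalization: divide by the top score within each faculty."""
    universities = next(iter(faculty_scores.values()))
    normed = {fac: {u: s / max(scores.values()) for u, s in scores.items()}
              for fac, scores in faculty_scores.items()}
    return {u: sum(normed[f][u] for f in faculty_scores) / len(faculty_scores)
            for u in universities}

scores = {"science":  {"Uni A": 40, "Uni B": 25},   # invented g-index scores
          "medicine": {"Uni A": 18, "Uni B": 30}}   # (the real RPI uses five faculties)
print(rpi(scores))   # Uni A: (1.0 + 0.6)/2 = 0.8, Uni B: (0.625 + 1.0)/2 = 0.8125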
Human Resources & Labor Review
The Human Resources & Labor Review (HRLR) publishes a human competitiveness index and analysis annually through the Chasecareer Network (ChaseCareer.Net). The system is based on the Human Resources & Labor Review Indexes (HRI and LRI), which measure the performance of the top 300 universities' graduates.[15]
Newsweek
In August 2006, the American magazine Newsweek published a ranking of the Top 100 Global Universities, using selected criteria from ARWU and the Times Higher Education-QS rankings, with the additional criterion of the number of volumes in the library. It considered openness and diversity as well as distinction in research. The ranking formed part of a special issue that included an article by Tony Blair, then prime minister of the UK, but it has not been repeated.[16]
SCImago Institutions Rankings
The SCImago Institutions Rankings (SIR)[17] has, since 2009, published its international ranking of worldwide research institutions, the SIR World Report.[18] The SIR World Report is the work of the SCImago Research Group,[19] a Spain-based research organization consisting of members from the Spanish National Research Council (CSIC), the University of Granada, Charles III University of Madrid, the University of Alcalá, the University of Extremadura and other education institutions in Spain.[20]
The ranking measures areas such as research output, international collaboration, normalized impact and publication rate.[19]
THE-QS World University Rankings
From 2004 to 2009 Times Higher Education (THE), a British publication, published the annual THE–QS World University Rankings (WUR) in association with Quacquarelli Symonds (QS). THE published a table of the top 200 universities and QS ranked approximately 500 online, in book form, and via media partners.[21]
Notably, this system uses peer review derived from 15,050 scholars and academics and 5,007 employment recruiters.[22] WUR incorporates international staff and student numbers, citation data from Scopus,[23] and faculty/student ratios.
These rankings are published in the United States by US News & World Report as the "World's Best Universities."[24]
On 30 October 2009, THE broke with QS and joined Thomson Reuters to provide a new set of world university rankings, called Times Higher Education World University Rankings. THE has stated that academic opinion will form part of its new offering.
QS had collected and analysed the rankings data for the prior six years and retained the intellectual property in them. QS published the newly titled rankings (as the QS World University Rankings) online on September 8, 2010. The rankings appear in book form and via media partners including US News & World Report, Chosun Ilbo, The Sunday Times and Nouvel Observateur. QS continues to use Scopus, its own data, and its own peer and recruiter review. QS is adding new reports such as the Asian University Rankings,[25] first published in 2009, and the QS World University Rankings by subject.
A frequent criticism of all world university ranking systems is that they contain too little material about specific subjects, something potential applicants are keen to see. In 2011, QS began ranking universities around the world for their provision in individual disciplines. The rankings have been drawn up on the basis of citations, academic peer review and recruiter review, with the weightings for each dependent upon the culture and practice of the subject concerned.
They are published in five "clusters": engineering; biomedicine; the natural sciences; the social sciences; and the arts and humanities. All of the material is available on the QS rankings website.[26]
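A hedged sketch of this subject-dependent weighting follows; the subjects, weights and scores are invented for illustration and are not QS's actual figures:

SUBJECT_WEIGHTS = {
    # (citations, academic review, recruiter review) -- invented weights
    "computer science": (0.3, 0.5, 0.2),
    "philosophy":       (0.1, 0.7, 0.2),
}

def subject_score(subject, citations, academic, recruiter):
    """Weighted combination of the three components for one subject."""
    w_cit, w_aca, w_rec = SUBJECT_WEIGHTS[subject]
    return w_cit * citations + w_aca * academic + w_rec * recruiter

print(subject_score("philosophy", 55.0, 90.0, 70.0))   # -> 82.5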
Times Higher Education World Reputation Rankings
Published for the first time in March 2011,[27] the rankings are based on a survey of 13,388 academics from 131 countries, the largest evaluation of academic reputation to date.[28] The survey was conducted in eight languages by Ipsos Media CT for Times Higher Education's rankings-data partner Thomson Reuters, and asked experienced academics to highlight what they believed to be the strongest universities for teaching and research in their own fields. The top six universities in the ranking (Harvard University, the Massachusetts Institute of Technology, the University of Cambridge, the University of California, Berkeley, Stanford University and the University of Oxford) were found to be 'head and shoulders above the rest', and were touted as a group of globally recognised "super brands".[29]
Professional Ranking of World Universities
In contrast to academic rankings, the Professional Ranking of World Universities, established in 2007 by the École nationale supérieure des mines de Paris, measures the efficiency of each university at producing leading business professionals. Its main compilation criterion is the number of Chief Executive Officers (or equivalent) among the Fortune Global 500.[30] This ranking has been criticized for placing five French universities in the top 20.[10]
U-Multirank
U-Multirank, a project supported by the European Commission, will contribute to the EU objective of enhancing transparency about the different missions and the performance of higher education institutions and research institutes. It is due to be completed in June 2011.[31]
University Ranking By Academic Performance
First published in 2010, the University Ranking by Academic Performance (URAP) was developed at the Informatics Institute of Middle East Technical University in Turkey. It ranks 2,000 universities according to an aggregation of six academic research performance indicators: current productivity (number of published articles), long-term productivity (from Google Scholar), research impact (citations from the Institute for Scientific Information), impact (cumulative journal impact), quality (H-index), and international collaboration.[32]
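A minimal sketch of such an aggregation follows, using the six indicator names from above with placeholder equal weights, since the source does not give the actual weights:

INDICATORS = ["current_productivity", "long_term_productivity",
              "research_impact", "journal_impact", "quality_h_index",
              "international_collaboration"]

def urap_total(scores, weights=None):
    """Aggregate the six indicator scores (equal weights by default)."""
    weights = weights or {k: 1.0 for k in INDICATORS}
    return sum(weights[k] * scores[k] for k in INDICATORS)

example = {k: 75.0 for k in INDICATORS}   # invented scores
print(urap_total(example))                # -> 450.0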
Webometrics
The Webometrics Ranking of World Universities is produced by Cybermetrics Lab (CCHS), a unit of the Spanish National Research Council (CSIC), the main public research body in Spain. It offers information about more than 12,000 universities according to their web presence (an assessment of the scholarly contents, visibility and impact of universities on the web). The ranking is updated every January and July.
The Webometrics Ranking or Ranking Web is built from a database of over 20,000 higher education institutions. The top 12,000 universities are shown in the main ranking and more are covered in regional lists.
The ranking started in 2004 and is based on a composite indicator that includes both the volume of web contents and the visibility and impact of web publications, as measured by the number of external links they receive. A wide range of scientific activities appears exclusively on academic websites and is typically overlooked by bibliometric indicators.
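A minimal sketch of a composite of this kind follows, assuming an illustrative 50/50 split between activity (content volume) and visibility (external inlinks) and log scaling; neither the weights nor the scaling is confirmed by the source:

import math

def webometrics_score(web_pages, external_inlinks,
                      w_activity=0.5, w_visibility=0.5):
    """Combine log-scaled activity (content volume) and visibility (inlinks)."""
    activity = math.log1p(web_pages)         # log scaling damps raw size effects
    visibility = math.log1p(external_inlinks)
    return w_activity * activity + w_visibility * visibility

print(round(webometrics_score(120_000, 45_000), 2))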
Webometric indicators measure institutional commitment to web publication. Webometric results show a high correlation with other rankings. However, North American universities are relatively common in the top 200, while small- and medium-size biomedical institutions and German, French, Italian and Japanese universities are less common in the top ranks. Possible reasons include publishing via independent research councils (CNRS, Max Planck, CNR) or the large amount of non-English web content, which is less likely to be linked.