College and university rankings order institutions in higher education by various combinations of factors. Rankings are most often conducted by magazines, newspapers, websites, governments, or academics. In addition to ranking entire institutions, organizations rank specific programs, departments, and schools. Rankings consider combinations of measures of funding and endowment, research excellence and influence, specialization expertise, admissions, student options, award numbers, internationalization, graduate employment, industrial linkage, historical reputation and other criteria; most evaluate institutions chiefly on research output. Some rankings evaluate institutions within a single country, while others assess institutions worldwide. The subject has produced much debate about rankings’ usefulness and accuracy, and the expanding diversity of rating methodologies, each with its own criticisms, indicates a lack of consensus in the field. Taken together, academic rankings offer an overview of different institutions’ composite capabilities in academia. While the United Nations advocates higher education as a common good that provides social leverage and equips everyone who participates with skills, college rankings serve as a transparent tool for public evaluation.
Academic Ranking of World Universities
The Academic Ranking of World Universities (ARWU), compiled by Shanghai Jiao Tong University and now maintained by the Shanghai Ranking Consultancy, has provided annual global rankings of universities since 2003, making it the earliest of its kind. The ranking is funded by the Chinese government, and its initial purpose was to measure the gap between Chinese and “world class” universities. In the 2015 edition there were no Chinese universities in the first 100 of 500 places. ARWU rankings have been cited by The Economist, which lauded the methodology as “consistent and transparent”. The education ministers of France, Norway and Denmark have traveled to China to discuss ways to improve their countries’ rankings. ARWU does not rely on surveys or school submissions. Among other criteria, it counts articles published in Nature or Science and the numbers of Nobel Prize winners and Fields Medalists (mathematics). Harvard has topped the ranking for years. A primary criticism of ARWU’s methodology is that it is biased towards the natural sciences and English-language science journals over other subjects. ARWU is also known for “relying solely on research indicators”, and “the ranking is heavily weighted toward institutions whose faculty or alumni have won Nobel Prizes”: it does not measure “the quality of teaching or the quality of humanities”.
Center for World University Rankings
This Saudi Arabia-based consulting organization has published yearly rankings of world universities since 2012. Rankings are based on quality of education, alumni employment, quality of faculty, number of publications, number of publications in high-quality journals, citations, scientific impact and number of patents.
Eduniversal
This university ranking is owned by SMBG, a French consulting company and rating agency. It ranks masters and MBA programs across nine geographical regions covering the five continents.
G-factor
G-factor ranks university and college web presence by counting links only from other university websites, using Google search engine data. G-factor is an indicator of the popularity or importance of each university’s website from the combined perspectives of other institutions. It claims to be an objective peer review of a university through its website—in social network theory terminology, G-factor measures the centrality of each university’s website in the network of university websites.
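The centrality measure described above amounts to counting inbound links within a directed graph of university websites. The following sketch illustrates the idea with hypothetical site names and link data; the actual G-factor is computed from Google search data, not from a hand-built graph like this:

```python
# Hypothetical inter-university link data: each entry maps a university
# site to the set of other university sites it links to.
links = {
    "uni-a.edu": {"uni-b.edu", "uni-c.edu"},
    "uni-b.edu": {"uni-c.edu"},
    "uni-c.edu": {"uni-a.edu"},
    "uni-d.edu": {"uni-c.edu", "uni-a.edu"},
}

def g_factor_scores(links):
    """Count inbound links from other university sites (in-degree centrality)."""
    scores = {site: 0 for site in links}
    for source, targets in links.items():
        for target in targets:
            if target != source and target in scores:
                scores[target] += 1
    return scores

scores = g_factor_scores(links)
# uni-c.edu receives links from uni-a, uni-b and uni-d, so it ranks highest.
```

In this toy network, the site linked to by the most peers ends up with the highest score, which is exactly the “popularity from the perspective of other institutions” that G-factor is intended to capture.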
Global University Ranking
Global University Ranking measures over 400 universities using RatER, an autonomous, non-commercial Russian rating agency supported by Russia’s academic society. The methodology pools universities from ARWU, HEEACT, Times-QS and Webometrics, and a pool of experts formed by project officials and managers determines the rating scales for indicators in seven areas. It considers academic performance, research performance, faculty expertise, resource availability, socially significant activities of graduates, international activities, and international opinion. Each expert independently evaluates these performance indicators for candidate universities, and the rating is the average of the expert evaluations. This ranking raised questions when it placed Russia’s Moscow State University in fifth place, ahead of Harvard and Cambridge.
HEEACT—Ranking of Scientific Papers
The Performance Ranking of Scientific Papers for World Universities was produced until 2012 by the Higher Education Evaluation and Accreditation Council of Taiwan (HEEACT). The indicators were designed to measure both long-term and short-term research performance of research universities.
This project employed bibliometrics to analyze and rank the performance of the top 500 universities, and of the top 300 universities in six fields. HEEACT further provided subject rankings in science and technology fields, ranking the top 300 universities across ten such fields. The ranking included eight indicators: articles published over the prior 11 years, citations of those articles, “current” articles, current citations, average citations, h-index, number of “highly cited papers”, and articles in high-impact journals. These represented three criteria of scientific-paper performance: research productivity, research impact, and research excellence.
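Of the indicators listed above, the h-index has a simple standard definition: the largest h such that h of an institution’s papers each have at least h citations. This generic sketch follows that definition; it is not HEEACT’s exact implementation:

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# A body of work with these citation counts has h-index 4:
# four papers each have at least 4 citations, but not five papers with 5.
h_index([10, 8, 5, 4, 3])  # -> 4
```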
The 2007 ranking methodology was alleged to have favored universities with medical schools, and in response, HEEACT added assessment criteria. The six field-based rankings are based on the subject categorization of WOS, including Agriculture & Environment Sciences (AGE), Clinical Medicine (MED), Engineering, Computing & Technology (ENG), Life Sciences (LIFE), Natural Sciences (SCI) and Social Sciences (SOC). The ten subjects include Physics, Chemistry, Mathematics, Geosciences, Electrical Engineering, Computer Science, Mechanical Engineering, Chemical Engineering (including Energy & Fuels), Materials Sciences, and Civil Engineering (including Environmental Engineering). The ranking was renamed as National Taiwan University Ranking in 2012.
Human Resources & Labor Review
The Human Resources & Labor Review (HRLR) publishes a human competitiveness index and analysis annually, by Asia First Media, previously ChaseCareer Network (ChaseCareer.Net). The system is based on the Human Resources & Labour Review Indexes (HRI and LRI), which measure the performance of the top 300 universities’ graduates.
In 2004, several educational institutions voiced concerns at various events about the accuracy and effectiveness of ranking bodies and lists. The HRLR ranking was pioneered in late 2005 by a working group in response to those concerns. The team was founded in January 2007 in London and began compiling and processing data, producing its first lists in 2007-2008. The ranking concept was later adopted for the alumni score in ARWU and many other rankings.
The HRLR ranking’s innovative methods sparked intense interest from many institutions and inspired several other ranking lists and scores based on professional, alumni, executive, competitiveness and human-capital-oriented aspects. Nevertheless, HRLR remains a leader in university rankings, with innovative and comprehensive approaches that do not rely merely on those aspects.
High Impact Universities: Research Performance Index
The High Impact Universities Research Performance Index (RPI) is a 2010 Australian initiative that studies university research performance. The pilot project involved a trial of over 1,000 universities or institutions and 5,000 constituent faculties (in various disciplines) worldwide. The top 500 results for universities and faculties were reported at the project website. The project promotes simplicity, transparency and fairness. The assessment analyzes research performance as measured by publications and citations. Publication and citation data is drawn from Scopus. The project uses standard bibliometric indicators, namely the 10-year g-index and h-index. RPI equally weighs contributions from the five faculties. The five faculty scores are normalized to place them onto a common scale. The normalized scores are then averaged to arrive at a final RPI.
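The equal-weighting step described above can be sketched as follows. The normalization scheme used here (dividing each faculty score by the maximum score observed in that faculty) is an assumption for illustration, since the project’s exact scaling is not specified in this text:

```python
def rpi(faculty_scores, max_scores):
    """Average of five faculty scores, each normalized onto a common 0-1 scale.

    faculty_scores: one university's bibliometric score per faculty.
    max_scores: the highest score observed in each faculty across all
                universities (hypothetical normalization choice).
    """
    normalized = [s / m for s, m in zip(faculty_scores, max_scores)]
    # Equal weighting: a plain average of the normalized faculty scores.
    return sum(normalized) / len(normalized)

# Hypothetical scores for one university across five faculties:
score = rpi([80, 45, 60, 30, 90], [100, 90, 120, 60, 90])
```

Normalizing before averaging keeps a citation-heavy faculty (such as medicine) from dominating the composite, which is the stated rationale for placing the five faculties on a common scale.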
Leiden Ranking
The Centre for Science and Technology Studies at Leiden University maintains European and worldwide rankings of the top 500 universities according to the number and impact of their Web of Science-indexed publications per year. The rankings compare research institutions while taking into account differences in language, discipline and institutional size. Multiple ranking lists are released according to various bibliometric normalization and impact indicators, including the number of publications, citations per publication, and field-averaged impact per publication.
Nature Index
The Nature Index tracks the affiliations of high-quality scientific articles published in 68 science journals independently chosen by the scientific community as the journals in which scientists would most like to publish their best research. Updated monthly, the Nature Index presents research outputs of approximately 9,000 parent institutions worldwide, with a page of output statistics for each institution along with information on the institutions it collaborates with in publishing Index articles. Each of the approximately 60,000 articles in the Index has a dedicated article page, with social and mainstream media coverage tracked by Altmetric. League tables of institutional output can be generated on the fly on a global, regional or country basis and by broad subject area, as well as by article count and fractional article count. Compared with other metrics of science (e.g., the Impact Factor and h-index), the Nature Index carries a strong historical reputation in original natural-science and life-science research.
Newsweek
In August 2006, the American magazine Newsweek published a ranking of the Top 100 Global Universities, using selected criteria from ARWU and the Times Higher Education-QS rankings, with the additional criterion of the number of volumes in the library. It formed part of a special issue including an article by Tony Blair, then prime minister of the UK, and considered openness and diversity as well as distinction in research. The ranking was not repeated until after Newsweek’s merger with The Daily Beast; it now compiles its results using data from the Times Higher Education World University Rankings, the Webometrics rankings produced by Spain’s public research body Consejo Superior de Investigaciones Científicas, and the Shanghai Ranking Consultancy.
Professional Ranking of World Universities
In contrast to academic rankings, the Professional Ranking of World Universities established in 2007 by the École nationale supérieure des mines de Paris measures the efficiency of each university at producing leading business professionals. Its main compilation criterion is the number of Chief Executive Officers (or equivalent) among the Fortune Global 500. This ranking has been criticized for placing five French universities into the top 20.
QS World University Rankings
The QS World University Rankings are a ranking of the world’s top universities produced by Quacquarelli Symonds and published annually since 2004. Along with the Academic Ranking of World Universities and the THE World University Rankings, the QS World University Rankings are widely recognized and cited as one of the three main world university rankings. According to Alexa data, they are the world’s most-viewed global university rankings. In 2016 they ranked 916 universities, with the Massachusetts Institute of Technology, Stanford University, and Harvard University on top; this was the first time since the inaugural 2004 rankings that all three top positions were held by US institutions.
The QS rankings should not be confused with the Times Higher Education World University Rankings. From 2004 to 2009 the QS rankings were published in collaboration with Times Higher Education and were known as the Times Higher Education-QS World University Rankings. In 2010 QS assumed sole publication of rankings produced with this methodology when Times Higher Education split from QS in order to create a new rankings methodology in partnership with Thomson Reuters. The QS rankings are published in the United States by U.S. News & World Report as the “World’s Best Universities.” However, in 2014, the U.S. News & World Report launched their own international university ranking titled “Best Global Universities”. The inaugural ranking was published in October 2014.
The QS rankings use peer review data collected (in 2016) from 74,651 scholars and academics and 37,781 recruiters. These two indicators are worth 40 per cent and 10 per cent of a university’s possible score respectively. The QS rankings also incorporate citation per faculty member data from Scopus, faculty/student ratios, and international staff and student numbers. The citations and faculty/student measures are worth 20 per cent of an institution’s total possible score and the international staff and student data five per cent each. QS has published online material about its methodology.
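Given the indicator weights stated above (40 per cent academic review, 10 per cent recruiter review, 20 per cent citations per faculty, 20 per cent faculty/student ratio, and five per cent each for international staff and students), an institution’s composite score amounts to a weighted sum. The sketch below assumes each indicator has already been scaled to 0-100, and the institution’s scores are hypothetical:

```python
# Indicator weights as stated in the text (they sum to 1.0).
QS_WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "citations_per_faculty": 0.20,
    "faculty_student_ratio": 0.20,
    "international_staff": 0.05,
    "international_students": 0.05,
}

def qs_composite(indicator_scores):
    """Weighted sum of indicator scores (each assumed pre-scaled to 0-100)."""
    return sum(QS_WEIGHTS[name] * value
               for name, value in indicator_scores.items())

# Hypothetical institution:
score = qs_composite({
    "academic_reputation": 90,
    "employer_reputation": 80,
    "citations_per_faculty": 70,
    "faculty_student_ratio": 60,
    "international_staff": 50,
    "international_students": 40,
})
```

The two survey-based indicators alone account for half of the possible score, which is why the peer-review samples described above weigh so heavily on the final ranking.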
QS published the 2016 QS World University Rankings online on 5 September 2016. The rankings also appear in book form, and via media partners including The Guardian, US News & World Report and The Chosun Ilbo.
QS has added to its main World University Rankings, starting in 2009 with the Asian University Rankings. The QS Latin American University Rankings and the QS World University Rankings by Subject were first published in 2011, along with a worldwide faculty ranking, the Top 50 Under 50 and Next 50 Under 50 rankings, and a graduate employment ranking. QS now also publishes regional rankings for the Arab Region, Emerging Europe and Central Asia, and the five BRICS nations.
The subject rankings are intended to address the most frequent criticism of all world university ranking systems: that they contain too little material about specific subjects, something potential applicants are keen to see. These rankings are drawn up on the basis of citations, academic peer review and recruiter review, with the weightings for each dependent upon the culture and practice of the subject concerned. They are published in five clusters: engineering, biomedicine, the natural sciences, the social sciences, and the arts and humanities, and covered 42 subjects in 2016.
QS Asian University Rankings
In 2009, Quacquarelli Symonds (QS) launched the QS Asian University Rankings in partnership with The Chosun Ilbo newspaper in Korea. They rank the top 350 Asian universities and have now appeared eight times. Each edition is an independent list, distinct from that of the QS World University Rankings. For three consecutive years up to the 2016/17 edition, the rankings were topped by the National University of Singapore.
These rankings use some of the same criteria as the World University Rankings, but they also use other measures, such as the numbers of incoming and outgoing exchange students. Because the criteria and their weightings differ, the QS World University Rankings and the QS Asian University Rankings released in the same academic year differ. QS also publishes subject rankings by country, such as for Statistics & Operational Research programs in China, which are a particularly useful reference for international students.
QS Latin American University Rankings
The QS Latin American University Rankings were launched in 2011. They use academic opinion (30 per cent), employer opinion (20 per cent), publications per faculty member, citations per paper, academic staff with a PhD, faculty/student ratio and web visibility (10 per cent each) as measures. These criteria were developed in consultation with experts in Latin America, and the web visibility data comes from Webometrics. The 2016/17 edition of the ranking ranks the top 300 universities in the region, and showed that the University of São Paulo in Brazil is the region’s top institution.
Reuters World’s Top 100 Innovative Universities
The ranking is empirical, based on a methodology that employs 10 different metrics. The criteria focus on academic papers, which indicate basic research performed at a university, and patent filings, which point to an institution’s interest in protecting and commercializing its discoveries. Compiled by the Intellectual Property & Science business of Thomson Reuters, the list uses proprietary data and analysis tools. The process began by identifying the 500 academic and government organizations that published the greatest number of articles in scholarly journals, as indexed in the Thomson Reuters Web of Science Core Collection database. That list was cross-referenced against the number of patents each organization filed during the same period in the Derwent World Patents Index and the Derwent Innovations Index. Patent equivalents, citing patents and citing articles were included. The timeframe allows the articles and patent activity to accrue citations, thereby contributing to that portion of the methodology. The list was then reduced to those institutions that filed 70 or more patents, the bulk of which were universities. Each candidate university was evaluated using various indicators, including how often its patent applications were granted; how many patents were filed with global patent offices and local authorities; and how often its patents were cited by others. Universities were also evaluated on how often their research papers were cited by patents, and on the percentage of articles featuring a co-author from industry. The ranking has an Asia-Pacific edition featuring the top 75 institutions in the region, as well as a list of the world’s 25 most innovative governmental institutions.
Round University Ranking
Round University Ranking (RUR) is a world university ranking assessing the effectiveness of 750 leading universities worldwide, based on 20 indicators distributed among four key areas: teaching, research, international diversity, and financial sustainability. The ranking has international coverage and is intended to become a tool of choice for higher education’s key stakeholders: applicants, students, the academic community, and university management. It is published by the independent RUR Rankings Agency, located in Moscow, Russia. RUR aims to provide a transparent, comprehensive analytical system for benchmarking and evaluating universities across borders for the widest possible audience: students, analysts, and decision-makers in higher education development at both the institutional and national levels.
SCImago Institutions Rankings
The SCImago Institutions Rankings (SIR) has published its international ranking of worldwide research institutions, the SIR World Report, since 2009. The report is the work of the SCImago Research Group, a Spain-based research organization consisting of members from the Spanish National Research Council (CSIC), the University of Granada, Charles III University of Madrid, the University of Alcalá, the University of Extremadura and other Spanish education institutions.
The ranking measures areas such as: research output, international collaboration, normalized impact and publication rate.
Times Higher Education World University Rankings
From 2004 to 2009 Times Higher Education (THE), a British publication, published the annual Times Higher Education–QS World University Rankings in association with Quacquarelli Symonds (QS). THE published a table of the top 200 universities, while QS ranked approximately 500 online, in book form, and via media partners. On 30 October 2009, THE broke with QS and joined Thomson Reuters to provide a new set of world university rankings, called the Times Higher Education World University Rankings. The 2015/16 edition ranked the world’s 800 best universities, while the 2016/17 instalment ranks the world’s top 980.
On 3 June 2010, Times Higher Education revealed the methodology it proposed to use when compiling the new world university rankings. The new methodology included 13 separate performance indicators, an increase from the six measures employed between 2004 and 2009. After further consultation, the criteria were grouped under five broad overall indicators to produce the final ranking. THE published its first rankings using the new methodology on 16 September 2010, a month earlier than in previous years. THE also launched the THE 100 Under 50 ranking and the Alma Mater Index.
The Globe and Mail in 2010 described the Times Higher Education World University Rankings as “arguably the most influential.” Research published by professors at the University of Michigan in 2011 demonstrated that the early THES rankings were disproportionately influential in establishing the status order of world research universities.
Times Higher Education World Reputation Rankings
This ranking was published for the first time in March 2011. The 2016 rankings are based on a survey of 10,323 academics from 133 countries. They rank Harvard University as possessing the world’s most powerful university brand, followed by Massachusetts Institute of Technology and Stanford University. The survey was conducted in eight languages by Ipsos Media CT for Times Higher Education’s ranking-data partner Thomson Reuters, and asked experienced academics to highlight what they believed to be the strongest universities for teaching and research in their own fields. The top six universities in the ranking for 2014—Harvard, MIT, Stanford, Cambridge, Oxford, UC Berkeley—were found to be “head and shoulders above the rest”, and were touted as a group of globally recognised “super brands”.
U-Multirank
U-Multirank, a feasibility study supported by the European Commission, was undertaken to contribute to the Commission’s objective of enhancing transparency about the different missions and performance of higher education institutions and research institutes. At a press conference in Brussels on 13 May 2011, U-Multirank was officially launched by Androulla Vassiliou, Commissioner for Higher Education and Culture, who said it “will be useful to each participating higher education institution, as a planning and self-mapping exercise. By providing students with clearer information to guide their study choices, this is a fresh tool for more quality, relevance and transparency in European higher education.” U-Multirank breaks new ground by producing multi-dimensional listings that rate universities on a much wider range of factors than existing international rankings. The idea is to avoid simplistic league tables, which can produce misleading comparisons between institutions of very different types or mask significant differences in quality between courses at the same university.
U-Multirank assesses the overall performance of universities but also ranks them in selected academic fields: in 2014 the fields are business studies, electrical engineering, mechanical engineering and physics; in 2015, psychology, computer science and medicine will be added. The universities are tested against up to 30 separate indicators and rated in five performance groups, from ‘A’ (very good) through ‘E’ (weak). The results show that while over 95% of institutions achieve an ‘A’ score on at least one measure, only 12% have more than 10 top scores. Of the 850 universities in the ranking, 62% are from Europe, 17% from North America, 14% from Asia and 7% from Oceania, Latin America and Africa. U-Multirank received €2 million in EU funding from the former Lifelong Learning Programme (now Erasmus) for the years 2013-2015, with the possibility of a further two years of funding in 2015-2017. The goal is for an independent organisation to manage the ranking on a sustainable business model thereafter.
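The ‘A’ to ‘E’ performance groups described above can be sketched as banding each indicator’s scores. The quintile-based banding used here is an assumption for illustration, not U-Multirank’s published method, and the scores are hypothetical:

```python
def performance_group(score, all_scores):
    """Assign 'A' (top fifth) through 'E' (bottom fifth) by rank within all_scores."""
    ranked = sorted(all_scores, reverse=True)
    position = ranked.index(score)       # best score has position 0
    fifth = position * 5 // len(ranked)  # which fifth of the field: 0..4
    return "ABCDE"[fifth]

# Hypothetical indicator scores for ten universities:
scores = [95, 88, 76, 64, 51, 40, 33, 22, 15, 9]
performance_group(95, scores)  # -> 'A'
performance_group(76, scores)  # -> 'B'
performance_group(9, scores)   # -> 'E'
```

Because each indicator is banded separately rather than folded into one composite number, a university can earn an ‘A’ on one measure and an ‘E’ on another, which is the multi-dimensional presentation U-Multirank aims for.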
University Ranking by Academic Performance
The University Ranking by Academic Performance (URAP) was developed at the Informatics Institute of Middle East Technical University. Since 2010, it has published annual national and global rankings of the top 2000 institutions. URAP’s scientometric measurements are based on data obtained from the Institute for Scientific Information via Web of Science and InCites. For the global rankings, URAP employs indicators of research performance including the number of articles, citations, total documents, total article impact, total citation impact, and international collaboration. In addition to the global rankings, URAP publishes regional rankings for universities in Turkey using additional indicators, such as the numbers of students and faculty members, obtained from the Center of Measuring, Selection and Placement (ÖSYM).
U.S. News & World Report’s Best Global Universities Rankings
The U.S. News & World Report’s inaugural Best Global Universities ranking was launched on 28 October 2014. It is based on data and metrics provided by Thomson Reuters, and is thus methodologically different from the criteria traditionally used by U.S. News to rank American institutions. Universities are judged on factors such as global research reputation, publications and number of highly cited papers. U.S. News also publishes region-specific and subject-specific global rankings based on this methodology.
The annual U.S. News Best Global Universities rankings were produced to provide insight into how universities compare globally. As an increasing number of students are planning to enroll in universities outside of their own country, the Best Global Universities rankings – which focus specifically on schools’ academic research and reputation overall and not on their separate undergraduate or graduate programs – can help those students accurately compare institutions around the world.
The Best Global Universities rankings also provide insight into how U.S. universities – which U.S. News has been ranking separately for more than 30 years – stand globally. All universities can now benchmark themselves against schools in their own country and region, become more visible on the world stage and find top schools in other countries to consider collaborating with.
The overall Best Global Universities rankings encompass the top 750 institutions across 57 countries, up from the top 500 universities in 49 countries ranked the previous year. The first step in producing these rankings, which are powered by Thomson Reuters InCites™ research analytics solutions, involved creating a pool of 1,000 universities from which the top 750 schools were ranked. Compared with the U.S. News National University Rankings, the global ranking focuses on research power and faculty resources for students, while the national ranking focuses only on undergraduate studies. For graduate studies and international students, the Best Global Universities rankings are therefore a much better reference than the National University Rankings.
Inside Higher Ed noted that the U.S. News is entering into the international college and university rankings area that is already “dominated by three major global university rankings”: the Times Higher Education World University Rankings, the Academic Ranking of World Universities, and the QS World University Rankings. U.S. News’s chief data strategist Robert Morse stated “We’re well-known in the field for doing academic rankings so we thought it was a natural extension of the other rankings that we’re doing.”
Morse noted that U.S. News is “the first American publisher to enter the global rankings space”, given that Times Higher Education and QS are both British, while the Academic Ranking of World Universities is Chinese.
Webometrics
The Webometrics Ranking of World Universities is produced by the Cybermetrics Lab (CCHS), a unit of the Spanish National Research Council (CSIC), the main public research body in Spain. It offers information about more than 12,000 universities according to their web presence (an assessment of the scholarly contents, visibility and impact of universities on the web). The ranking is updated every January and July.
The Webometrics Ranking or Ranking Web is built from a database of over 20,000 higher education institutions. The top 12,000 universities are shown in the main ranking and more are covered in regional lists.
The ranking started in 2004 and is based on a composite indicator that includes both the volume of the Web contents and the visibility and impact of web publications according to the number of external links they received. A wide range of scientific activities appears exclusively on academic websites and is typically overlooked by bibliometric indicators.
Webometric indicators measure institutional commitment to Web publication. Webometric results show a high correlation with other rankings; however, North American universities are relatively common in the top 200, while small and medium-size biomedical institutions and German, French, Italian and Japanese universities are less common in the top ranks. Possible reasons include publishing via independent research councils (CNRS, Max Planck, CNR) and the large amount of non-English web content, which is less likely to be linked.
Wuhan University
The Research Center for Chinese Science Evaluation at Wuhan University produces a ranking based on Essential Science Indicators (ESI), which provides data on journal article publication counts and citation frequencies in over 11,000 journals worldwide across 22 research fields.