Inclusion in Rankings

QS World University Rankings® began in 2004, and one of the first challenges was to identify an initial list of institutions to study further. For simple practical reasons, it would have been impossible to apply a methodology such as that set forth in these pages to every university in the world: at a UNESCO event in 2011, it was estimated that there are around 20,000 universities worldwide. Beginning with the world’s top 500 universities based on citations per paper, the list has evolved since 2004 in response to a number of stimuli:

  • Domestic Ranking Performance – the QS Intelligence Unit tracks a growing number of domestic rankings in an attempt to ensure prestigious universities are not excluded
  • Survey Performance – respondents to the Academic and Employer Reputation Surveys are invited to suggest any institutions they feel may have been omitted
  • Geographical Balancing – acknowledging that universities have different priorities and characteristics in different parts of the world, the balance of institutions from given countries and regions is periodically reviewed
  • Direct Case Submission – from time to time, institutions approach QS directly to request inclusion. QSIU evaluates each case on its merits, drawing comparisons against institutions already included in the ranking, and, subject to certain prerequisites and performance indicators being met, is open to including additional institutions

In 2012 the surveys featured over 3,000 institutions, with over 700 evaluated at either an indicator or overall level in the QS World University Rankings®.

We recognise that higher education institutions can be very different from one another, but maintain that there is validity in comparing them, as most share a number of common objectives – typically the pursuit of cutting-edge research and the education of first-rate students. Certain kinds of institution may appear in other evaluations but are excluded, either entirely or partly, from our study. These are:

Research Institutes

Whilst this study does look at research metrics, it was considered inappropriate to include research institutes that do not have students. Notable exclusions on this basis include CERN in Switzerland, CNRS in France, the Max Planck Society in Germany and the Russian Academy of Sciences. It is worth noting that, in countries where much of the research takes place in such separate facilities, the research measures for the universities themselves sometimes underestimate the research strength of their faculty members.

Single Faculty Institutions

Institutions that focus on only one of our five broad faculty areas tend to be smaller and more intensive, and feel the full influence of any factors that affect their area of strength. These institutions are able to appear in faculty area and indicator tables but are excluded from our overall list. Notable cases include the Karolinska Institute in Sweden, HEC Paris in France, and Bocconi University in Italy.

Single Level Institutions

Institutions that operate at only one level – undergraduate only or, more commonly, postgraduate only – have certain natural advantages in areas such as student/faculty ratio or citations per faculty that would lead to anomalous placings in our overall table. Again, these are permitted to appear in faculty area or indicator tables, but are excluded from the aggregate list. Notable exclusions include Cranfield University in the UK, GIST (Gwangju Institute of Science & Technology) in South Korea and Jawaharlal Nehru University in India.

Institutions traditionally operating at one level, but recently introducing degree-level programmes at the other, can be considered for inclusion a minimum of three years after the first class graduates from programmes falling within at least two of our five broad faculty areas.
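To make the rule concrete, the sketch below shows one way such an eligibility check could be expressed. It is purely illustrative and not QS's actual procedure; the function name and data structure are hypothetical, and the faculty area names reflect the five broad areas used in the QS methodology.

```python
# Illustrative sketch only – not QS's actual procedure. The function and
# variable names are hypothetical.

FACULTY_AREAS = {
    "Arts & Humanities",
    "Engineering & Technology",
    "Life Sciences & Medicine",
    "Natural Sciences",
    "Social Sciences & Management",
}

def eligible_for_inclusion(first_graduation_year_by_area, current_year):
    """Apply the rule above: at least three years must have passed since
    the first class graduated, in at least two broad faculty areas."""
    matured_areas = [
        area
        for area, grad_year in first_graduation_year_by_area.items()
        if area in FACULTY_AREAS and current_year - grad_year >= 3
    ]
    return len(matured_areas) >= 2

# Example: first graduating classes in 2015 and 2016 -> eligible in 2019.
print(eligible_for_inclusion(
    {"Engineering & Technology": 2015, "Natural Sciences": 2016}, 2019,
))  # True
```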

Survey Solicitation or Promotion

QS encourages universities and other higher education stakeholders to help us build the universe of respondents to our academic and employer reputation surveys. Each institution is welcome to submit its lists of contacts on an annual basis directly to the QS Intelligence Unit (QSIU) as part of the data collection cycle.

In exceptional circumstances where this is not possible, and subject to approval by QSIU, an institution may circulate the link to the Sign-Up facility among its contacts of academics and employers:

  • Academic Sign-Up Facility link
  • Employer Sign-Up Facility link

The following rules apply:

  • Lists of contacts must be provided using the designated QSIU form. The institution is not allowed to make any formatting changes (e.g. swapping columns or sending the information in a different type of file); lists that do not comply may be rejected by the QSIU Rankings Team;
  • The link to the Sign-Up facility may only be circulated if the institution uses the QSIU templates, which can be downloaded here.


It is not permitted for a university or any other higher education stakeholder to promote or distribute direct links to the academic or employer surveys.

Institutions may mention the QS Academic and Employer Surveys to their contacts only when:

  • Emailing their contacts to ask for permission to pass their contact data to QSIU using the QSIU template.
  • Emailing their contacts to inform them that their details have been passed to QSIU.
  • Emailing their contacts to inform them that the academic and employer surveys are open.

The email templates can be downloaded here.

Institutions are permitted to mention the QS academic and employer surveys in their communications with contacts only by using the QSIU templates provided.

It is strictly forbidden to solicit or coach specific responses from expected respondents to any survey contributing to any QS ranking. Institutions found, in the opinion of QSIU, to be doing so will be sanctioned.

The approach we have adopted is intended to neutralise the effect, if any, of such campaigns on our survey findings by excluding new data from any non-compliant institution.

The rankings use five years of survey data, which for the forthcoming 2019 rankings means information gathered from 2014 to 2018 inclusive. For affected institutions, the affected year will be omitted and replaced by the records collected in the most recent year not already included. The same principle will apply to any institution contravening our policies in the years to come.
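As a rough illustration of this replacement rule, the sketch below computes which survey years would feed into a ranking. It is a minimal sketch under our own assumptions – the function name and interface are hypothetical, and this is not QS's actual implementation.

```python
# Minimal sketch (hypothetical, not QS's implementation) of the
# five-year survey window and the replacement rule described above.

def survey_years(ranking_year, excluded_years=frozenset()):
    """Return the five survey years used for a ranking, replacing any
    excluded (invalidated) year with the most recent earlier year that
    is not already in the window."""
    window = list(range(ranking_year - 5, ranking_year))  # e.g. 2014-2018 for 2019
    candidate = min(window) - 1  # most recent year outside the window
    years = []
    for year in window:
        if year in excluded_years:
            # Substitute the most recent earlier year still available.
            while candidate in excluded_years:
                candidate -= 1
            years.append(candidate)
            candidate -= 1
        else:
            years.append(year)
    return sorted(years)

print(survey_years(2019))                         # [2014, 2015, 2016, 2017, 2018]
print(survey_years(2019, excluded_years={2017}))  # [2013, 2014, 2015, 2016, 2018]
```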

QS aims to provide an inclusive and accurate ranking. However, in cases of recurrent activity of this nature, we will first apply the above approach to the survey index in question, and we may later consider disqualifying an institution from the ranking altogether. QS runs sophisticated screening analysis to detect anomalous responses, and routinely discards invalid responses. Any attempt to manipulate the results, or to solicit responses, may result in the disqualification of all responses for that survey for that year, invalid or otherwise, where the source cannot be verified as entirely independent.

The above policy takes effect from December 2017.

False Responses

In an attempt to build the largest and most representative sample for its surveys, QSIU casts the net as wide as possible, but response data are extensively checked and validated to detect institutions attempting to influence their position by submitting additional responses on their own behalf.

As a matter of policy, not only are responses found to be invalid discounted from consideration, but any institution found to be engaging in such activity will incur a further penalty in the compilation of results for the given indicator.