Even on their own terms, university rankings have failed in India

This guest blog is republished with permission from the LSE Impact Blog, where it was first published in March 2026.

University rankings have become a key measure of success for India, even at the highest political levels. Yet, as in other countries, ranking regimes have created distortions in research practices. Exploring the data behind India’s “ranking mania”, Muthu Madhan argues that rankings have failed to deliver on their promises.

University rankings have become an all-consuming passion in Indian higher education. Prime Minister Narendra Modi recently touted the growing number of Indian institutions in the Quacquarelli Symonds (QS) rankings as a significant achievement. However, political interest in global academic rankings as indicators of institutional success has been decades in the making.

Former Prime Minister Manmohan Singh and President Pranab Mukherjee often expressed concern over the absence of Indian universities from the top 200 of global rankings. During his presidency, Mukherjee directly engaged with ranking agencies such as QS and Times Higher Education (THE), inviting their representatives to address institution leaders and encourage closer collaboration. This momentum led to the establishment of the National Institutional Ranking Framework (NIRF) in 2015. Since then, rankings have evolved into powerful governance tools.


Yet, this normalisation has been accompanied by strategic selectivity. Since 2020, seven prominent Indian Institutes of Technology (IITs) have boycotted the THE rankings, citing opacity in its methodology (as if the other ranking systems they participate in are models of clarity). More recently, the Birla Institute of Technology and Science, Pilani, has joined the boycott. Such selective resistance overlooks deeper problems that are far more significant than the contested issues.

Private universities’ rapid rise in bibliometrics

The most visible consequence of the ranking regime is the rapid rise of bibliometric output among Indian private institutions, which have now surpassed government-funded institutions in publication counts (Fig. 1).

Figure 1. Publication growth between 2015 and 2024 among the top 50 Indian institutions by number of papers in 2024, ordered by absolute growth (Data source: SciVal).

Comparing the lists of the top fifty academic institutions by Scopus-indexed papers (via SciVal) in 2015 and 2024 reveals a stark rise of private universities. In 2015, there were only twelve private institutions in the top fifty; by 2024, this had increased to twenty-five, with twenty being newcomers to the list. Notably, five of the top ten institutions in 2024 were not in the top fifty a decade prior.

Their volume growth in the past ten years is staggering: Saveetha Institute of Medical and Technical Sciences (rank #1; from 749 in 2015 to 12,140 papers in 2024), Lovely Professional University (#6; 473 to 6,473), Chitkara University (#7; 101 to 6,049), Chandigarh University (#9; 71 to 5,966), and Graphic Era University (#10; 150 to 4,672).

The same trend appears in highly cited papers (HCP). In 2024, thirteen of the fifteen institutions with at least 15% of their papers among the top 10% HCP were private, and fourteen of the top fifteen by HCP volume were private (Fig. 2).

Figure 2. Growth in field-weighted top 10% highly cited papers (2015–2024) among the top 50 Indian institutions by number of papers in 2024, ordered by absolute growth (Data source: SciVal).

Private universities have significantly increased their volume of collaborative papers, especially in the HCP category. For example, the top seven (Fig. 2) produced 19,978 HCP during 2020–2024. Of these, 11,652 (58%) involved international collaboration, and 3,532 (30.3% of the internationally collaborative papers) had at least one author from Saudi Arabia. Other major collaborators are China, South Korea, and Malaysia. The data also reveal dense collaboration among the seven, and between them and King Saud University and King Khalid University in Saudi Arabia (Fig. 3).

Figure 3. Collaboration in highly cited papers (HCP), 2020–2024, among the top seven institutions and with two Saudi Arabian institutions.

Meho, in his analysis of papers from fast-growing institutions in countries such as India and Saudi Arabia, notes that the collaboration patterns reflect “ecosystems optimized for mutual metric enhancement”. For context, of the 1.46 million Scopus-indexed Indian papers published during 2020–2024, 22.5% involved international collaboration. Saudi Arabia is the second-largest collaborator after the United States.
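
For readers who want to check the arithmetic, these shares are straightforward to compute from a paper-level export. The sketch below uses toy records standing in for a real SciVal/Scopus download (field names are illustrative only) and makes explicit that the 30.3% figure is a share of the internationally collaborative papers, not of all papers:

```python
# Toy sketch of the collaboration-share arithmetic. Records are invented;
# a real analysis would start from a SciVal/Scopus paper-level export with
# the affiliation countries of each paper's authors.

HOME = "India"

papers = [
    {"id": "p1", "countries": {"India"}},
    {"id": "p2", "countries": {"India", "Saudi Arabia"}},
    {"id": "p3", "countries": {"India", "China", "Saudi Arabia"}},
    {"id": "p4", "countries": {"India", "South Korea"}},
]

# International = at least one non-India affiliation country.
international = [p for p in papers if p["countries"] - {HOME}]
with_saudi = [p for p in international if "Saudi Arabia" in p["countries"]]

print(f"International share of all papers: {len(international) / len(papers):.1%}")
# The 30.3% in the text is computed against the international subset,
# not against all papers:
print(f"Saudi share of international papers: {len(with_saudi) / len(international):.1%}")
```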

Self-citations and retractions

Among the top fifty, fourteen had self-citation rates ≥ 15% in at least two years during 2015–2024 (Fig. 4). Also, as seen in Scopus, many within the top fifty are represented among the more than 4,800 Indian papers retracted during the same period. Saveetha leads India in both retractions and institutional self-citations. The university remains undeterred by criticism of these practices and dismisses the charges as inconsequential. Significantly, Saveetha attempted to silence critics through legal action, but the court did not side with it.

Figure 4. Institutions in the top 50 by number of papers in 2024 with self-citation rates ≥ 15% in at least two years during 2015–2024.
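
A note on the measure: an institution’s self-citation rate here is the share of citations its papers receive that come from papers bearing the same institution’s affiliation. A minimal sketch with hypothetical citation links (a real analysis would draw on Scopus citation data):

```python
# Minimal sketch: institutional self-citation rate, i.e. the share of
# citations received by an institution's papers that originate from
# papers affiliated with the same institution. Data are hypothetical.

def self_citation_rate(citations, institution):
    """citations: list of (citing_institutions, cited_institutions) pairs,
    each a set of affiliation strings for one citation link."""
    received = [c for c in citations if institution in c[1]]
    self_cites = [c for c in received if institution in c[0]]
    return len(self_cites) / len(received) if received else 0.0

links = [
    ({"Univ A"}, {"Univ A"}),            # self-citation
    ({"Univ B"}, {"Univ A"}),
    ({"Univ A", "Univ C"}, {"Univ A"}),  # also counts as self-citation
    ({"Univ C"}, {"Univ B"}),
]

print(f"{self_citation_rate(links, 'Univ A'):.0%}")  # 67%
```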

Funding gaps and cash for publications

There is a large gap in research funding between public and private universities in India. For example, leading government institutions such as IISc Bengaluru (~$133.42 million) and IIT Madras (~$116.34 million) receive millions of dollars in sponsored research funding, whereas private institutions that show high publication growth, such as Saveetha ($0.37 million) and Lovely Professional University ($0.03 million), receive only a fraction (data from NIRF 2025). The role of tuition fees in supporting research activities in private universities remains unclear in current reporting systems.

A cursory analysis of the research policies of private universities in the top fifty reveals that many of them have institutionalised ‘cash-for-publication-and-citation’ schemes. Incentives are tied to journal indexation, CiteScore, Journal Impact Factor, institutional h-index, and the like. Payments are also made for citations; for example, Chandigarh University reportedly pays ₹50 (~£0.40) for each citation a paper receives.

Government institutions have largely refrained from cash incentives for publications. The 2019 proposal by government leaders to introduce cash rewards for papers and patents was anathema to many scientists. Professor Balaram, former Director of the Indian Institute of Science, called it a “hare-brained scheme,” adding that “whoever thought of this is completely ignorant of the history of scientific publishing.” The proposal was not pursued. This restraint, which has protected government institutions from hyper-prolific publishing, is a value that ought to be preserved.

That said, most institutions, whether government or private, offer some form of publication-related reward. As Derek Lowe put it, “it may well seem vulgar to provide some of those rewards in wads of cash … but the more I think about it, the more it feels to me like a difference of degree and not of kind.”

No journal is perfect, and no database is infallible.

Nandita Quaderi (Editor-in-Chief of Web of Science) observes that in citation metrics “high impact” and “high quality” are no longer synonymous, and that even reputable journals are not immune to paper mills.

Journals delisted from, or placed ‘on hold’ in, the Web of Science over quality concerns, including Elsevier’s own Chemosphere, Science of the Total Environment, and Heliyon, continue to be indexed in Elsevier-owned Scopus. The same entity acting as both journal publisher and evaluator creates a profound conflict of interest.

These distortions produce misleading citation-based recognitions. The Scopus-based, algorithmically generated World’s Top 2% Scientists 2025 list includes more than 6,000 Indian researchers. In contrast, the Highly Cited Researchers list, which identifies authors from the top 1% of highly cited papers in the Web of Science and applies human review to exclude the signatures of abnormal practices, includes only six from India (Table 1).

When crude normalisations compound quantitative surrogates of research output, the results stop making sense. The QS citations-per-faculty rankings illustrate this. Between 2015 and 2025, half of the twenty-five institutions that achieved a perfect score of 100 appeared only once or twice before disappearing from the list. Only two were from India: IISc Bengaluru maintained a score of 100 for seven consecutive years before dropping in 2025, while Anna University, Chennai, surged from 30 in 2022 to 100 in 2025, only to decline again in 2026. Such parity is difficult to reconcile with reality, as the two institutions differ vastly in structure and resources.
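
QS’s exact scaling procedure is not reproduced here, but the volatility is easy to recreate with any ratio-plus-rescaling scheme. The sketch below (invented numbers and a simple rescaling against the cohort maximum, emphatically not QS’s actual method) shows how a citations-per-faculty score can leap when the cohort shifts:

```python
# Illustration only: how a ratio metric plus cohort-relative rescaling can
# swing wildly year to year. Hypothetical numbers; NOT QS's actual method.

def scaled_scores(cpf_by_inst):
    """Rescale citations-per-faculty so the cohort maximum gets 100."""
    top = max(cpf_by_inst.values())
    return {k: round(100 * v / top, 1) for k, v in cpf_by_inst.items()}

year1 = {"Inst X": 80.0, "Inst Y": 24.0, "Inst Z": 40.0}
# Year 2: Inst X leaves the cohort (or its data changes); everyone rescales.
year2 = {"Inst Y": 26.0, "Inst Z": 38.0}

print(scaled_scores(year1))  # Inst Y scores 30.0
print(scaled_scores(year2))  # Inst Y jumps to 68.4 with barely more citations
```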

What rankings in India reveal about academia

When metrics are treated as gospel, especially when they are abused, they cease to be a map and become a mirage.

Indian private universities have shown that publication volumes can be scaled without research ecosystems or sustained funding. Overwhelmed by this proliferation, journals have succumbed to indiscriminate publishing.

This radical growth has inevitably triggered investigations into corrupt practices linking Indian universities to paper mills and citation cartels. By penalising retractions, the government’s NIRF effectively concedes that malpractice is widespread. This trend subverts the very intent of introducing academic rankings, and fundamentally misleads stakeholders.

Rankings are notorious for producing distortions, a fact now widely acknowledged in India after more than a decade of the ranking regime. Yet Indian institutions remain trapped, chasing rankings for symbolic gains. Paradoxically, enthusiasts expect ranking agencies to fix the very absurdities they have themselves created. It is a classic vicious cycle. As Gioia and Corley observed, “In some significant sense all the things wrong with the rankings matter considerably less than the plain fact that the rankings matter.” Ignoring serious distortions to promote rankings as objective indicators will undermine the credibility of Indian universities.

What we need is more than a selective boycott. We must immediately untie publication-based surrogate metrics from academic evaluations at all levels. And we must delink ranking from funding allocations and policy decisions. India needs an effective National Institutional Ranking Data Framework that generates diagnostic evidence to support holistic development of higher education and research, and help students make informed choices about where to study. University rankings as they currently exist clearly fail to do this.

This article gives the views of the author, not the position of the LSE Impact Blog or the London School of Economics. 

About the author

Muthu Madhan

Madhan is a librarian with over 25 years of experience in research and academic libraries. He is currently the director of the Global Library at O.P. Jindal Global University in Sonipat, India, and a visiting scholar at the DST Centre for Policy Research, IISc, Bengaluru. His research interests lie at the intersection of scientometrics and science policy, and in open access to scholarly information. https://orcid.org/0000-0003-1651-4180
