We Set Out to Quantify U.S. Academic Contributions to Medicines. The Results Stunned Even Us
Over the past 20 years, American universities have played an increasing role in finding new drugs

In the 1990s, a famous English expatriate scientist kept a yellowed newspaper clipping posted on his office door. It read, “British science is alive and well … and living in the United States.”
In the future, a similar clipping might describe China or other nations successfully exploiting the decay of American research and development.
Behind nearly every prescription filled in America lies a powerful engine of innovation, fueled by the research conducted within the nation’s universities. Picking up a new prescription at the pharmacy represents the culmination of a decades-long choreography between the private, public, and academic sectors that drives this country’s medical innovation and ensures the most cutting-edge care and technology are available here.
Over the past few decades, both the foundational science that shapes our basic understanding of the human body’s ailments and the applied research that turns it into treatments have nearly always begun in an American research university. While pharmaceutical companies such as Merck and Bayer have become household names, rarely are the academic roots of a new drug or therapy appreciated: foundations that in most cases date back years or decades.
To quantify this impact, we launched a research project, currently available as a preprint, that asks what America’s universities are worth to society. We used the narrow area of pharmaceutical drugs as a representative example.
Many things within academia are measured, from graduation rates and alumni earnings to the economic impact of a campus. Research outputs, which propel everything from medicine and technology to the military and environmental protection, provide another opportunity to assess contributions from the nation’s ivory towers. We identified the inventors of drugs approved by the Food and Drug Administration, along with the key patents curated in the FDA Orange Book, and our review revealed that from 2020 to 2024, universities contributed patents underpinning 50% of FDA-approved drugs. Even more stunningly, 87% of those academic breakthroughs came from American institutions.
Securing a patent listing in the FDA Orange Book is a high bar that requires rigorous review. These key patents can determine when a pioneering medicine loses its proprietary protection and can be offered as a generic formulation. Historically, pharmaceutical companies have preferred to keep these valuable assets in-house given the financial returns, and our earlier research shows that prior to the 21st century, private sector companies dominated pharmaceutical patents and inventorship. That dominance began to crack in the new millennium, with academic inventors and entrepreneurs playing an increasingly prominent role through the 2000s and 2010s.
These new findings have profound implications. From an economic standpoint, pharmaceutical products account for a substantial and rising portion of American consumer spending and economic activity. The 10 largest pharmaceutical companies have a combined market capitalization of more than $2 trillion. Our updated findings demonstrate that over the past 50 years, these companies have become increasingly reliant upon academic inventors to provide new medicines.
This contribution is even more remarkable given that our new findings are limited to applied research and do not capture the impact of the fundamental research performed by academic scientists. Previous work from our team and others has demonstrated that NIH-funded basic science has contributed to the development of more than 90% of new medicines, vaccines, and devices.
It must also be recognized that American dominance of drug development allows the nation to determine — indeed, to dictate — which diseases are studied and which interventions are developed. A recent poll in Nature found that 75% of scientists in America were considering leaving the country. Were the nation to allow its academic enterprise to wither, decisions about which diseases to treat and which therapies to develop would be made elsewhere.
The rise of American dominance began in the postwar era, based on the vision of Vannevar Bush, who directed the wartime Office of Scientific Research and Development and served as a science adviser to President Franklin D. Roosevelt. Bush, in his seminal report, “Science — The Endless Frontier,” argued that basic research is essential to national and economic security. This far-sightedness allowed the United States to assert its leadership of the biotechnology revolution that began 50 years ago. The nation’s public and economic health benefited from both training and retaining the world’s top minds.
Looking to the future, China has been investing heavily in both academic research and pharmaceutical development and seeks to displace American hegemony. Such changes have occurred before: The roots of the pharmaceutical industry were largely located in European nations, especially the United Kingdom and Germany, through the first half of the 20th century.
American institutions and the innovators within them have succeeded because of a contract between the federal government and U.S. research universities. Sustained federal funding for research helped create the country we know today and is why America has been the dominant player in developing drugs that treat disease and improve the lives of people around the globe. The nation’s research universities are indispensable to pharmaceutical innovation, and continued federal support for academic research is essential to maintain U.S. leadership in global drug development and broad economic growth.
Michael Kinch, Ph.D., is chief innovation officer at Stony Brook University. Kevin Gardner, Ph.D., is vice president for research and innovation at Stony Brook University.