Clash of Titans: How the Government & Industry’s Battle on Drug Pricing Control Misses the Big Picture


Key Takeaways

  • Per-drug investment by the government and industry has been roughly comparable over the last decade.
  • While the government plays a significant role in basic research, it is also involved in drug development and clinical trials alongside prominent industry players.
  • The Inflation Reduction Act (IRA), while noble in its cause to reduce drug prices for consumers, will likely result in downstream disruptions in research and development.

For those interested in a good fight, the intense showdown brewing between the government and pharmaceutical companies should not disappoint. In August, the government unveiled 10 prescription drugs that will undergo Medicare price negotiations as part of the Inflation Reduction Act (IRA). The industry’s response? Even before the announcement, some of the biggest names in the business – including Merck, Bristol Myers Squibb, and Novartis – filed lawsuits declaring the provisions “unconstitutional,” “extortion,” and tantamount to “political Kabuki theatre.”

As more pharmaceutical and biotech companies grapple with the implications of the IRA, these recent activities bring into focus a timeless discourse that affects us all deeply: the roles of the government and industry in drug development.

Government vs Industry

Researchers at Bentley University recently published two insightful articles comparing investments by the US National Institutes of Health (NIH) and the pharmaceutical industry in 356 drugs approved between 2010 and 2019. Their analysis revealed that, when the NIH’s expenditures are assessed using the industry’s conventional cost-accounting methods, the two are roughly comparable: the NIH spent approximately $1.7 billion per approval on 86 first-in-class drugs, only slightly more than the industry average of $1.6 billion.
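
As a rough illustration of how a headline figure like “$1.7 billion per approval” is typically constructed, the sketch below simply divides total spending behind a cohort of drugs by the number of approvals. The total-spend figure is a hypothetical placeholder, not the Bentley study’s actual data.

```python
# Sketch of a per-approval cost calculation in the spirit of the Bentley
# comparison. The total-spend figure is a hypothetical placeholder,
# NOT the study's actual data.

total_spend_usd = 140e9   # hypothetical: total investment behind the cohort, successes and failures
n_approvals = 86          # first-in-class approvals, 2010-2019 (figure cited above)

cost_per_approval = total_spend_usd / n_approvals
print(f"Cost per approval: ${cost_per_approval / 1e9:.2f}B")  # ~$1.63B with these placeholder inputs
```

The point is that cost-per-approval averages of this kind fold everything spent along the way into the drugs that ultimately succeeded, a detail the article returns to below.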

Considering the government’s substantial financial investment and pivotal contributions to nearly all drugs on the market today, should it have more say in drug pricing? 

A Symbiotic Relationship

What is getting lost in the noise and increasing contentiousness of today’s debate is that there is an ecosystem at play. The government and industry work symbiotically to keep the wheels of the therapeutic innovation engine spinning.

The conventional understanding is that the government is responsible only for preliminary, basic research aimed at understanding the fundamental biology of disease. This knowledge is then handed over to industry, which transforms copious amounts of data and research into viable drugs that are tested in clinical trials, with the ultimate aim of making them accessible to patients.

Bentley’s research has illuminated a much more nuanced landscape, revealing that the demarcations between government and industry are considerably less distinct. Notably, the NIH persists in supporting drug development beyond the initial stages, engaging in applied research tailored to specific applications or objectives, as well as phased clinical development to establish the safety and efficacy of these therapies in humans. While the NIH’s contributions toward basic research significantly dwarf those for applied and clinical research, Bentley’s analysis demonstrates that the NIH does, in fact, allocate roughly 10% of what industry invests in phased clinical development.

It’s worth noting that the NIH’s investment in clinical development is primarily centered on the logistical infrastructure of clinical trial networks, patient registries, and postdoctoral research. This groundwork allows companies to orchestrate their trials efficiently without having to establish those foundations from scratch. Given that it takes 10.5 years on average for a drug to move from Phase 1 to regulatory approval, it’s not far-fetched to say that the industry has grown considerably dependent on such initiatives to contain the costs and timelines inherent in drug development.

So, while the public sees a shouting match play out in the open, behind the scenes the government and industry have learned to work together quite effectively through a risk-sharing process refined over the decades. But how much risk is each player really taking on?

The Roles of Each in Drug Development

First, it’s essential to recognize just how unpredictable the game of drug development can be. What starts as a vast pool of tens of thousands of potential molecules during early research eventually gets whittled down to just 1 or 2 candidates in clinical trials. In other words, picture yourself at a racetrack where you’re allowed to bet on only 1 out of 50,000 horses – even with scientific experiments that can rule out some of the losers, those are some pretty bad odds. And the uncertainty doesn’t stop once a drug enters the clinic: there remains a daunting 90% probability that it will stumble somewhere along the thorny clinical path and fail to reach the patients it was designed to benefit.
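
To put those odds in a single number, here is a stylized back-of-the-envelope calculation that simply chains the two figures cited above (roughly 1 in 50,000 molecules reaching the clinic, then a ~90% failure rate once there). It is an illustration, not a precise model of attrition.

```python
# Stylized back-of-the-envelope odds, chaining the two figures cited above.

p_reach_clinic = 1 / 50_000       # ~1 candidate advanced per 50,000 molecules screened
p_clinical_success = 1 - 0.90     # ~90% of drugs entering the clinic fail

p_overall = p_reach_clinic * p_clinical_success
print(f"Chance a screened molecule becomes an approved drug: ~1 in {1 / p_overall:,.0f}")
# ~1 in 500,000 with these inputs - long odds even before accounting for cost and time.
```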

This uncertainty is built into the drug development landscape for industry players, yet they are by no means impervious to its consequences. With billions of dollars invested, the fate of pharmaceutical companies – and, at times, their very survival – hinges upon the favorable outcomes of clinical trials. Consider a company like Goldfinch Bio, which shut down less than a year after it released mixed Phase 2 data for its lead asset GFB-887.

In contrast, the government serves not to make drugs but to enable their development by industry. A failed candidate, while regrettable, adds to an ever-expanding pool of scientific knowledge, which, in itself, is a victory for the government as a champion of free-flowing ideas over individual products. The hope is to accumulate enough of this knowledge to eventually pave the way for a successful candidate that addresses an unmet medical need, regardless of the company responsible.

That being said, the pharmaceutical industry is adept at leveraging its position within the ecosystem to its advantage. For example, it is standard industry practice to use a 10.5% cost of capital to quantify the expected returns investors forgo while a drug is in development; for government investment, the equivalent range is 3-7%. While accounting for the cost of capital is an established practice, 10.5% sits toward the high end of the spectrum of expected returns. The industry also folds spending on failed clinical trials into its cost estimates, which adds another layer of complexity: the final figure depends on success rates across clinical phases, the average cost per phase, and the number of drugs that fail in each phase, all of which can vary wildly depending on the figures used and the company using them.
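
To see why the choice of capitalization rate matters so much, the sketch below compounds a stream of out-of-pocket R&D spending forward to the approval date at a few different rates. The $100 million-per-year spend and the 10-year horizon are hypothetical placeholders chosen only to show the sensitivity.

```python
# Why the assumed cost of capital moves the headline "cost per drug."
# The annual spend and timeline below are hypothetical placeholders.

def capitalized_cost(annual_spend, years, rate):
    # Each year's outlay is compounded forward for the years remaining until approval.
    return sum(annual_spend * (1 + rate) ** (years - yr - 1) for yr in range(years))

years = 10            # roughly the Phase 1-to-approval timeline cited earlier
annual_spend = 100e6  # hypothetical: $100M of out-of-pocket spend per year ($1B total)

for rate in (0.035, 0.07, 0.105):  # low/high end of the government range vs. the industry's 10.5%
    print(f"Capitalized cost at {rate:.1%}: ${capitalized_cost(annual_spend, years, rate) / 1e9:.2f}B")
# ~$1.17B, ~$1.38B, and ~$1.63B respectively: the same $1B of out-of-pocket spending
# capitalizes to roughly 40% more at 10.5% than at 3.5%.
```

In other words, the same out-of-pocket dollars produce very different “cost per drug” figures depending on the rate used, before failed programs are even factored in.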

With this context, let’s revisit the initial question.

How Much Should The Government Decide On Drug Prices?

The answer: it’s complicated.

First, we must acknowledge the bizarre reality that Medicare, despite being the largest buyer of prescription drugs in the United States, had no authority to negotiate prices until the passage of the IRA just last year.

Imagine a scenario where someone on your street runs a lemonade stand, and you are their most frequent customer. In a world where we’re constantly seeking deals on everything from airline tickets to houses, it’s reasonable for you to want to leverage your substantial lemonade-buying power to negotiate a more favorable price. This is precisely what the government was unable to do.

Since the passage of the IRA, pharmaceutical companies have been urging the government to refrain from intervening in drug pricing while continuing to reap significant revenue from government orders – an irony that should not be lost in the ongoing debate.

If the government is justified in having some say, the question then becomes how much?

Consequences Of The Inflation Reduction Act

One of the industry’s main points of contention is that the new process is less a negotiation and more a “gun to the head,” as Pfizer CEO Albert Bourla put it. If a company refuses to accept Medicare’s proposed price, it faces a noncompliance tax that starts at 186% of a drug’s US sales and escalates to a maximum of 1,900%, potentially amounting to hundreds of millions of dollars per week. The government has argued that participation in Medicare is voluntary, but opting out of a program run by the country’s largest buyer of medications is hardly a viable option.
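
To get a sense of the penalty’s scale, here is a rough illustration for a hypothetical blockbuster. The $5 billion annual US sales figure is a placeholder rather than any specific drug’s revenue, and the calculation simply applies the quoted percentages to one week of sales.

```python
# Rough illustration of the noncompliance penalty's scale. The sales figure
# is a hypothetical placeholder, not any specific drug's revenue.

annual_us_sales = 5e9   # hypothetical: $5B per year in US sales
starting_rate = 1.86    # penalty starts at 186% of sales
maximum_rate = 19.00    # and escalates to a maximum of 1,900%

weekly_sales = annual_us_sales / 52
print(f"Penalty at the starting rate: ~${starting_rate * weekly_sales / 1e6:.0f}M per week")
print(f"Penalty at the maximum rate:  ~${maximum_rate * weekly_sales / 1e9:.1f}B per week")
# Even at the starting rate, the penalty approaches $180M per week for a drug of this size.
```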

Indeed, despite the ongoing lawsuits against the government to halt the process, all drugmakers affected by the first round of price cuts have begrudgingly agreed to participate. In fact, many of these companies will likely be compelled to adapt by discontinuing programs, realigning strategic priorities, and, in some cases, raising prices to offset potential government price cuts down the line. This will place a burden on payers, exacerbate disparities in patient access, and hinder innovation in drug development — all of which run contrary to the intended goals of the IRA.

Although the initial round of negotiations involves a mere 10 drugs (slated to grow to a total of 60 by 2029), their ripple effect is already reshaping the industry’s understanding of drug development. Genentech’s CEO recently shared that the new law is causing the company to reconsider drug launches in diseases with fewer patients. Such launches usually move faster thanks to smaller-scale trials, but their lower sales potential has become far more consequential in a post-IRA world.

With so much on the line, industry and government are gearing up for a long, bitter war that is only beginning. 

Final Comments

While the industry loves to tout its altruistic aim of saving lives, it ultimately operates within a profit-driven framework. When those profit margins are threatened by aggressive government negotiations or price caps, companies may be discouraged from translating the NIH’s research into marketable drugs. And if NIH funding were removed from the equation entirely, the industry would need to roughly double its current investment per drug, resulting in higher prices, fewer development programs, and lasting damage to the healthcare system.

Appreciating the true interconnectivity between industry and government means accepting that the taxpayer contributes significantly to drug development yet doesn’t shoulder the entirety of the burden. As with most relationships, the dynamic between the NIH and industry is messy, and quantifying their financial contributions to the broader ecosystem is far from straightforward.

Irrespective of one’s position on the government vs. industry debate, we can all agree that the US, despite its imperfections, is unquestionably the epicenter of global R&D innovation. The delicately balanced ecosystem that fuels this innovation works best when the NIH lays the foundational groundwork that enables the industry to advance the right drugs from bench to bedside.

Hopefully, this simple yet powerful truth can reshape the contentious dispute into a productive negotiation among partners, rather than adversaries, who share the common goal of helping patients. If not, the government and those it governs may get a lot more than they bargained for.

 

Areesha Saif

Areesha is a Biopharma consultant supporting pharmaceutical and biotech companies with corporate strategy, product commercialization, and market expansion. She is deeply passionate about all things biotech and is particularly interested in the intersection of science and business. She earned her Bachelor's and Master's degrees in chemistry from the University of Cambridge and currently resides in Chicago.
