How Much Is Enough?

Most people know that research and development (R&D) plays a crucial role in U.S. security and economic health. Yet for the past several decades, federal spending on R&D has been declining. The president's infrastructure proposal and the Endless Frontier Act would increase federal R&D spending and help reverse this damaging trend, particularly for basic research.

Engineer Vannevar Bush, author of the 1945 report the Endless Frontier Act commemorates, defined basic research as producing "general knowledge and understanding of nature and its laws." Basic research has no immediate commercial application, which is why companies spend comparatively little on it, and years or even decades can pass between discovery and product. Progress in fundamental physics in the 1920s led to applied research only in the 1950s, creating the semiconductor industry we depend on today. Governments are best placed to absorb the risk and the lack of immediate returns that basic research entails, which is why the shortfall in federal R&D is so damaging.

But how much should the federal government spend? One easy answer is to benchmark against other nations, in particular China. The simple metric: China has increased its R&D spending year after year while U.S. spending has remained flat. That China is closing the research gap has profound implications for which country will hold the military and economic advantage. This comparison does not capture differences in the two countries' return on investment; in general, U.S. R&D spending is more productive than China's. Even so, China's spending makes clear that the United States must spend more if it wishes to keep up.

Aggregate R&D numbers also conceal important characteristics. Overall, U.S. private spending on R&D is healthy, but business spending emphasizes the "D": development and applied research that will produce returns for the company. This is what companies should spend on to maintain profitability, but the result is underinvestment in basic research.

Federal R&D funding is also imbalanced. After Congress decided in the 1990s to fund health-related research, investments in the National Institutes of Health (NIH) made the United States a global leader in biotechnology. Congress did not provide similar funding for the "hard sciences": physics, chemistry, math, materials science, and the other fields supported by the National Science Foundation (NSF). These are crucial for national security, yet the NIH receives roughly five times the funding provided to the NSF. To be clear, this is not an argument for cutting NIH funding, but for bringing NSF funding up to the same level. Additionally, in the 2000s, defense spending (a major component of federal R&D) was refocused on developing technologies to help win the wars in Iraq and Afghanistan. The result of these trends from the 1990s and 2000s is a shortfall in the R&D needed to win today's contests.

There are encouraging signs that the shortfall may be ending. The previous administration proposed $134 billion for federal R&D spending in 2020; the Biden administration is now asking for almost double that, around $250 billion. For context, big tech firms in R&D-intensive industries devote up to 20 percent of revenue to R&D. If national GDP served as a proxy for revenue, the United States would need to spend $4.2 trillion, which will never happen. If total federal spending is the proxy ($6.5 trillion in 2020), a federal R&D share matching that of R&D-intensive companies would still be $1.3 trillion, 10 times the 2020 proposal. That, too, is an unfair comparison, since the federal government has many "fixed costs," like defense and social services. Looking only at discretionary spending, which the previous administration had budgeted at $1.486 trillion, 20 percent of a research-intensive U.S. budget would be $297 billion.

Another way to gauge the needed increase is to look at what the United States spent in past conflicts. In some cases, spending was distorted by a single expensive program, like the atomic bomb in World War II or the Apollo program in the 1960s. Peak spending in 1965, during the moon race, was 11.7 percent of total federal outlays, equivalent to about $565 billion today (noting that nondiscretionary expenses were smaller at that time). Peak federal R&D spending in the Ronald Reagan years, at the height of the confrontation with the Soviets, reached 1.2 percent of GDP, the equivalent of $258 billion today (noting that health research was a smaller share of the total then).
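The present-day bases these historical shares are applied to are not stated in the text, but they can be inferred by dividing each dollar figure by its percentage; the implied values derived below are my reconstruction, not figures from the article:

```python
# Infer the present-day bases behind the article's historical benchmarks.
# Dollar figures ($ billions) are from the article; the implied bases
# (federal outlays and GDP) are derived here, not stated in the source.

moon_race_share = 0.117   # 1965 peak: 11.7% of total federal outlays
moon_race_today = 565     # equivalent today, $ billions
implied_outlays = moon_race_today / moon_race_share

reagan_share = 0.012      # Reagan-era peak: 1.2% of GDP
reagan_today = 258        # equivalent today, $ billions
implied_gdp = reagan_today / reagan_share

print(f"Implied federal outlays: ${implied_outlays / 1_000:.1f} trillion")
print(f"Implied GDP:             ${implied_gdp / 1_000:.1f} trillion")
```

The implied bases (roughly $4.8 trillion in outlays and $21.5 trillion in GDP) are consistent with pre-pandemic federal budget proposals and 2020-era GDP, which suggests the two benchmarks were computed on comparable terms.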

These numbers point to spending between $250 billion and $300 billion, but after decades of underinvestment, the United States has some catching up to do. Catching up is not as simple as dumping new money into the national innovation base, which will need to be re-funded, expanded, and restructured to fit how innovation works today: driven by the private sector and increasingly global. That restructuring will take time, which points to a second issue, duration. U.S. spending on R&D was consistently strong from 1958 to 1990, during the confrontation with the Soviets, because R&D investment was seen as integral to the national defense. To compete with China, the United States must plan for years of sustained funding, not a one-off replenishment.

For perspective on whether the United States can afford to increase R&D spending, a generally accepted figure is that the country spent more than $2 trillion in direct costs on its counterterrorism wars, with estimates of indirect costs averaging more than $330 billion annually. Had some of those trillions funded additional R&D, the United States would be much better placed for the contest with China. Whatever the outcome of these conflicts, the United States cannot seek the peace dividend that usually follows the end of its wars. It is moving from one set of contests into another, more dangerous set in which national security requires more spending on R&D. From this perspective, the $250 billion proposed for 2021, while reasonable, may be the low end of the scale.

James Andrew Lewis is a senior vice president and director of the Strategic Technologies Program at the Center for Strategic and International Studies (CSIS) in Washington, D.C.

Commentary is produced by the Center for Strategic and International Studies (CSIS), a private, tax-exempt institution focusing on international public policy issues. Its research is nonpartisan and nonproprietary. CSIS does not take specific policy positions. Accordingly, all views, positions, and conclusions expressed in this publication should be understood to be solely those of the author(s).

© 2021 by the Center for Strategic and International Studies. All rights reserved.
