Cycle Times and Cycles of Acquisition Reform

CSIS Briefs

The Issue

Acquisition reform occurs in cycles. For example, to increase acquisition speed, the most recent cycle restructured the Pentagon to reduce and decentralize the Office of the Secretary of Defense’s (OSD) oversight of major defense acquisition programs (MDAPs). Using a qualitative and quantitative analysis of past reform cycles and MDAP cycle times (i.e., the time to field new capabilities), this analysis observes that: 

  • Even though OSD oversight activities take time, they do not appreciably slow down MDAP acquisition speed;
  • Instead, strong, centralized OSD oversight may reduce MDAP cycle times and cycle time growth; and
  • The Pentagon has historically fielded new MDAP capabilities at average speeds that are comparable to external benchmarks.

Based on these findings, recent reforms—which reduced and decentralized OSD oversight—may not increase MDAP acquisition speed. And although the Pentagon does field some MDAPs quite slowly, reformers should not use the experience of worst-case programs to motivate future reforms of the entire acquisition system.


Introduction

In defense acquisition, reform is constant. Over the past six decades, reforms have been initiated, implemented, and evaluated, only to be initiated all over again. This pattern—and its repetition throughout history—has led some to describe acquisition reform as a “never-ending cycle” whereby discrete periods of time are characterized by different initiatives.1 Although these initiatives consistently seek to reduce cost, shorten schedules, and increase performance, reformers’ priorities have varied throughout history. Today’s reformers, for example, are focused primarily on acquisition speed (e.g., see the National Defense Authorization Act (NDAA) 2016 Secs. 804, 810, 821, 823, 825 and NDAA 2017 Secs. 805, 806, 807, 901).2

Reformers’ focus on speed is due, in part, to perceptions that U.S. technological advantage vis-à-vis its adversaries is eroding and that the timelines to field new capabilities are dramatically different between the Department of Defense (DOD) and the U.S. private sector.3 To evaluate those perceptions, this brief compares MDAP cycle times to external benchmarks. It also evaluates recent reforms’ potential to increase acquisition speed by comparing cycle time statistics across the various cycles of acquisition reform.

Acquisition Speed

Today’s reforms aim to speed up the acquisition process as defined by DOD Directive 5000.1. The traditional process, depicted in Figure 1, consists of several milestones. At each milestone, senior DOD officials review progress and determine whether programs should continue to the next phase. Traditionally, officials from the OSD have reviewed and approved DOD’s largest programs (i.e., MDAPs).

DOD typically initiates MDAPs at milestone B, after which full-scale system engineering begins. Next, DOD reviews system designs at milestone C. Pending milestone approval, programs begin low-rate production and system testing. Once test results are satisfactory, DOD certifies that programs have reached initial operating capability (IOC) and that systems are ready for use.

Today’s reforms aim to shorten the time spent between program initiation and IOC. To achieve this objective, reformers created alternative acquisition pathways (e.g., NDAA 2016 Sec. 804’s “middle tier acquisition”) that largely eschew traditional, OSD-led oversight activities.4 Reformers also delegated much of OSD’s authority to conduct MDAP milestone reviews back to the military services.5

Given their attention to speed, reformers’ focus on OSD oversight is unsurprising. Oversight—which often takes the form of reporting requirements and reviews—can lengthen program schedules by adding activities that take time to complete. For example, the Government Accountability Office found that, in a sample of 24 programs, staff spent an average of two years completing the steps necessary to pass an OSD-led milestone review and 5,600 total staff days documenting that work.6 Relatedly, RAND found that 5 percent of a program office staff’s time was dedicated to regulatory and statutory compliance,7 and researchers at the George Washington University found that between 5 and 40 percent of a contractor’s time was spent complying with oversight requirements.8 By decentralizing and delegating acquisition oversight, today’s reformers hope to reduce the time that programs dedicate to OSD-led oversight activities, thereby shortening the duration between program initiation and IOC. DOD, through its National Defense Strategy, has embraced reformers’ focus on speed and affirmed that it must “deliver performance at the speed of relevance.”9

Reform Cycles

Importantly, today’s focus on speed is not unique. Rather, recent moves to decentralize OSD oversight follow nearly six decades and multiple cycles of prior acquisition reform. Although the specifics of each reform initiative are distinct and complex, from a macroscopic perspective it is possible to characterize past cycles according to the mechanisms that reformers employed. This brief focuses on one mechanism—OSD oversight’s centralization or decentralization—which has been both the focus of prior research and which uniquely affects MDAPs.10 The brief acknowledges, however, that reformers sometimes employ multiple mechanisms simultaneously and that these mechanisms may interact in non-simple, non-obvious ways. This analysis, therefore, provides just one perspective on acquisition reform cycles and MDAP cycle times.

Acknowledging these limitations, Table 1 identifies eight reform cycles—including today’s—and classifies those cycles according to their preference for centralized or decentralized oversight.11 These cycles are also summarized briefly below:

  • McNamara Reforms: Secretary Robert McNamara leveraged authorities granted by the DOD Reorganization Act of 1958 to centralize OSD control over military service budgets and major program decisions.12

  • Defense Systems Acquisition Reform Council: Deputy Secretary David Packard created the Defense Systems Acquisition Reform Council (DSARC) to limit OSD involvement in the acquisition process. Through the DSARC, OSD assessed programs at discrete milestones but otherwise delegated management responsibility to the military services.13

  • Brown Strengthens Control: In response to Packard’s “management by objective” approach, Secretary Harold Brown sought to regain and centralize OSD authority over the acquisition process.14

  • Acquisition Improvement Program: In response to Brown’s tighter OSD control, Secretary Caspar Weinberger and Deputy Secretary Frank Carlucci initiated the Acquisition Improvement Program to enable the “controlled decentralization” of OSD’s authority.15

  • Defense Acquisition Board: Congress initiated a series of reforms—including the creation of an undersecretary of defense for acquisition—aimed at centralizing and strengthening OSD control over the acquisition process.16 Toward this end, OSD established the Defense Acquisition Board to oversee MDAPs throughout their lifecycle.17

  • Mandate for Change and Transformation: During this extended period—which spanned nearly two administrations—OSD emphasized deregulation and management streamlining rather than scrupulous oversight of early program decisions.18 DOD also relied heavily on Total System Performance Responsibility (TSPR) contracts during this period. These contracts delegated a significant amount of authority and responsibility to DOD contractors and, in doing so, eroded the department’s ability to conduct rigorous oversight.19

  • Weapon Systems Acquisition Reform Act: Responding to cost growth during the prior cycle, Congress implemented a series of reforms aimed at centralizing OSD authority—especially over early program milestones.20 OSD’s Better Buying Power initiative attempted to further strengthen program management throughout the system lifecycle.21

  • Restructuring AT&L: Today’s reformers intend to increase acquisition speed and strengthen DOD’s technological edge by splitting up the Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics into two separate offices. To reduce cycle times, the procurement-focused office has delegated much of its oversight authority to the military services.22

These cycles provide a framework for assessing DOD’s historic acquisition speed. Specifically, by classifying programs according to reform cycle or cycle type (i.e., centralized or decentralized oversight), it is possible to observe past reforms’ macroscopic impact on acquisition speed. This analysis can then be used to inform expectations for today’s reforms and to help benchmark DOD’s future “speed of relevance.”

Cycle Times

Acquisition speed can be assessed using two variables: cycle time and cycle time growth. Cycle time is the time elapsed between program initiation (typically milestone B, but sometimes milestone C) and IOC.23 Cycle time growth is the percent difference between a program’s estimated and actual cycle times.24 Cycle time, therefore, represents the speed with which DOD fields new capabilities. Cycle time growth represents the accuracy with which DOD is able to predict that speed.
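To make these definitions concrete, the following sketch computes both quantities for a single program. It is a minimal illustration only: the function names, dates, and schedule estimate are hypothetical and are not drawn from the brief’s data set.

```python
from datetime import date

def cycle_time_years(initiation: date, ioc: date) -> float:
    """Cycle time: years elapsed between program initiation (milestone B or C) and IOC."""
    return (ioc - initiation).days / 365.25

def cycle_time_growth_pct(estimated_years: float, actual_years: float) -> float:
    """Cycle time growth: percent change between the estimated and actual cycle time."""
    return 100.0 * (actual_years - estimated_years) / estimated_years

# Hypothetical program: initiated at milestone B in March 2005 with an
# estimated 5.5-year schedule, reaching IOC in January 2012.
actual = cycle_time_years(date(2005, 3, 1), date(2012, 1, 15))
growth = cycle_time_growth_pct(5.5, actual)
print(f"cycle time = {actual:.1f} years, cycle time growth = {growth:.1f}%")
```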

Using data from the Defense Acquisition Management Information Retrieval (DAMIR) System and RAND’s Defense Systems Cost Performance Database,25 cycle time and cycle time growth were calculated for all MDAP programs and subprograms for which data was available.26 MDAPs represent DOD’s most costly and complex programs; therefore, in many ways, they are not representative of much of the technology that DOD acquires. However, MDAP data is readily available. Furthermore, changes to OSD oversight affect MDAPs more significantly than any other programs. For these reasons, this analysis and its conclusions are limited to MDAPs only.

Additionally, several assumptions were made when collecting and labeling data. Most importantly, it is assumed that MDAPs are most significantly affected by the policies in place at program initiation.27 Therefore, even if MDAPs spanned more than one reform cycle, they are classified according to the cycle in which they were initiated. It is also important to note that MDAP schedule data is not always reliable or of high quality; therefore, many other assumptions were required to collect the data, and these assumptions may affect the analysis results. For more detail on the data collection and analysis assumptions used in this brief, please refer to a forthcoming report on this topic, as well as to the detailed endnotes provided at the conclusion of this brief.28

Ultimately, schedule data was collected for over 200 active and complete MDAP programs and subprograms that DOD initiated from fiscal year (FY) 1963 to the present.29 Using this data, it can be observed that despite numerous reform cycles, acquisition speed has remained relatively constant throughout history. As can be seen in Figures 2 and 3, acquisition speed—at least for complete MDAPs—has not significantly changed over time.30 Although an association between speed and initiation date was observed for active programs, these differences are not attributed to substantive distinctions between active and complete MDAPs.31 Rather, it is more likely that active programs optimistically estimated their cycle times and are too immature to have yet experienced much cycle time growth. To avoid this maturity bias in subsequent analysis, only complete MDAPs and active MDAPs that are at least five years past their initiation date were included.32
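As a rough illustration of the maturity filter just described, the sketch below keeps complete programs plus active programs initiated at least five years ago. The column names, current fiscal year, and sample records are all assumptions for illustration; the real data come from DAMIR and RAND’s database, whose field names may differ.

```python
import pandas as pd

# Illustrative maturity filter: keep complete MDAPs plus active MDAPs that
# are at least five years past initiation. Column names are assumed, not
# the actual DAMIR/RAND field names.
def filter_mature(df: pd.DataFrame, current_fy: int = 2020) -> pd.DataFrame:
    complete = df["status"] == "complete"
    seasoned_active = (df["status"] == "active") & (current_fy - df["initiation_fy"] >= 5)
    return df[complete | seasoned_active]

# Hypothetical records, for illustration only.
programs = pd.DataFrame({
    "program": ["A", "B", "C"],
    "status": ["complete", "active", "active"],
    "initiation_fy": [1998, 2012, 2017],
    "cycle_time_yrs": [6.2, 8.1, 3.0],
})

mature = filter_mature(programs)
print(mature["cycle_time_yrs"].mean(), mature["cycle_time_yrs"].median())
```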

Overall, DOD has historically fielded MDAPs with an average cycle time of 6.9 years and average cycle time growth of 31.3 percent. DOD’s historic medians for cycle time and cycle time growth were 6.6 years and 15.4 percent, respectively. These statistics stand in contrast to much of the rhetoric surrounding recent reforms, which frequently suggests that DOD’s historic cycle times are much longer.33

The data also yields insights when viewed through the lens of historic reform cycles. Table 2 shows that earlier reform cycles (i.e., cycles #1-3) had lower mean and median cycle times relative to more recent cycles (i.e., cycles #4-6). Although the distribution of cycle times was significantly different between groups, this difference cannot be attributed to changes in oversight type, since OSD oversight was both centralized and decentralized during both periods.34 Future research, therefore, should explore alternative explanations for the difference between early and later reform cycles.

Tables 2 and 3 also show that cycle #7, which immediately preceded today’s reforms, marked an increase in acquisition speed compared to the cycle immediately prior (i.e., cycle #6). In this instance, differences in both the cycle time and percent cycle time growth distributions were statistically significant.35 This finding suggests that the reforms implemented during cycle #7 positively impacted program outcomes, at least as compared to the cycle immediately prior. Further, this improvement does not suggest an urgent need to reform the acquisition process, as reformers ultimately did by decentralizing OSD oversight in cycle #8.

Additionally, the historical data shows little evidence that decentralizing OSD oversight actually increases acquisition speed. Instead, as shown in Table 4, MDAPs initiated during periods of decentralized oversight reached IOC an average of 15.6 months slower than MDAPs initiated during periods of centralized oversight. The disparity in medians was also substantial, with decentralized-oversight MDAPs again reaching IOC 15.6 months slower. The difference in distributions was also statistically significant, suggesting that decentralizing OSD oversight may not be an effective mechanism for reducing MDAP cycle time.36

Similarly, the data shows that MDAPs initiated during periods of decentralized oversight experienced an average of 15.6 percent more cycle time growth. The difference in medians—10.5 percent—was also substantial; however, the difference in distributions had a lower level of statistical significance than the comparisons described above.37 That said, these results still suggest that decentralizing OSD oversight may not be an effective mechanism for reducing MDAP cycle time growth.
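The brief does not specify which statistical test underlies these distribution comparisons; the sketch below shows one common, nonparametric option (a two-sided Mann-Whitney U test) applied to placeholder samples. The sample sizes, means, and spreads are invented for illustration and do not reproduce the brief’s results.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Placeholder cycle-time samples (in years); the actual per-program values
# come from DAMIR and RAND's Defense Systems Cost Performance Database.
centralized = rng.normal(loc=6.3, scale=2.0, size=120).clip(min=1.0)
decentralized = rng.normal(loc=7.6, scale=2.5, size=90).clip(min=1.0)

# Compare the two distributions without assuming normality.
stat, p_value = stats.mannwhitneyu(centralized, decentralized, alternative="two-sided")

print(f"U = {stat:.1f}, p = {p_value:.4f}")
print(f"difference in means = {(decentralized.mean() - centralized.mean()) * 12:.1f} months")
```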

This analysis suggests that even though OSD oversight activities take time, they do not result in appreciably longer MDAP cycle times or higher rates of cycle time growth. Instead, it seems likely that other technical factors—such as system type and complexity—may determine an MDAP’s critical path and schedule duration. Additionally, the data suggests that strong, centralized OSD oversight may reduce cycle times and cycle time growth—perhaps by serving as a “check” on the military services’ tendency to “sell” their programs using optimistic cost and schedule estimates.38

Overall, the data also suggests that today’s reforms—cycle #8, which decentralized OSD oversight—may not increase MDAP acquisition speed in the future. Given the National Defense Strategy’s intent to “deliver performance at the speed of relevance,” this outcome seems troubling.39 Thankfully, by comparing the historic MDAP data to external benchmarks, there are some indications that—despite decades of reform—DOD’s acquisition system, on average, may already field systems at the “speed of relevance.”

Even though oversight activities take time, they do not result in appreciably longer MDAP cycle times or higher rates of cycle time growth; instead, strong, centralized OSD oversight may reduce cycle times and cycle time growth.

Assessing the “Speed of Relevance”

To assess whether DOD fields systems at the “speed of relevance,” DOD cycle times were compared to external benchmarks from the U.S. private sector and China’s People’s Liberation Army (PLA). The comparisons are limited, however, by the availability and quality of open-source data. The best option, therefore, is to compare the data set of over 200 MDAP cycle times to a handful of benchmark systems with rough schedule estimates.

To estimate non-DOD cycle times, the analysis leverages a DARPA report that contains data on the U.S. private sector and uses open-source reporting on PLA systems. In both instances, it is assumed that the dates reported are consistent with the definitions of program initiation and IOC that were used for MDAPs. For PLA systems in particular, program initiation dates were identified using media reports which stated when the PLA began system development or issued contracts. Such assumptions, of course, limit the ability to draw definitive conclusions. As such, U.S. private-sector and PLA cycle times were used only as rough benchmarks for the “speed of relevance.”

Acknowledging these limitations and using DARPA’s U.S. private-sector data, commercial aircraft cycle times increased from approximately four to seven years since 1965.40 Commercial vehicle cycle times decreased during this time, from approximately seven to two years.41 As shown in Table 5, DOD’s mean aircraft and vehicle cycle times are consistent with the U.S. private sector, but DOD’s worst-case MDAPs significantly exceeded private-sector cycle times. As above, Table 5 contains all complete MDAPs and active MDAPs initiated between FY 1963 and FY 2014 for which data was available.

Based on limited, open-source data on example PLA systems, DOD average cycle times, for the most part, appear to outpace comparable PLA systems—even though the PLA frequently accelerates technology development using espionage, intellectual property theft, and foreign military procurement.42 For example, although DOD’s mean cycle time for aircraft is 6.6 years, the PLA appears to have fielded the J-20 and the Y-20 in approximately 15 and 10 years, respectively.43 Compared to the DOD aircraft shown in Table 5, these example PLA cycle times are closer to DOD’s worst-case cycle time for aircraft.

DOD’s mean cycle time for subs and ships—7.5 years—also appears to outpace some open-source PLA examples. For instance, the PLA appears to have fielded both the Type 093 Shang-class submarine and the Type 052A destroyer in approximately 10 years.44 Notably, the PLA appears to have fielded its new aircraft carrier, the Type 001A Shandong (CV-17), rather quickly, in approximately five years.45 Compared to DOD capabilities, however, many of these benchmark systems appear inferior by at least some performance metrics.46 In each example, however, the PLA’s cycle times do appear to outpace DOD’s worst-case cycle times.

While these comparisons are limited by the availability and quality of data, examples of U.S. private-sector and PLA cycle times provide a rough benchmark for the “speed of relevance.” Using this benchmark, it appears that DOD’s acquisition system has, on average, historically fielded MDAPs at the “speed of relevance.” Note, however, that several of DOD’s worst-case cycle times did significantly exceed the cycle times of benchmark systems.

For these worst-case programs, further study is needed—ideally using rigorous qualitative methods such as process tracing—to map a program’s activities from initiation to IOC and to identify bottlenecks that could be avoided in future programs.47 Reformers should be cautioned, however, against using these worst-case programs to assess the performance of the entire acquisition system. A worst-case MDAP may have been slowed down by a myriad of factors (e.g., issues with requirements, personnel, funding, contracts, or the industrial base) that do not affect other programs in the same way. So, although DOD should learn from and address these issues on a case-by-case basis, the experience of worst-case programs should not be used to motivate future reforms of the entire acquisition system.

It appears that DOD’s acquisition system has, on average, historically fielded MDAPs at the “speed of relevance.”

The Future for Reform

This brief demonstrates the utility of using acquisition history to improve the defense community’s understanding of current and future reforms. Using a mix of qualitative and quantitative analysis, this brief observes that reforms that decentralize OSD oversight do not appreciably decrease MDAP cycle time. Instead, it finds that centralized OSD oversight may help reduce cycle times and cycle time growth. Based on these findings, recent reforms—which instead decentralized OSD oversight—may be ill-suited to achieve their objective of increasing speed, at least for MDAPs. However, MDAPs are DOD’s most costly and complex programs and do not represent all of the technology that DOD acquires. Acquisition reform itself is complex, and countless factors besides OSD oversight—including workforce, industrial base health, budget, and regulations—all affect acquisition speed in non-simple and non-obvious ways. This analysis contributes but one perspective on reform cycles and cycle times within an extensive history of acquisition reform.

Morgan Dwyer is a fellow in the International Security Program and deputy director for policy analysis in the Defense-Industrial Initiatives Group at the Center for Strategic and International Studies (CSIS) in Washington, D.C. Brenen Tidwell was a research intern with the Defense-Industrial Initiatives Group at CSIS. Alec Blivas was a program coordinator with the International Security Program at CSIS.

This material is based upon work supported by the Acquisition Research Program under Grant No. HQ00341910011. The views expressed in written materials or publications, and/or made by speakers, moderators, and presenters, do not necessarily reflect the official policies of the Department of Defense nor does mention of trade names, commercial practices, or organizations imply endorsement by the U.S. government.

An earlier version of this paper was published in the Proceedings of the Seventeenth Annual Acquisition Research Symposium.

CSIS Briefs are produced by the Center for Strategic and International Studies (CSIS), a private, tax-exempt institution focusing on international public policy issues. Its research is nonpartisan and nonproprietary. CSIS does not take specific policy positions. Accordingly, all views, positions, and conclusions expressed in this publication should be understood to be solely those of the author(s).

© 2020 by the Center for Strategic and International Studies. All rights reserved.

Please consult the PDF for references.
