Defense Acquisition Research Journal #91

January 2020

We were also told that OUSD(AT&L)/Acquisition Resources and Analysis (ARA) historically performed checks on Service-submitted SARs, but that they typically did not have enough time between submission and approval to conduct a thorough validation.13 All of the draft SARs arrived at OSD in the same season. About a week after the data arrived, ARA met with each PM for about 1 hour, at which time ARA could ask questions. They felt that this process was insufficient. The best way to improve OSD's review is probably not solely to add more time. While more time might help, OSD would probably also benefit from specialized tools to help analysts quickly compare the draft SAR data against budget submissions, prior-year SARs, and general rules about how acquisition programs typically behave. Proposing improvements to that process is beyond the scope of this article.

A Thought Experiment: JLTV

To illustrate the kind of reporting that would be necessary to improve both oversight and data utility for cost analysts, we looked at the JLTV program. We determined that, at the beginning of the program, as many as seven distinct subprograms might have been appropriate, as indicated in Table 2. The full analysis is in our completed report (Davis et al., 2017).

TABLE 2. SUGGESTED INITIAL JLTV SUBPROGRAMS

Number  Subprogram                            Description
1       Utility                               Base design Utility Vehicle
2       General Purpose Vehicle (GPV)         Base design GPV
3       Heavy Guns Carrier (HGC)              Base design HGC
4       Close Combat Weapons Carrier (CCWC)   Base design CCWC
5       P3I – Common                          P3I common to all variants
6       P3I – HGC and CCWC                    P3I specific to HGC and CCWC
7       Support equipment                     Trailers, armor kits, etc.

Reporting at this level of detail would not be practical if a Nunn-McCurdy breach could be triggered by any one of the subprograms.
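To make that concern concrete, consider a minimal sketch of the breach arithmetic. The 15/30 percent (significant) and 25/50 percent (critical) unit-cost growth thresholds follow the Nunn-McCurdy provisions of 10 U.S.C. § 2433; the subprogram names echo Table 2, but the cost figures are hypothetical and chosen only for illustration:

```python
# Nunn-McCurdy unit-cost growth thresholds (10 U.S.C. § 2433):
# "significant" at 15% over current baseline or 30% over original;
# "critical" at 25% over current baseline or 50% over original.
SIGNIFICANT = {"current": 0.15, "original": 0.30}
CRITICAL = {"current": 0.25, "original": 0.50}

def breach_status(unit_cost, current_baseline, original_baseline):
    """Classify unit-cost growth against the statutory thresholds."""
    growth_current = unit_cost / current_baseline - 1.0
    growth_original = unit_cost / original_baseline - 1.0
    if (growth_current >= CRITICAL["current"]
            or growth_original >= CRITICAL["original"]):
        return "critical"
    if (growth_current >= SIGNIFICANT["current"]
            or growth_original >= SIGNIFICANT["original"]):
        return "significant"
    return "none"

# Hypothetical figures: (unit cost, current baseline, original baseline).
# If each subprogram carries its own baseline, any single overrunning
# subprogram can trigger a breach even when the others are healthy.
subprograms = {
    "Utility": (1.05, 1.00, 1.00),        # 5% growth: no breach
    "P3I - Common": (1.20, 1.00, 1.00),   # 20% growth: significant breach
}
statuses = {name: breach_status(*c) for name, c in subprograms.items()}
program_breach = any(s != "none" for s in statuses.values())
```

With seven separately baselined subprograms, the program would face seven independent chances of a reportable breach, which is the impracticality noted above.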


Defense ARJ, January 2020, Vol. 27, No. 1: 28-59
