Defense Acquisition Guidebook: Production Date: 16 September 2013
DEFENSE ACQUISITION GUIDEBOOK -
Foreword
This document is an accurate representation of the content posted on the DAG website for this Chapter, as of the date of production listed on the cover. Please refer to the DAG website for the most up-to-date guidance at https://dag.dau.mil
Acquisition Strategy.
Chapter 6, Human Systems Integration , addresses the human systems elements of the
systems engineering process. It will help the program manager design and develop
systems that effectively and affordably integrate with human capabilities and limitations;
and it makes the program manager aware of the staff resources available to assist in
this endeavor.
program decisions and tailoring based on program type and acquisition category,
executive-level decision forums and the tenets and processes of Integrated Product
Teams (IPTs), program assessments, and periodic reporting. Additional chapter topics
include exit criteria, independent assessments, Acquisition Baseline Plan development
and management, and periodic reports for Major Acquisition Programs and Major
Automated Information Systems programs. The chapter also addresses Should-Cost
with a focus on controlling the cost of the actual work that the Department is doing and
expects to do.
Chapter 11, Program Management Activities , explains the additional activities and
decisions required of the program manager, not otherwise discussed in other chapters
of this Guidebook.
Chapter 12, Business Capability Life Cycle , provides guidance for executing the
Business Capability Lifecycle (BCL) and acquisition of defense business systems
(DBS). BCL is the overarching framework for the planning, design, acquisition,
deployment, operations, maintenance, and modernization of DBS.
Chapter 13, Program Protection , provides guidance and expectations for the major
activities associated with Program Protection.
DEFENSE ACQUISITION GUIDEBOOK
Chapter 1 -- Department of Defense Decision Support Systems
1.0. Overview
1.0.1. Purpose
1.0.2. Contents
1.0.1. Purpose
This chapter provides background information about the environment in which the
Department of Defense must operate to acquire new or modified materiel or services.
1.0.2. Contents
Section 1.1 presents an overview of each of the three principal decision support
systems used in the Department of Defense to acquire materiel and services, and
describes the integration of those systems. Sections 1.2 through 1.4 provide details of
each of these systems: Section 1.2 discusses the Planning, Programming, Budgeting,
and Execution process, employed by the Department of Defense to conduct strategic
planning and make resource allocation decisions; Section 1.3 discusses the Joint
Capabilities Integration and Development System used to determine military capability
needs; and Section 1.4 discusses the formal Defense Acquisition System used to
acquire that capability.
The Department of Defense has three principal decision-making support systems, all of
which have been significantly revised over the past few years. These systems are the
following:
Planning, Programming, Budgeting and Execution (PPBE) Process - The
Department's strategic planning, program development, and resource determination
process. The PPBE process is used to craft plans and programs that satisfy the
demands of the National Security Strategy within resource constraints.
Illustrated together in Figure 1.1.F1, the three systems provide an integrated approach
to strategic planning, identification of needs for military capabilities, systems acquisition,
and program and budget development. The next three sections provide brief
introductions to each of these decision support systems.
1.2. Planning, Programming, Budgeting and Execution (PPBE) Process
The purpose of the PPBE process is to allocate resources within the Department. It is
important for program managers and their staffs to be aware of the nature and timing of
each of the events in the PPBE process, since they may be called upon to provide
critical information that could be important to program funding and success.
In the PPBE process, the Secretary of Defense establishes policies, strategy, and
prioritized goals for the Department, which are subsequently used to guide resource
allocation decisions that balance the guidance with fiscal constraints. The PPBE
process consists of four distinct but overlapping phases:
Planning. The planning phase of PPBE is a collaborative effort by the Office of the
Secretary of Defense and the Joint Staff, in coordination with DoD components. It
begins with a resource-informed articulation of national defense policies and military
strategy known as the Defense Planning Guidance (DPG). The DPG is used to lead the
overall planning process.
Budgeting. The budgeting phase of PPBE occurs concurrently with the programming
phase; each DoD Component submits its proposed Budget Estimate Submission
(BES) simultaneously with its POM. The budget converts the programmatic view into
the format of the congressional appropriation structure, along with associated budget
justification documents. The budget is focused on one year, but with considerably more
financial details than the POM. Upon submission, each budget estimate is reviewed by
analysts from the office of the Under Secretary of Defense (Comptroller) and OMB.
Their review ensures that programs are funded in accordance with current financial
policies, and are properly and reasonably priced. Proposed budget changes are
presented to leadership for review and decisions are documented in the Resource
Management Decision (RMD) document. DoD Components use the RMD to update
their BES data sets, which are then incorporated into the Department's budget and
FYDP and submitted to OMB as part of the President's budget request.
Execution. The execution review occurs simultaneously with the program and budget
reviews. The execution review provides feedback to the senior leadership concerning
the effectiveness of current and prior resource allocations. Metrics are used to support
the execution review to measure actual output versus planned performance for defense
programs. To the extent performance goals of an existing program are not being met,
the execution review may lead to recommendations to adjust resources and/or
restructure programs to achieve desired performance goals.
JCIDS plays a key role in identifying the capabilities required by the warfighters to
support the National Security Strategy, the National Defense Strategy, and the National
Military Strategy. Successful delivery of those capabilities relies on the JCIDS process
working in concert with other joint and DOD decision processes. JCIDS procedures
support the Chairman and Joint Requirements Oversight Council (JROC) in advising the
Secretary of Defense on identifying and assessing joint military capability needs. JCIDS
is a joint-concepts-centric capabilities identification process that allows joint forces to
meet future military challenges. The JCIDS process assesses existing and proposed
capabilities in light of their contribution to future joint concepts. The JCIDS process was
created to support the statutory requirements of the JROC to validate joint warfighting
requirements. JCIDS is also a key supporting process for the DOD acquisition and
Planning, Programming, Budgeting and Execution (PPBE) processes. The primary
objective of the JCIDS process is to ensure that the capabilities required by the joint
warfighter to successfully execute assigned missions are identified, together with their
associated operational performance criteria. This is done through an open process that
provides the JROC the information it needs to make decisions on required
capabilities. The requirements process supports the acquisition process by providing
validated capability needs and associated performance criteria to be used as a basis for
acquiring the right weapon systems. Additionally, JCIDS provides the PPBE process
with affordability advice supported by the capabilities-based assessment (CBA), and
identifies capability gaps and potential materiel and non-materiel solutions. While it
considers the full range of doctrine, organization, training, materiel, leadership and
education, personnel and facilities (DOTMLPF) solutions, for purposes of this
Guidebook, the focus is on the pursuit of "materiel" solutions.
JCIDS acknowledges the need to project and sustain joint forces and to conduct
flexible, distributed, and highly-networked operations. JCIDS is consistent with DoD
Directive 5000.01 direction for early and continuous collaboration throughout the
Department of Defense. JCIDS implements a capabilities-based approach that
leverages the expertise of government agencies, industry, and academia. JCIDS
encourages collaboration between operators and materiel providers early in the
process. JCIDS defines interoperable, joint capabilities that will best meet the future
needs. The broader DoD acquisition community must then deliver these technologically
sound, sustainable, and affordable increments of militarily useful capability to the
warfighters.
JCIDS informs the acquisition process by identifying and assessing joint military
capability needs that require a materiel solution; these identified capability needs then
serve as the basis for the development and production of acquisition programs. JCIDS
is fully described in CJCS Instruction 3170.01, signed by the Director of the Joint Chiefs
of Staff. This instruction establishes the policies for JCIDS, and provides a top-level
description of the process. A supplementary on-line manual, the JCIDS Manual,
provides the details necessary for the day-to-day work in identifying, describing, and
justifying joint warfighting capabilities. The manual also includes the formats that
describe the content required for each JCIDS document.
                            Figure 1.3.F1. JCIDS and Defense Acquisition
Figure 1.3.F1 depicts several key points. First, JCIDS is based on a series of top-down
analyses ultimately derived from formal strategic-level guidance, including the National
Security Strategy, National Defense Strategy, National Military Strategy, and the Report
of the Quadrennial Defense Review. Second, these analyses assess existing and
proposed capabilities in terms of their contribution to emerging joint warfighting
concepts. Moreover, rather than focusing on the capabilities of individual weapon
systems in isolation, the analyses assess capabilities in the context of integrated
architectures of multiple interoperable systems. Third, from these overarching concepts,
the JCIDS analysis process identifies capability gaps or shortcomings, and assesses
the risks associated with these gaps. These gaps may be addressed by a combination
of materiel and/or non-materiel solutions (non-materiel solutions would be changes to
doctrine, organization, training, leadership and education, personnel, and facilities).
Fourth, recommended materiel solutions, once approved, lead to acquisition programs.
JCIDS documents are provided for these programs at each acquisition milestone and
guide the subsequent development, production, and testing of the program. Further
information on Capabilities-Based Assessment, as well as the nature and role of the
Initial Capabilities Document, Capability Development Document, and Capability
Production Document, can be found in the JCIDS Manual.
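As a quick reference, the document-to-milestone pairing described above can be sketched as a simple lookup. This is a hedged illustration only; the structure and names are this sketch's own, and the authoritative pairings and their exceptions are in the JCIDS Manual.

```python
# Illustrative lookup of which JCIDS document typically supports each
# acquisition milestone, per the documents named in the text above.
# Consult the JCIDS Manual for the authoritative pairings.

JCIDS_DOCUMENT_BY_MILESTONE = {
    "A": "Initial Capabilities Document (ICD)",
    "B": "Capability Development Document (CDD)",
    "C": "Capability Production Document (CPD)",
}

def jcids_document_for(milestone: str) -> str:
    """Return the JCIDS document that typically supports a milestone."""
    return JCIDS_DOCUMENT_BY_MILESTONE[milestone]
```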
For Acquisition Category I and IA programs, and other programs designated as high-
interest, the JROC reviews and validates all JCIDS documents under its purview. For
Acquisition Category ID and IAM programs, the JROC makes recommendations to the
DAB, based on such reviews. Section 181 of title 10, United States Code, establishes
JROC responsibilities. The Vice Chairman of the Joint Chiefs of Staff chairs the JROC,
and is also a member of the DAB. The Vice Chiefs of each military service are members
of the JROC. Section 841 of the FY11 National Defense Authorization Act expanded the
role of the combatant commanders (or, when designated, their deputies) as members of
the JROC on matters related to their area of responsibility or when functions of that
command are being considered by the Council. The "Expanded JROC" brings together
key stakeholders from across the Department and, when appropriate, the interagency,
to shape decisions in support of the joint warfighter. These stakeholders provide
advisory support to the JROC. This same Act specifically designated the following
officials of the Department of Defense as civilian advisors:
The Defense Acquisition System is the management process for all DoD acquisition
programs. DoD Directive 5000.01, The Defense Acquisition System , provides the
policies and principles that govern defense acquisition. DoD Instruction 5000.02,
Operation of the Defense Acquisition System , establishes the management framework
that implements these policies and principles. The Defense Acquisition Management
System is an event-based process. Acquisition programs proceed through a series of
milestone reviews and other decision points that may authorize entry into a significant
new program phase. Details of the reviews, decision points, and program phases are
found in Enclosure 2 of DoDI 5000.02. The Instruction also identifies the specific
statutory and regulatory information requirements for each milestone and decision point.
One key principle of the defense acquisition system is the use of acquisition categories,
where programs of increasing dollar value and management interest are subject to
increasing levels of oversight. DoD Instruction 5000.02 Enclosure 3 identifies the
specific dollar values and other thresholds for these acquisition categories. The most
expensive programs are known as Major Defense Acquisition Programs (MDAPs) or
Major Automated Information System (MAIS) programs. MDAPs and MAIS programs
have the most extensive statutory and regulatory reporting requirements. Some
elements of the defense acquisition system apply only to weapon systems, some
apply only to automated information systems, and some apply to both. DoD
Instruction 5000.02, Enclosures 2, 3, and 4 provide specific details.
The Under Secretary of Defense for Acquisition, Technology, and Logistics
(USD(AT&L)) is the Defense Acquisition Executive (DAE). The USD(AT&L) reviews
ACAT ID and IAM programs and is the Milestone Decision Authority (MDA). A Defense
Acquisition Board (DAB), chaired by the USD(AT&L), provides advice on critical
acquisition decisions. DAB members are senior officials from the Joint Staff, the Military
Departments, and staff offices within OSD.
The DAE may delegate decision authority for an MDAP or a MAIS to the DoD
Component Head, who may, and generally will, delegate decision authority to the
Component Acquisition Executive. Such delegation makes an MDAP program an ACAT
IC program and a MAIS program an ACAT IAC program.
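The re-designation rule in the paragraph above can be sketched as a small function. This is an illustration only; the function name and error handling are this sketch's assumptions, not Guidebook terminology.

```python
# Sketch of the ACAT re-designation described above: when the DAE
# delegates decision authority for an MDAP (ACAT ID) or a MAIS
# (ACAT IAM) program to the DoD Component, the program becomes
# ACAT IC or ACAT IAC, respectively.

DELEGATION_REDESIGNATION = {
    "ID": "IC",    # MDAP reviewed by the Component Acquisition Executive
    "IAM": "IAC",  # MAIS reviewed by the Component Acquisition Executive
}

def redesignate_on_delegation(acat: str) -> str:
    """Return the ACAT a program assumes once decision authority is
    delegated; raise if the text states no rule for that category."""
    if acat not in DELEGATION_REDESIGNATION:
        raise ValueError(f"no delegation rule stated for ACAT {acat}")
    return DELEGATION_REDESIGNATION[acat]
```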
DEFENSE ACQUISITION GUIDEBOOK
Chapter 2 - Program Strategies
2.0. Overview
2.0.1. Purpose
2.0.2. Content
2.0. Overview
This chapter discusses the development and management of program strategies (i.e.,
the Technology Development Strategy (TDS) and the Acquisition Strategy (AS)) for
Department of Defense acquisition programs. It addresses the information requirements
that the Program Manager must consider in preparing the TDS and the AS,
respectively.
2.0.1. Purpose
The purpose of this Chapter is to provide information and guidance needed to develop a
Technology Development Strategy and to develop and maintain a program-level
Acquisition Strategy. A program's strategy should be developed organically by the
Program Management Office in collaboration with related communities and
stakeholders.
2.0.2. Content
Section 2.1 describes Program Strategies in the broad sense. Section 2.2 discusses
Program Strategy Documentation Requirements; Section 2.3 discusses the relationship
of the Program Strategy to other program documents; Section 2.4 discusses the
relationship of the Program strategy to the Request for Proposal; Section 2.5 discusses
Security Classification Markings for Program Strategies; Section 2.6 describes the
Program Strategy approval process; and Section 2.7 is a high-level summary of some
fundamental differences between an acquisition plan and an Acquisition Strategy.
Section 2.8 addresses the Technology Development Strategy/Acquisition Strategy
outline.
Program strategies include the Technology Development Strategy (TDS) and the
Acquisition Strategy (AS).
Well-developed program strategies optimize the time and cost required to satisfy
approved capability needs. Program strategies should be expository in nature. That is,
they should express clearly the Program Manager's approach to developing and/or
procuring the materiel or service from a business, contracting, and programmatic point
of view. The focus of each strategy should be on the rationale for the approach, not
solely a description of the approach itself. The strategy should not be a repetition of
statute, policy, or regulation. It should describe what actions are being taken, and
to what end.
The Technology Development Strategy (TDS) must be approved prior to entry into the
Technology Development Phase and, in most cases, precedes the formal Acquisition
Strategy (AS). Two exceptions are:
The TDS serves as the basis for program acquisition activities in the Technology
Development Phase, moving toward a Milestone B decision. The TDS should serve as
an information baseline for efforts that continually evolve during the progression through
the acquisition management system and be incorporated into the initial Acquisition
Strategy (AS), as appropriate.
When submitting TDS and AS documents, DoD acquisition policy and associated
business practices require Program Managers to describe their business strategies in
substantial detail, including the overall approach, contract types, source selection
procedures, expected competition, and incentive structures.
The level of detail described below should be included in all TDS and AS documents to
ensure that the Milestone Decision Authority can make well-informed assessments of
the efficiency and effectiveness of the business arrangements that are planned. If this
information is not provided, program strategy approval will be delayed until it is made
available.
    3. Major Contract(s): Identify the number and type of contracts anticipated.
         o a. For each major contract planned (greater than $40 million [then-year
             dollars] for a Major Defense Acquisition Program and greater than $17
             million for a Major Automated Information System program) describe: what
             the basic contract buys; how major deliverable items are defined; options,
             if any, and prerequisites for exercising them; and the events established in
             the contract to support appropriate exit criteria for the phase or
             intermediate development activity.
         o b. Indicate whether a competitive award, sole-source award, or multiple-
             source development with down select to one production contract is
             contemplated. Describe how the strategy changes from core (initial) to
             subsequent increments. If a sole source is chosen, identify the exception
             to full and open competition that applies and provide justification for the
             duration and timing of the sole-source procurement.
         o c. Identify any special contracting considerations. Discuss any unique
             clauses/special provisions that will be included in the contract. Identify any
             special test and evaluation, unique tooling, or other similar contractual
             requirements.
         o d. Identify any other pertinent information that may ensure understanding
             of the contracting strategy to include, but not limited to, projected use of
             Government Furnished Property, plans to re-use hardware and software,
             safety office review/involvement, period of performance/length of contract,
             and contract format.
         o e. If a cost-type contract is to be used, provide information (an explanation
             of technical risk and the steps required to remediate the risk) with
             supporting documentation to support the Milestone Decision Authority's
             mandatory assessment that:
                  i. The program is so complex and technically challenging that it would
                      not be practicable to reduce program risk to a level that would
                      permit the use of a fixed-price contract.
                  ii. The complexity and technical challenge of the program is not the
                      result of failure to meet the requirements established in section
                      2366a of Title 10, United States Code.
    5. Technical Data Management: The strategy for Acquisition Category I and II
       programs shall assess the long-term technical data needs for the system and
       reflect that assessment in the Technical Data Rights Strategy that is included in
       both the TDS and the AS. The Technical Data Rights Strategy shall assess the
       data required to design, manufacture and sustain the system, as well as to
       support recompetition for production, sustainment or upgrades. It will also
       address the merits of a price-based option for the future delivery of technical data
       and intellectual property rights not acquired upon initial contract award, and
       consider the contractor's responsibility to verify any assertion of restricted use
       and release of data.
    6. Sustainment: The AS should provide an overview of the sustainment-related
       contract(s) and performance-based agreements with government and industry
       providers describing how the integrated product support package will be acquired
       for the system being supported. The discussion should include the
       contract/agreement and length along with: major terms and conditions;
       performance measures being used; and the portion of the system covered with
       the associated sustainment-related functions, plus hardware and data covered in
       each contract/agreement.
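The dollar thresholds in item 3.a above amount to a simple test for whether a planned contract is "major" and so must be described in the TDS or AS. A minimal sketch follows; the function and argument names are this illustration's own.

```python
# Sketch of the "major contract" threshold test from item 3.a above:
# contracts above $40M (then-year dollars) for an MDAP, or above $17M
# for a MAIS program, require the detailed description in the TDS/AS.

MAJOR_CONTRACT_THRESHOLDS = {
    "MDAP": 40_000_000,  # Major Defense Acquisition Program
    "MAIS": 17_000_000,  # Major Automated Information System program
}

def is_major_contract(program_type: str, contract_value_then_year: float) -> bool:
    """True if a planned contract exceeds the stated threshold for its
    program type and so must be described per item 3.a."""
    return contract_value_then_year > MAJOR_CONTRACT_THRESHOLDS[program_type]
```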
Each increment must be a militarily useful and supportable operational capability that
can be developed, produced, deployed, and sustained. Block upgrades, pre-planned
product improvement, and similar efforts that provide a significant increase in
operational capability are managed as separate increments.
If a major defense acquisition program requires the delivery of two or more categories of
end items which differ significantly from each other in form and function, the Defense
Acquisition Executive may designate each such category of end item as a major
subprogram for the purposes of acquisition reporting under title 10, United States
Code. An example
of the intended use for subprograms would be the designation of a satellite (subprogram
#1) and the affiliated ground control station (subprogram #2) under a total program
composed of both elements.
mechanisms are very similar.
Content of other documents, such as the Systems Engineering Plan, Life Cycle
Sustainment Plan, Program Protection Plan, and Test and Evaluation Master Plan
should all align with the TDS or AS content, with minimal overlap.
Until the Milestone Decision Authority has approved the program strategy (TDS or AS),
the formal RFP cannot be released, nor may any action be taken that would commit the
program to a particular contracting strategy.
The efforts defined in the approved program strategy for a given phase of the
acquisition life cycle must align with efforts to be put on contract for that phase.
The TDS/AS Outline presented in Section 2.8 of this chapter describes the structure for
a Program Strategy document.
information" in accordance with DoD Directive 5230.24 . Additionally, if the document
contains proprietary information, or is competition sensitive, it should be so marked and
appropriately handled.
In addition to displaying the correct markings, it is a good idea for a TDS or Acquisition
Strategy to have a distribution statement. An example follows:
For ACAT IC and IAC programs, MDA is delegated to the appropriate CAE by the
USD(AT&L).
For Major Defense Acquisition Programs (MDAPs), MDA approval of the Program
Strategy document is required prior to release of a Final Request for Proposal.
Programs may not proceed beyond a major milestone decision point (A, B, or C), the
pre-Engineering and Manufacturing Development (pre-EMD) review, or the Full-Rate
Production (FRP) Decision/Full Deployment Decision review without an MDA-approved
Strategy.
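The approval gate described above can be pictured as a simple check. This is a hedged sketch; the event names below are this illustration's shorthand for the decision points listed in the text.

```python
# Sketch of the MDAP approval gate described above: the final RFP may
# not be released, and the listed decision points may not be passed,
# without an MDA-approved program strategy (TDS or AS).

GATED_EVENTS = {
    "final_rfp_release",
    "milestone_a",
    "milestone_b",
    "milestone_c",
    "pre_emd_review",
    "frp_or_full_deployment_decision",
}

def may_proceed(event: str, strategy_approved: bool) -> bool:
    """Return True if an MDAP may proceed with the given event; gated
    events require an MDA-approved strategy."""
    return strategy_approved if event in GATED_EVENTS else True
```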
For ACAT ID, ACAT IAM, and OSD Special Interest programs, program strategy
documents are initially submitted to the office of the Director, Acquisition Resources and
Analysis (ARA) within the office of the Under Secretary of Defense for Acquisition,
Technology and Logistics (USD(AT&L)). ARA coordinates the documents with the
appropriate stakeholders prior to submitting to the USD(AT&L) for final approval.
Submittal of program strategies should be in accordance with the notional timelines
specified in the Defense Acquisition Board Preparation section of Chapter 10 .
2.7. Acquisition Strategy versus Acquisition Plan
An Acquisition Plan is prepared by the Contracting Officer and formally documents the
specific actions necessary to execute the approach delineated in the approved
Acquisition Strategy. The Acquisition Plan serves as the basis for contractual
implementation as referenced in Federal Acquisition Regulation (FAR) Subpart 7.1 and
Defense Federal Acquisition Regulation Supplement (DFARS) Subpart 207.1 .
The Federal Acquisition Regulation requires acquisition planning for all Federal
procurements, and the Defense Federal Acquisition Regulation Supplement requires
Program Managers to prepare written Acquisition Plans (APs) for most acquisitions
exceeding $10 million. An AP is execution-oriented and contract-focused, normally
relating to a single contractual action; an Acquisition Strategy covers the entire
program and may reflect the efforts of multiple contractual actions.
There is no DoD-level rule that precludes the Program Manager from preparing a single
document to satisfy both requirements. In fact, FAR 34.004 dealing with major systems
acquisition requires that the Acquisition Strategy "qualify" as the AP. However, in
practice, DoD Components often prefer to provide a more general Acquisition Strategy
to the Milestone Decision Authority (MDA) for approval and choose to prepare a
separate, more detailed AP. If a separate AP is prepared, it may not be approved until
after the Acquisition Strategy has been approved.
The distinctions between the requirement for the Acquisition Strategy and the
requirement for the AP are summarized in table 2.7.1.1.F1 .
 Table 2.7.1.1.F1. Summary of Distinctions between the Acquisition Strategy and
                                Acquisition Plan
Level of Detail
    Acquisition Strategy: Strategy level. Needed by the MDA for decision-making. Also
    the planning level for some discrete information requirements.
    Acquisition Plan: Execution level. Provides the detail necessary to execute the
    approach established in the approved Acquisition Strategy, guide contractual
    implementation, and conduct acquisitions.
Content
    Acquisition Strategy: Prescribed by DoD Instruction 5000.02; additional guidance
    in the Defense Acquisition Guidebook.
    Acquisition Plan: Prescribed by FAR 7.1; DFARS 207.
Individual Responsible for Preparing the Document
    Acquisition Strategy: PM.
    Acquisition Plan: Person designated as responsible.
This guideline is intended as just that, a guideline. While it attempts to shed light on all
relevant strategic business aspects of a program, it may fail to solicit information a
Program Manager (PM) feels is vital to their chain of command. Therefore, PMs are
empowered to add material where necessary. Adherence to the spirit in which this
guideline was crafted should yield a document that provides insight into the PM's
thoughts and thought processes.
As directed in the April 20, 2011 Principal Deputy Under Secretary of Defense
(Acquisition, Technology, and Logistics) memorandum " Document Streamlining -
Program Strategies and Systems Engineering Plan ," the structure for the body of a
Program Strategy document follows. Each program strategy should also include a title
page, signature/approval page, and a table of contents. The primary sections included
in the body of the outline are:
    1. Purpose
    2. Capability Need
    3. Acquisition Approach
    4. Tailoring
    5. Program Schedule
    6. Risk and Risk Management
    7. Business Strategy
    8. Resources
    9. International Involvement
    10. Industrial Capability & Manufacturing Readiness
    11. Life-cycle Signature Support
    12. Military Equipment Valuation
Detail on expected content for each of these topics is described in the following
sections.
2.8.1. Purpose
State the reason the program strategy (i.e., the Technology Development Strategy or
the Acquisition Strategy) is being prepared or updated (e.g., milestone review, full rate
production decision, change in strategy, etc.).
Summarize the requirement. Indicate the key operational and sustainment requirements
for this system (i.e., the time-phased capability requirements as described in the Initial
Capabilities Document, Capability Development Document, Capability Production
Document, Requirements Definition Package, and/or Capability Drop). Highlight system
characteristics driven by interoperability and/or joint integrated architectures, capability
areas, and family- or system-of-systems.
Summarize the expected operational mission of this program. Identify the user and
summarize the user's Concept of Operations (CONOPS). Indicate how the program fits
into current and future integrated architectures.
If a TDS, also summarize the Net-Centric Data Strategy. [Starting with Milestone B, the
Net-Centric Data Strategy is included in the Information Support Plan.]
                                                 CONSIDERATIONS
                                                           NOTES
                                     Figure 1. Example OV-1 Illustration
NOTES
For Milestone B, provide a reference design concept for the product showing major
subsystems and features (one or more drawings as needed to describe or illustrate the
expected features of the product; see the example in Figure 2).
Indicate whether the program strategy will be evolutionary or single step to full capability
and rationale for selection. Note: If this program employs an evolutionary acquisition
approach, this strategy will primarily apply to the current increment, while occasionally
addressing some topics in the context of the overall program.
                                                           NOTES
Indicate whether this is a New Start program. Verify that the appropriate Congressional
notifications have been completed for a New Start. (Reference DoD 7000.14-R, DOD
Financial Management Regulation , Volume 3, Chapter 6 for guidance on new start
determinations.)
                                                           NOTES
Indicate whether this is a joint program. If so, specify the joint nature and characteristics
of the program. Identify the Service(s) or DoD Components involved, state the key
Service-specific technical and operational differences in the end item deliverables, and
provide the principal roles and responsibilities of each DoD Component in the
management, execution, and funding of the program.
If this strategy supports the Milestone B or C decision, indicate, in a table showing
quantity per year, the total planned production quantity and the Low-Rate Initial
Production (LRIP) quantity. Summarize the LRIP plan. If the planned LRIP quantity
exceeds ten percent of the total planned production quantity, provide the justification.
(Not applicable to software-intensive programs without production components.)
2.8.4. Tailoring
Consistent with statutory and federal regulatory requirements, the Program Manager
(PM) and Milestone Decision Authority (MDA) may tailor the phases and decision points
to meet the specific needs of the program. If tailoring is planned, state what is being
proposed and why.
List all requests for either regulatory policy waivers or waivers permitted by statute.
Include a table similar to notional Table 1.
NOTE
                                                WAIVER REQUESTS
     Requirement to Be Waived | Type (Regulatory or Statutory) | Granting Authority |
     Rationale | Required by (date or event) | Status
Explain and justify any urgency if it results in needed tailoring, for example if it
constitutes justification for not providing for full and open competition.
Summarize the analysis justifying the proposed program schedule (list analogous
programs or models used to derive schedule).
                                                 CONSIDERATIONS
                             Verification Review (SVR);
                           o Operational and developmental test events including
                             initial operational test and evaluation (IOT&E) and
                             live fire test and evaluation (LFT&E);
                         o Production quantities for each year;
                         o maintenance plans, depot maintenance core
                             capabilities stand-up, Training Plan, Source of
                             Repair Assignment Process (SORAP),
                         o identify the activation schedule for each site in the
                             supply chain required to support the system
                             including the maintenance sites (including depots)
                             and training sites
                         o Environment, Safety, and Occupational Health
                             (ESOH) plans and events;
                         o draft RFP for LRIP, final draft FRP AS submission to
                             MDA staff;
                         o Full-Rate Production Decision Review (FRP DR);
                             and,
                         o initial operating capability (IOC) and full operational
                             capability (FOC)
                  4. If for an FRP AS, the schedule should minimally include:
                         o contract events such as award dates, contract
                             definitization, planned exercising of contract line
                             item numbers, and Integrated Baseline Review (IBR)
                         o Production quantities for each year;
                         o maintenance plans, depot maintenance core
                             capabilities stand-up, Training Plan, Source of
                             Repair Assignment Process (SORAP),
                         o identify the activation schedule for each site in the
                             Production quantities for each year;
                         o maintenance plans, depot maintenance core
                             capabilities stand-up, Training Plan, Source of
                             Repair Assignment Process (SORAP),
                         o identify the activation schedule for each site in the
                             supply chain required to support the system
                             including the maintenance sites (including depots)
                             and training sites
                         o planned or anticipated future increments;
                         o post-implementation review (PIR); and,
                         o initial operating capability (IOC) & full operational
                             capability (FOC).
2.8.5.1. Interdependencies
of the interdependencies with program activity on the critical path. If any memorandums
of agreement are required to formalize these relationships/ interfaces, list them in the
format presented in Table 2. Identify the interface (i.e., the system this product
interfaces with); the agency that owns the other system; the authority (e.g., PEO, CAE,
delegated PM) responsible for controlling the interface (i.e., the individual who can set
the requirement; direct the solution to the interface issue; and direct who provides the
funding for the solution); the required by date; and the impact if not completed.
Summarize the approach used to identify, analyze, mitigate, track, and control
performance/technical/manufacturing, cost, schedule, sustainment, and programmatic
risk throughout the life of the program.
                                                           NOTES
List and assess any program interdependency issues that could impact execution of the
acquisition strategy. If the program is dependent on the outcome of other acquisition
programs or must provide capabilities to other programs, the nature and degree of risk
associated with those relationships should be specified. Summarize how these
relationships and associated risk will be managed at the PM, PEO, and DoD
Component levels.
List the key program technologies, their current technology readiness levels (TRL), the
basis for including a technology (e.g., available alternative or low-risk maturation path) if
it is below the TRL 6 benchmark for Milestone B, and the key engineering and
integration risks. NOTE: Key technologies should include those technologies that are
part of the system design and those associated with manufacturing the system.
NOTES
Identify any risks that have been deferred to future increments. Explain why these risks
were deferred and whether any residual risks remain in this increment.
CONSIDERATION
             This section should include, but not be limited to, the risks
             associated with threats as described in section 2.8.2.
should identify principal manufacturing (if applicable), sustainment, and operational
risks, and it should summarize mitigation plans, to include key risk reduction events.
2.8.7.5.16. Other Relevant Information
NOTES
                  •    Competitive prototyping.
                  •    Dual-sourcing.
                  •    Unbundling of contracts.
                  •    Funding of next-generation prototype systems or subsystems.
                  •    Use of modular, open architectures to enable competition for
                       upgrades.
                  •    Use of build-to-print approaches to enable production through
                       multiple sources.
                  •    Acquisition of complete technical data packages.
                  •    Periodic competitions for subsystem upgrades.
                  •    Licensing of additional suppliers.
                  •    Periodic system or program reviews to address long-term
                       competitive effects of program decisions.
                  •    Other
competition or other competitive approaches will be used.
Indicate how the results of the previous acquisition phase impact the competition
strategy for the approaching phase.
CONSIDERATIONS
                                                           NOTES
Summarize the research conducted and the results of market research. Indicate the
specific impact of those results on the various elements of the program. Summarize
plans for continuing market research to support the program throughout development
and production.
Market research information provided in the strategy should be sufficient to satisfy the
requirements of 10 United States Code (USC) 2366a and 10 USC 2366b. For more
information, see Federal Acquisition Regulation (FAR) Part 10, Market Research, and
Defense Federal Acquisition Regulation Supplement (DFARS) section 210.001.
                                                 CONSIDERATIONS
Indicate whether advance procurement of long lead items is planned. List highest dollar
value items. The Acquisition Strategy must clearly indicate the intention to employ
advance procurement. [NOTE: The MDA must separately and specifically approve
advance procurement if authorization is sought prior to the applicable milestone
decision.]
                                                           NOTES
2.8.7.4. Sustainment Strategy
The details of program sustainment planning are included in the Life Cycle Sustainment
Plan, which will be prepared and approved as a separate document. This portion of the
Program Strategy document should:
Specify the contracting strategy to provide product support throughout the system life
cycle. The sustainment strategy should reflect the Maintenance or Support CONOPS
and consider: impacts to system capability requirements; responsiveness of the
integrated supply chains across government and industry; maintaining long-term
competitive pressures on government and industry providers; and providing effective
integration of weapon system support that is transparent to the warfighter and provides
total combat logistics capability.
CONSIDERATIONS
                  •    Type of contract and length, along with major terms and conditions
                  •    Performance measures being used (including the extent to
                       which they are traditional transaction-based/process-focused or
                       performance-based/outcome-focused)
                  •    Sustainment related functions, hardware or data covered in
                       each contract
                  •    Portion of system covered by performance based product
                       support strategy
State the assumptions used in determining whether contractor or agency support will be
employed, both initially and over the life of the acquisition, including consideration of
contractor or agency maintenance and servicing (see FAR Subpart 7.3 ), support for
contracts to be performed in a designated operational area or supporting a diplomatic or
consular mission (see FAR section 25.301 ); and distribution of commercial items.*
* Note: Items marked with an asterisk (*) in this section are not required for the
Technology Development Phase or Technology Development Strategy.
         functions;
    •    How the support concept ensures integration with other logistics support and
         combat support functions to optimize total system availability while minimizing
         cost and the logistics footprint;
    •    How the product support strategy will ensure the selection of best value support
         providers, maximize partnering, and advocate integrated logistics chains in
         accordance with DoD product support objectives;
    •    How manpower and spares will be optimized;*
    •    Efforts to ensure secure and integrated information systems across industry and
         government that enable comprehensive supply chain integration and full asset
         visibility;*
    •    Dedicated investments needed to achieve continuous improvement of weapon
         system supportability and reduction in operating costs;
    •    How performance expectations (as defined in performance agreements) will be
         compared to actual performance results (post Milestone C);*
    •    If Interim Contract Support (ICS) is planned, the ICS requirements, approach,
         and a plan to transition to normal sustainment support.*
    •    If the strategy includes contractor logistics support (CLS), indicate how CLS
         contract flexibility will support the sustainment concept;* and
    •    How the program will ensure product support integration throughout the system
         life cycle.
For each contract with an estimated total value, including all options, greater than $40
million for an MDAP or greater than $17 million for a MAIS, provide a table (see
example Table 3) that identifies the purpose, type, value, performance period, and
deliverables of the contract.
                                                MAJOR CONTRACTS
     Contract | Purpose | Type | Value | Performance Period | Major Deliverables
Specify what the basic contract buys; how major deliverable items are defined; options,
if any, and prerequisites for exercising them; and the events established in the contract
to support appropriate exit criteria for the phase or intermediate development activity.
Identify the contract type(s) and period(s) of performance. The acquisition strategy shall
provide the information necessary to support the decision on contract type. (See FAR
Part 16 and Section 818, Public Law (P.L.) 109-364 for additional direction.)
NOTES
CONSIDERATION
Address the alignment of the contract(s) with the overarching acquisition strategy and
the competition strategy.
Indicate whether a competitive award, sole source award, or multiple source
development with down select to one production contract is planned.
If expecting to use other than full and open competition, cite the authority and indicate
the basis for applying that authority, identify source(s), and explain why full and open
competition cannot be obtained.
Specify breakout plans for each major component or sub-system as well as spares and
repair parts.
Assess the comparative benefits of awarding a new contract vice placing a requirement
under an existing contract (10 USC 2306, 10 USC 2304).
If planning to award a new indefinite delivery contract, indicate how many contracts are
planned to be awarded. If a single award is planned, explain why multiple awards are
not feasible. Indicate the ordering period.
    •    Provide the specific incentive structure. Indicate how the incentive structure will
         motivate contractor behavior resulting in the cost, schedule, and performance
         outcomes required by the government for the contract and the program as a
         whole.
    •    If more than one incentive is planned for a contract, the strategy should explain
         how the incentives complement each other and do not conflict with one another.
Summarize the financial reporting that will be required by the contractor on each
contract, including requirements for earned value management (EVM).
Identify the source selection evaluation approach (e.g., Trade-off or Lowest Price
Technically Acceptable) and briefly summarize planned procedures ( 10 USC 2305 ).
Highlight the considerations influencing the proposed source selection procedures.
Indicate how these may change from phase to phase.
State the timing for submission and evaluation of proposals. Identify the criteria that will
be used to select the winning bidder. Indicate how those criteria reflect the key
government goals for the program.
2.8.7.5.5. Sources
List the known prospective sources of supplies or services that can meet the need.
Consider required sources of supplies or services (see FAR Part 8 ), and sources
identifiable through databases including the government-wide database of contracts and
other procurement instruments intended for use by multiple agencies available at
https://www.contractdirectory.gov/contractdirectory/ .
    •    small business,
    •    veteran-owned small business,
    •    service-disabled veteran-owned small business,
    •    HUBZone small business,
    •    small disadvantaged business, and
    •    women-owned small business concerns, and
    •    specify how small business participation has been maximized at both the direct
         award and subcontracting levels (see FAR Part 19 ).
If applicable, identify the incumbent contractors and the contracts affected by the
bundling.
                                                           NOTES
When FAR Subpart 19.7 applies, the acquisition strategy should establish maximum
practicable individual socio-economic subcontracting goals, meaningful small business
work, and incentives for small business participation.
Summarize the rationale for the selection of the planned subcontract tier or tiers.
Indicate how prime contractors will be required to give full and fair consideration to
qualified sources other than the prime contractor for the development or construction of
major subsystems and components.
2.8.7.5.8. Special Contracting Considerations
Identify any special contracting considerations: list any unique clauses or special
provisions (e.g., contingent liabilities such as economic price adjustment clauses,
business base clauses, or termination liability) or special contracting methods (see
FAR Part 17) included in the contract; list any special solicitation provisions or FAR
deviations required (see FAR Subpart 1.4).
Identify the engineering activities to be stated in the RFP and required of the contractor
to demonstrate the achievement of the reliability and maintainability design
requirements.
Provide a table (see example Table 4) to specify how the sustainment key performance
parameter thresholds have been translated into reliability and maintainability design and
contract specifications. Table 4, as presented here, is a sample. The actual format of
this table may be varied to suit the nature of the procurement or to add additional
requirements. The reliability threshold is often expressed as Mean Time Between
Failure (MTBF). Use the appropriate life units (e.g., hours, cycles). "MTTR" is
"mean time to repair"; "N/A" may be entered if an item is not applicable.
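The Guidebook does not prescribe a formula here, but the standard relationship often used when translating sustainment thresholds into reliability and maintainability specifications, inherent availability derived from MTBF and MTTR, can be sketched as follows. The function name and the 500-hour MTBF and 2-hour MTTR figures are illustrative assumptions, not values from any actual Table 4:

```python
# Illustrative sketch only: inherent availability (Ai) computed from
# MTBF and MTTR, a standard reliability/maintainability relationship.
# All names and numbers below are hypothetical examples.

def inherent_availability(mtbf: float, mttr: float) -> float:
    """Ai = MTBF / (MTBF + MTTR); MTBF and MTTR must use the same life units."""
    return mtbf / (mtbf + mttr)

# Hypothetical specification: 500-hour MTBF threshold, 2-hour MTTR
ai = inherent_availability(500.0, 2.0)
print(f"Inherent availability: {ai:.4f}")  # Inherent availability: 0.9960
```

As the guidance above stresses for life units, MTBF and MTTR must be expressed in the same units (hours with hours, cycles with cycles) for the ratio to be meaningful.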
2.8.7.5.11. Warranty
Indicate whether a warranty is planned, and if so, specify the type and duration;
summarize the results of the supporting Cost Benefit Analysis. (See FAR Subpart 46.7
and DFARS Subpart 246.7.)
                                                           NOTES
If this strategy is for Milestone C or later, indicate whether the production program is
suited to the use of multiyear contracting ( 10 USC 2306b ). Indicate any plans for
multiyear contracting and address compliance with 10 USC 2306c and Office of
Management and Budget (OMB) Circular A-11 .
2.8.7.5.13. Leasing
Indicate whether leasing was considered (applies to use of leasing in the acquisition of
commercial vehicles and equipment) and, if part of the strategy, economically justify that
leasing of such vehicles is practicable and efficient and identify the planned length of
the lease.
NOTES
Quantify the extent to which the program is implementing modular contracting ( 41 USC
2308 ).
2.8.7.5.15. Payment
Identify financing method(s) planned and whether these provision(s) will be flowed down
to subcontractors. Indicate if early progress payments will be traded off for lower prices
in negotiations.
Provide any other pertinent information that may enhance understanding of the
contracting strategy.
2.8.7.6. Technical Data Rights Strategy (formerly the Data Management Strategy)
2.8.7.6.4. BCA with Priced Contract Option for Future Delivery of Technical Data
2.8.7.6. Technical Data Rights Strategy (formerly the Data Management Strategy)
Summarize the Technical Data Rights strategy for meeting product life-cycle data rights
requirements and for supporting the overall competition strategy.
                                                            NOTE
Analysis of the data required to design, manufacture, and sustain the system as well as
to support re-competition for production, sustainment, or upgrade. The strategy should
consider, but is not limited to, baseline documentation data, analysis data, cost data,
test data, results of reviews, engineering data, drawings, models, and Bills of Materials
(BOM).
CONSIDERATION
Specify how the program will provide for rights, access, or delivery of technical data the
government requires for the system's total life-cycle sustainment. Include analysis of
data needs to implement the product support life-cycle strategy, including such areas as
materiel management, training, Information Assurance protection, cataloging, open
architecture, configuration management, engineering, technology refreshment,
maintenance/repair within the technical order (TO) limits and specifically engineered
maintenance/repair outside of TO limits, and reliability management.
2.8.7.6.3. Business Case Analysis (BCA) with Engineering Tradeoff Analysis
The business case analysis, conducted in concert with the engineering tradeoff
analysis, that outlines the approach for using open systems architectures and
acquiring technical data rights.
CONSIDERATIONS
2.8.7.6.4. BCA with Priced Contract Option for Future Delivery of Technical Data
The cost benefit analysis of including a priced contract option for the future delivery of
technical data and intellectual property rights not acquired upon initial contract award.
                                                            NOTE
Analysis of the risk that the contractor may assert limitations on the government's use
and release of data, including Independent Research and Development (IRAD)-funded
data (e.g., require the contractor to declare IRAD up front and establish a review
process for proprietary data).
                                                  CONSIDERATION
      License Category               Applies To            Covered Data                         Government Rights
      Commercial TD License Rights   Commercial TD only    TD related to commercial items       Unlimited in FFF and OMIT;
                                                           (developed at private expense)       other rights as negotiated
      Commercial CS Licenses         Commercial CS only    Any commercial CS or CS              As specified in the commercial license
                                                           documentation                        customarily offered to the public
TD = Technical Data
CS = Computer Software
Summarize how the contract(s) will be administered. Include how inspection and
acceptance corresponding to the work statement's performance criteria will be enforced
(see FAR Part 42 ).
2.8.7.7.3. Delivery/Performance Period Requirements
2.8.8. Resources
2.8.8.2. Cost
2.8.8.3. Should-Cost
2.8.8. Resources
Provide a copy of the program's "Investment Program Funding and Quantities" Chart
(see Figure 4), with a current "as of date." A template and instructions for the
development of this chart are provided at:
https://ebiz.acq.osd.mil/DABSchedule/Questions.aspx?text=IPT (login with password or
Common Access Card required).
        Figure 4. Example "Investment Program Funding and Quantities" Chart
If the chart reflects funding shortfalls, indicate how they will be addressed and state the
programmatic impact if they are not.
If the program is jointly funded, provide a separate chart reflecting the funding
contributions required of each joint participant.
Provide and briefly explain funding support from the Working Capital Fund.
If multiple program increments are in progress, funding will be tracked separately for
each increment (e.g., for subsets of the program that will be subject to a separate
Acquisition Program Baseline). Provide separate charts for each increment.
2.8.8.2. Cost
Indicate the established cost goals for the increment and the rationale supporting them.
If this is a Technology Development Strategy, indicate the Affordability Target that has been
established for the program (initially, average unit acquisition cost and average
operational support cost per unit). The affordability target should be presented in the
context of the resources that are projected to be available in the portfolio(s) or mission
area(s) associated with the program under consideration. For new start programs,
provide the quantitative analytical basis for determining that the resources expected to
be available in the portfolio/mission area can support the program under consideration.
Employ a graphic to illustrate.
2.8.8.3. Should-Cost
Summarize the application of should-cost analysis to the acquisition. Identify the should-
cost initiatives that have been planned for the program. Specify how the associated
"should cost targets" will be used as a basis for contract negotiations and contract
incentives, and to track contractor, PEO, and PM performance.
                                                 CONSIDERATIONS
Explain how the cost management approach adequately considers funds management.
Identify any contingent liabilities (award fee, special incentives, economic price
adjustment, business base clauses, termination liability, etc.) planned for or associated
with the program. Identify which contingent liabilities have been funded. Summarize the
plan to obtain approval for any unfunded contingencies (see DFARS 217.171(a)(4) and
217.172(e)).
Summarize plans to control program costs, specifically Program Acquisition Unit Cost,
Average Procurement Unit Cost, and Life-Cycle Cost. List and describe cost control
tools and processes.
Summarize the process to update estimates (e.g., x months before each decision
review or x months before beginning each increment).
Indicate specific lines of programmatic authority. Show how the authority chain meets
the requirements identified in DoD Directive 5000.01, paragraph E.1.1.26 .
Indicate the planned organization to effectively manage the program and ensure all
stakeholders are involved (Integrated Product Teams (IPT), boards, reviews, etc.). If
applicable, indicate how the contractor will be involved in program IPTs. Summarize the
anticipated business management relationship between (1) the program office and the
contractor, and (2) the program office and other government agencies.
NOTE
Indicate any limitations on foreign contractors being allowed to participate at the prime
contractor level.
                                                            NOTE
Summarize plans to increase the opportunity for coalition interoperability as part of the
developing DoD program.
                                                 CONSIDERATIONS
Summarize any plans for cooperative development with potential international partners.
List any international agreements (e.g., MOAs, MOUs) planned or in place and identify
any current contracting activities with potential international partners.
CONSIDERATION
Specify the potential (MS A) or plans (MS B; MS C) for Foreign Military and/or Direct
Commercial Sale and the impact upon program cost due to program protection and
incorporation of exportability features.
CONSIDERATION
Summarize the results of industrial capability analysis (public and private) to design,
develop, produce, support, and, if appropriate, restart the acquisition program.
                                                 CONSIDERATIONS
and maintain access to competitive suppliers for critical areas at system, subsystem,
and component level (e.g., requiring an open-systems-architecture or a make-or-buy
plan). List critical items and their sources.
When the analysis indicates that the needed industrial capabilities are in danger of
being lost, the strategy should indicate whether government action is required to
preserve the industrial capability. The strategy should also address product technology
obsolescence, replacement of limited-life items, regeneration options for unique
manufacturing processes, and conversion to performance specifications at the
subsystems, component, and spares levels.
NOTE
Life-cycle signature support funding requirements will be reflected in the program
funding summary (see Paragraph 2.8.8.1 and Figure 4).
2.8.12. Military Equipment Valuation
(NOTE: The unit cost can be calculated by summing the estimated cost of the end item
with the estimated costs of all associated government furnished equipment, training
manuals, technical data, engineering support, etc., NOT including spares and support
equipment. For additional information, see:
    •    http://www.acq.osd.mil/pepolicy/training_tools/quick_reference_tools.html ; or
    •    http://www.acq.osd.mil/pepolicy/training_tools/bfma_instructions.html .)
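The inclusion/exclusion rule in the note above can be sketched as a simple summation. This is a hedged illustration only; the cost element names and dollar figures below are hypothetical, not drawn from any valuation guidance:

```python
# Hypothetical sketch of the unit-cost summation described in the note:
# end item cost plus associated GFE, training manuals, technical data,
# and engineering support, excluding spares and support equipment.

def military_equipment_unit_cost(costs: dict[str, float]) -> float:
    included = {"end_item", "gfe", "training_manuals",
                "technical_data", "engineering_support"}
    # Spares and support equipment are deliberately NOT included.
    return sum(v for k, v in costs.items() if k in included)

estimate = {
    "end_item": 1_200_000.0,
    "gfe": 150_000.0,
    "training_manuals": 10_000.0,
    "technical_data": 40_000.0,
    "engineering_support": 75_000.0,
    "spares": 300_000.0,            # excluded from unit cost
    "support_equipment": 90_000.0,  # excluded from unit cost
}
print(military_equipment_unit_cost(estimate))  # 1475000.0
```

The design point is simply that the excluded elements are filtered out before summing, mirroring the "NOT including spares and support equipment" caveat in the note.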
DEFENSE ACQUISITION GUIDEBOOK
Chapter 3-- Affordability and Life-Cycle Resource Estimates
3.0. Overview
3.2. Affordability
3.0. Overview
3.0.1. Purpose
3.0.2. Contents
3.0.1. Purpose
3.0.2. Contents
Section 3.1 provides introductory background material intended for a general audience.
It describes the concept of program life-cycle cost, and provides definitions of terms
used by the DoD cost community. It also introduces the concepts of total ownership cost
and fully burdened cost of delivered energy.
The next five sections are more specialized; they discuss the specific milestone review
procedures, expectations, and best practices for a variety of topics related to acquisition
program affordability, cost, and manpower:
Section 3.2 describes the basic policies associated with the consideration of affordability
in the acquisition process and offers parameters for preparing affordability analyses and
constraints on investments. This section also explains the Department's full-funding
policy.
Section 3.4 describes the role of both DoD Component cost estimates and independent
cost estimates in support of the DoD acquisition system.
Section 3.7 is intended for less experienced cost analysts working in the acquisition
community. This section, which is tutorial in nature, provides a recommended analytic
approach for preparing a life-cycle cost estimate for a defense acquisition program.
3.1.1. Introduction
3.1.1. Introduction
Both DoD Directive 5000.01, The Defense Acquisition System, and DoD Instruction
5000.02, Operation of the Defense Acquisition System, make reference to life-cycle cost
and total ownership cost. This section of the Guidebook explains the meaning of each
of these terms. The terms are similar in concept but somewhat different in scope and
intent. For a defense acquisition program, life-cycle cost consists of research and
development costs, investment costs, operating and support costs, and disposal costs
over the entire life cycle. These costs include not only the direct costs of the acquisition
program but also indirect costs that would be logically attributed to the program. In this
way, all costs that are logically attributed to the program are included, regardless of
funding source or management control.
The concept of total ownership cost is related but broader in scope. Total ownership
cost includes the elements of life-cycle cost as well as other infrastructure or business
process costs not normally attributed to the program. Section 3.1.5 defines and
describes this concept in more detail.
Program cost estimates that support the defense acquisition system normally are
focused on life-cycle cost or elements of life-cycle cost. Examples of cases where cost
estimates support the acquisition system include affordability analyses, establishment of
program cost goals for Acquisition Program Baselines, independent cost estimates, or
estimates of budgetary resources. However, for programs that are pre-Milestone A or in
the Engineering and Manufacturing Development Phase, cost estimates that are used
within the program office to support system trade-off analyses, such as evaluations of
design changes or assessments of energy efficiency, reliability, maintainability, and
other supportability considerations, may need to be broader in scope than traditional
life-cycle cost estimates to support the purpose of the analyses being conducted.
Moreover, for mature programs (in transition from production and deployment to
sustainment), cost estimates may need to be expanded in scope to embrace total
ownership cost concepts in order to support broad logistics or management studies.
3.1.2. Life-Cycle Cost Categories and Program Phases
DoD 5000.4-M, DoD Cost Analysis Guidance and Procedures, provides the DoD
definitions of cost terms used in describing system life-cycle costs. Life-cycle cost is the
sum of the following four major cost categories, where each category is associated with
sequential but overlapping phases of the program life cycle:
    1. Research and development costs associated with the Materiel Solution Analysis
       phase, the Technology Development phase, and the Engineering and
       Manufacturing Development phase;
    2. Investment costs associated with the Production and Deployment phase;
    3. Operating and support costs associated with the sustainment phase; and
    4. Disposal costs occurring after initiation of system phase out or retirement,
       possibly including demilitarization, detoxification, or long-term waste storage.
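Read as an accounting identity, the list above says life-cycle cost is simply the sum of the four category totals. A minimal sketch, with hypothetical dollar amounts:

```python
# Minimal sketch: life-cycle cost as the sum of the four major cost
# categories defined in DoD 5000.4-M. Dollar amounts are hypothetical.

life_cycle_categories = {
    "research_and_development": 2.5e9,   # MSA, TD, and EMD phases
    "investment": 10.0e9,                # Production and Deployment phase
    "operating_and_support": 25.0e9,     # sustainment phase
    "disposal": 0.5e9,                   # phase-out/retirement
}

life_cycle_cost = sum(life_cycle_categories.values())
print(f"Life-cycle cost: ${life_cycle_cost / 1e9:.1f}B")  # Life-cycle cost: $38.0B
```

Note that, as the surrounding text explains, operating and support is typically the dominant share of the total, so estimates of that category drive most of the uncertainty in the overall figure.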
3.1.3. Life-Cycle Cost Category Definitions
The following sections summarize the primary cost categories associated with each
program life-cycle phase.
Research and Development costs are estimated and presented using the following
categories:
Technology Development Phase [Note: For programs with extensive prototyping and/or
preliminary design activities that occur before Milestone B, the Technology
Development Phase should be expanded with lower level cost categories, similar to the
categories used in the EMD Phase]
Complete definitions and further details are provided throughout MIL-STD-881C, Work
Breakdown Structures for Defense Materiel Items. Note the following:
    •    The Standard expands the Prime Mission Product category into more detailed
         elements. These lower level elements vary by product commodity (such as
         aircraft, electronic system, missile system, sea system, or surface vehicle).
    •    Supportability analysis that defines the requirements for the logistics
         elements is part of Systems Engineering; planning and management
         associated with the logistics elements is part of Program Management.
    •    In most cost estimates, the Engineering Change Orders element is added to the
         Standard taxonomy to allow a contingency for design or other scope changes.
    •    In most cost estimates, the first four EMD elements shown above are subtotaled
         and displayed as Flyaway, Rollaway, Sailaway, or other similar term. The
         remaining EMD elements are often grouped together and labeled as "Acquisition
         Logistics," "Product Support Package," or other similar term.
    •    The Training element includes training equipment and devices, training course
         materials, and training services.
    •    Specialized facilities (fixtures, test chambers, laboratories, etc.) are considered
         part of the Work Breakdown Structure (WBS) element that they support. General
         brick and mortar type facilities are part of the Industrial Facilities element.
    •    Specialized contractor support is considered part of the WBS element that it
         supports. Contractor support associated with the service, maintenance or launch
         of prime mission systems is part of the Operational/Site Activation element.
An abbreviated version of the above format is used in Budget Exhibit R-3, RDT&E
Project Cost Analysis, to display budget justifications and financial reporting for
Research, Development, Test and Evaluation projects with budgets greater than $1
million in either budget year. See DoD 7000.14-R, Financial Management Regulation,
Volume 2B, Chapter 5.
Investment consists of production and deployment costs incurred from the beginning of
low rate initial production through completion of deployment. This typically includes
procurement costs associated with producing and deploying the primary hardware,
systems engineering and program management, product support elements associated
with production assets, military construction, and operations and maintenance
associated with the production and deployment phase.
Investment costs are estimated and presented using the following categories:
    •    Procurement
         o    Peculiar Support Equipment
         o    Training
         o    Industrial Facilities
         o    Operational/Site Activation
    •    Military Construction
Complete definitions and further details for the Procurement elements are provided
throughout MIL-STD-881C, Work Breakdown Structures for Defense Materiel Items.
Note the following:
    •    The Standard expands the Prime Mission Product category into more detailed
         elements. These lower level elements vary by product commodity (such as
         aircraft, electronic system, missile system, sea system, or surface vehicle).
    •    Supportability analysis that defines the requirements for the logistics elements is
         part of Systems Engineering, and planning and management associated with the
         logistics elements is part of Program Management.
    •    In most cost estimates, the Engineering Change Orders element is added to the
         Standard taxonomy to allow a contingency for design or other scope changes.
    •    In most cost estimates, the first four procurement elements shown above are
         subtotaled and displayed as Flyaway, Rollaway, Sailaway, or other similar term.
         The remaining procurement elements are often grouped together and labeled as
         "Acquisition Logistics," "Product Support Package," or other similar term.
    •    The Training element includes training equipment and devices, training course
         materials, and training services.
    •    Specialized facilities (fixtures, test chambers, laboratories, etc.) are considered
         part of the Work Breakdown Structure (WBS) element that they support. General
         brick and mortar type facilities are part of the Industrial Facilities element.
    •    Specialized contractor support is considered part of the WBS element that it
         supports. Contractor support associated with the service, maintenance or launch
         of prime mission systems is part of the Operational/Site Activation element.
A similar abbreviated format is used to display budget justifications and financial
reporting for Procurement projects. (See DoD 7000.14-R, Financial Management
Regulation, Volume 2B, Chapter 4.)
O&S consists of sustainment costs incurred from the initial system deployment through
the end of system operations. This includes all costs of operating, maintaining, and
supporting a fielded system. Specifically, this consists of the costs (organic and
contractor) of manpower, equipment, supplies, software, and services associated with
operating, modifying, maintaining, supplying, training, and supporting a system in the
DoD inventory. This includes costs directly and indirectly attributable to the system (i.e.,
costs that would not occur if the system did not exist), regardless of funding source or
management control. Direct costs refer to the resources immediately associated with
the system or its operating unit. Indirect costs refer to the resources that provide
indirect support to the system (including its manpower or facilities). For example, the
pay and allowances for a unit-level maintenance technician would be treated as a direct
cost, but the cost of medical support for the same technician would be an indirect cost.
Operating and Support costs are estimated and presented using the following
categories:
    •    Unit-Level Manpower
         o    Operations Manpower
    •    Unit Operations
         o    Operating Materiel
         o    Support Services
         o    Temporary Duty
    •    Maintenance
         o    Organizational Maintenance and Support
         o    Intermediate Maintenance
         o    Depot Maintenance
    •    Sustaining Support
    •    Indirect Support
         o    Installation Support
         o    Personnel Support
Further details and complete definitions are provided in the Operating and Support
Cost-Estimating Guide promulgated by the Director, Cost Assessment and Program
Evaluation.
Disposal costs are the costs associated with demilitarization and disposal of a military
system at the end of its useful life. Depending upon the characteristics of the system,
demilitarization and disposal costs may be significant, so it is important to consider the
costs early in the system's life cycle. Costs associated with demilitarization and disposal
include disassembly, materials processing, decontamination, collection/storage/disposal
of hazardous materials and/or waste, safety precautions, and transportation of the
system to and from the disposal site. Systems may be given credit in the cost estimate
for resource recovery and recycling considerations.
The disposal cost category is intended to be used to ensure that design and other
decisions made early in the program consider their effects on the specific long-term
disposal costs that can be logically attributed to the program. Disposal costs of a more
general nature, such as the removal of unexploded ordnance at a training range, would
normally not be attributed to a specific aircraft program that in the future may participate
in training exercises at that range.
Disposal costs may be estimated and presented using the following categories, subject
to tailoring for the circumstances unique to each program:
    •    Demilitarization
    •    Reclamation of Parts
    •    Storage
The application of life-cycle cost categories to program phases may need to be modified
for programs with evolutionary acquisition strategies. DoD Instruction 5000.02,
Operation of the Defense Acquisition System, Enclosure 2, paragraph 2, describes the
evolutionary acquisition approach for acquisition programs. In an evolutionary approach,
the ultimate capability delivered to the user is provided in increasing increments.
Evolutionary acquisition strategies (1) define, develop, produce, and deploy an initial,
militarily useful capability (Increment 1) based on proven technology, demonstrated
manufacturing capabilities, and time-phased capabilities needs; and (2) plan
up front for subsequent development, production, and deployment of increments
beyond the initial capability over time (Increments 2 and beyond).
For a program with evolutionary acquisition, a question often arises concerning the
scope of the life-cycle cost estimate presented at a milestone review. Although the
situation may vary somewhat depending on individual circumstances, the life-cycle cost
estimate should attempt to address as much of the program, including known future
increments, as can be defined at the time of the initial (Increment 1) milestone review.
Any exclusions for portions of the program that cannot be defined at that time should be
clearly identified.
The application of life-cycle cost categories and program phases (as described in
Section 3.1.2) may need to be modified to account for the evolutionary acquisition
strategy. Figure 3.1.4.F1 depicts a notional profile of annual program expenditures by
cost category for a program with evolutionary acquisition.
As explained earlier, total ownership cost includes the elements of a program's life-cycle
cost as well as other related infrastructure or business processes costs not necessarily
attributed to the program in the context of the defense acquisition system. Infrastructure
is used here in the broadest possible sense and consists of all military department and
defense agency activities that sustain the military forces assigned to the combatant and
component commanders. Major categories of infrastructure are support to equipment
(acquisition and central logistics activities), support to military personnel (non-unit
central "school-house" training, personnel administration and benefits, and medical
care), and support to military bases (installations and communications/information
infrastructure).
In general, traditional life-cycle cost estimates are often adequate in scope to support
the review and oversight of cost estimates made as part of the acquisition system.
However, depending on the issue at hand, the broader perspective of total ownership
cost may be more appropriate than the life-cycle cost perspective, which may be too
narrow to deal with the particular context. As discussed previously, for a defense
acquisition program, life-cycle costs include not only the direct costs of the program but
also certain indirect costs that would be logically attributed to the program. In a typical
life-cycle cost estimate, however, the estimated indirect costs would include only the
costs of infrastructure support specific to the program's military manpower (primarily
medical support and system-specific training) and the program's associated installations
or facilities (primarily base operating support and facilities sustainment, restoration, and
modernization).
One special case in which traditional life-cycle cost models and data sources need to be
augmented is the inclusion of the fully burdened cost of delivered energy in trade-off
analyses for certain tactical systems. This case is discussed in the next section.
3.1.6. Fully Burdened Cost of Delivered Energy
Summary
In the acquisition process, the Fully Burdened Cost of Energy (FBCE) estimates the
energy-related costs to sustain specific pieces of equipment, including procurement of
energy, the logistics needed to deliver it where and when needed, related infrastructure,
and force protection for those logistics forces directly involved in energy delivery. FBCE
shall be applied in trade-off analyses conducted for all developmental Department of
Defense (DoD) systems with end items that create a demand for energy in the
battlespace. FBCE does not identify savings for programmatic purposes. It is an analytic
input to the business case analysis designed to identify the difference in total energy-
related costs among competing options. Consistent with Section 138c of title 10, United
States Code, and DoDI 5000.02, FBCE estimates shall be made and reported for all
acquisition category (ACAT) I and II systems that will demand fuel or electric power in
operations and will be applied to all phases of acquisition beginning with the preparation
of the Analysis of Alternatives (AoA). An FBCE estimate is also required as part of Total
Ownership Cost (TOC) calculations. FBCE is not additive to Total Ownership Costs but
rather is reported alongside it. While TOC estimates are based on the total peacetime life
of a system, FBCE estimates are based on short combat scenarios. They provide
different but complementary insights.
Introduction
The energy required to field and sustain forces with current deployed systems poses
significant operating costs and imposes several operational constraints on the larger
force structure. First, growing logistics footprints can impede force mobility, flexibility,
timing, and staging, especially for anti-access and irregular conflicts. Reducing the need
for energy can have significant benefits for force deployability and the timeline of
operations. Second, this logistics footprint presents a target for conventional, irregular,
and catastrophic threats, creating demand for force protection and transportation forces.
In the conflicts of the past decade, for example, adversaries have targeted U.S. fuel
supply convoys, putting our forces and their missions at risk and redirecting combat
power and dollars to fuel delivery.
Conversely, reducing system energy demand can make operational forces more agile
and lethal by extending their range and reducing their dependence on logistics lines.
These reductions can be achieved through different, better informed tradespace
choices, design alternatives, technologies, and force structure concepts.
As outlined in the 2011 DoD Operational Energy Strategy, DoD is instituting procedures,
frameworks, analytic tools and reporting requirements to better understand and manage
how this energy demand affects force capability, vulnerability, and enterprise costs.
One of these frameworks, FBCE, is used to inform the acquisition tradespace by
quantifying the per-gallon price of fuel (or per-kilowatt-hour price of electricity) used per day
for two or more competing materiel solutions. The FBCE estimate includes apportioned
costs of the energy logistics forces needed to deliver and protect the fuel in a scenario.
Calculating the FBCE gives DoD decision makers a way to more accurately consider
the cost of a system's energy logistics footprint when making trades between cost,
schedule, and performance. It has the added benefit of informing decisions on the size
and focus of DoD investments in science and technology programs that affect the
energy demands of the force such as engines and propulsion, light-weight structural
and armor materials, power efficiency in electronics, mobile power production and
distribution, and more innovative system design approaches.
FBCE includes the cost of the fuel itself and the apportioned cost of all of the fuel
logistics and related force protection required beyond the Defense Logistics Agency-
Energy (DLA Energy) point of sale. While most planning scenarios generally employ
military forces for fuel delivery and protection, in some cases, contractor logistics and
protection may be presumed. The cost estimation method is the same though the data
sources required may vary. As a decision tool, FBCE is meant to inform technological
and design choices as it is applied in requirements development, acquisition trades, and
technology investments. Successful implementation will over time help DoD manage
larger enterprise risks such as high and volatile fuel prices.
The FBCE is applied in trade-off analyses conducted for all deployable DoD systems
with end items that create a demand for energy in the battlespace. This FBCE
methodological guidance applies to ACAT I and II developmental systems as well as
mid-life upgrade or modernization choices.
FBCE estimates shall be prepared concurrently with the AoA for each materiel solution
being considered. The AoA should develop those estimates to sufficient fidelity to
determine if the differences in energy demand and resupply costs are significant
enough to meaningfully influence the final choice of alternatives. For developmental
systems with delivered energy requirements (i.e., most systems), the AoA shall examine
alternative ways to reduce operational energy demand as a significant system capability
factor. Even if FBCE does not differ significantly between alternatives but shows
sensitivity to sub-component or design choices within all of the alternatives,
the Service sponsoring the program shall continue FBCE efforts after completion of the
AoA to inform trades in the subsequent acquisition phases. This includes technology
development, systems engineering, and design decisions, or even to incentivize bidders
to offer more efficient systems. In all cases, FBCE shall be developed for all alternatives
remaining in the trade space at the end of the AoA and not just for the alternative
favored/chosen by the Service sponsor.
FBCE has a wide range of applications beyond system design. For example, it can be
used for site specific investments such as efficiency improvements at a contingency
base to reduce fuel deliveries.
Commercial vehicles such as buses or cars used in support of routine fixed base
operations normally should not be regarded as "deployable" and are addressed in other
regulations and guidance.
This section outlines a basic framework developed by the Office of the Assistant
Secretary of Defense for Operational Energy Plans and Programs (OASD(OEPP)) and
the Office of the Secretary of Defense (OSD), Cost Assessment and Program
Evaluation (CAPE), to calculate the FBCE. This framework is oriented towards liquid
fuels but extends to other forms of energy demands (e.g., fuel cells, hybrid-electric
engines, and nuclear and solar energy sources). The specific analytic tools and
methods to estimate FBCE are being refined within the analytic, acquisition and costing
communities. This approach was informed by analytical work started by a Defense
Science Board task force in 2001, applied by the Office of Program Analysis and
Evaluation in 2006 and 2007 in a ground system case study, and revisited by OSD
while assessing several major defense acquisition programs (MDAPs) and their
approach to fuel issues. This framework is intended to give DoD Components flexibility
in developing methodologies tailored to their various domains and force planning
methods. Alternative methods or interpretations may be allowed, but DoD Components
should consult iteratively with appropriate OSD offices, especially the OASD(OEPP)
before delivering a final product at a milestone review or similar decision point.
Calculation of the FBCE differs from most other cost factors in two main ways. First, it is
scenario-based. The FBCE analysis should be based upon a range of operational
scenarios or use conditions drawn from those specified in the program's AoA guidance or in the
approved program's analysis base to ensure comparability within program tradespace
discussions. Further, in order to estimate operationally realistic costs, all scenarios have
to be of sufficient duration to account for demanded logistics and force protection. In
addition, the FBCE calculation requires participation from Component force planning
and analytic organizations to appropriately calculate the estimates. The appropriate
organizations vary by Service.
There is no definitive, "correct" answer for a given system's FBCE estimate. However,
DoD Components should present a realistic and analytically defensible scenario and
cost elements. The proponent’s scenario assumptions for fuel logistics must be
consistent with Service future force plans and Concepts of Operation. Consistency
enables the Services and DoD to evaluate their assumptions relative to strategy and
doctrine and make better informed risk decisions. DoD Components should use existing
analytic tools, planning data, and costing methodologies where possible to develop
FBCE values. If Components find their analytic tools are inadequate to make the
necessary estimates, Components should approach OASD(OEPP) at the earliest
opportunity to help identify potential solutions.
There are two key analytical components essential to developing an FBCE value:
1. Scenarios. Services decide upon a representative set of future operational scenarios
or vignettes. However, to ensure the results of the FBCE calculations are comparable to
other analytic measures, the same scenarios used in the program's AoA or analysis
base shall be used in calculating the FBCE. The DoD's approved joint Defense Planning
Scenarios (or Integrated Security Construct scenarios) and the Components' supporting
future force plans should provide the general guidance and analytic assumptions
needed to identify appropriate scenarios. For purposes of computing the FBCE,
scenarios must be of sufficient duration to require logistical re-supply of energy. Once
the FBCE is calculated for the chosen scenarios, a simple mean of the results
will be computed.
               Figure 3.1.6.F1. FBCE Scenario Fuel Delivery Process Diagram
The first item needed to compute the FBCE is the Assured Delivery Price (ADP). The
price elements described in Figure 3.1.6.F2 (below) provide a framework for
determining the ADP of fuel within a given scenario. It is a measure of the burdened
cost of the fuel in $/gallon or $/barrel and all the tactical delivery assets and force
protection needed to assure the fuel is safely delivered out to a given location. The ADP
is the same for all users of fuel in that location using a given source of fuel and delivery
method.
               Figure 3.1.6.F2. Price Elements to Determine Assured Delivery Price
*These prices vary by Service and delivery method (ground, sea, air).
Although this figure provides a framework for calculating ADP, the elements must be
tailored to a selected supply chain, system or platform type, and larger force structure
context. In all cases, the results are scenario or unit-type-specific, and are not
applicable for all situations. Each of the elements is discussed further in the following
sections.
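The ADP framework above reduces to a simple sum of per-gallon price elements. The sketch below is illustrative only: the element names and dollar values are hypothetical placeholders, not figures from this guidebook or from DLA Energy.

```python
def assured_delivery_price(price_elements):
    """Sum per-gallon price elements ($/gallon) into a single ADP for one
    scenario, fuel source, and delivery method."""
    return sum(price_elements.values())

# Hypothetical element values ($/gallon) for one scenario:
elements = {
    "fuel_standard_price": 3.73,    # DLA Energy Standard Price (placeholder value)
    "delivery_os": 4.10,            # O&S burden of tactical delivery assets
    "delivery_depreciation": 0.85,  # straight-line depreciation of those assets
    "infrastructure": 0.40,         # theater fuel infrastructure, if significant
    "force_protection": 6.25,       # apportioned security escort costs
}

adp = assured_delivery_price(elements)  # 15.33 $/gallon in this example
```

Because the ADP is the same for every fuel user at a given location, fuel source, and delivery method, it can be computed once per scenario and reused across the systems being compared.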
Fuel
The first price element for consideration is the fuel itself. DLA Energy serves as DoD's
single supply center for petroleum products worldwide and for coal, natural gas, and
electricity services within the continental United States. DLA Energy not only procures
the energy products but serves as DoD's Integrated Materiel Manager for all petroleum
products. DLA Energy charges the Services for the fuel delivered through a
reimbursable arrangement known as the Defense Working Capital Fund.
The Standard Price established by DLA Energy is the rate that is charged to military
customers at the retail point of sale worldwide. To simplify cost planning and
accounting, the Standard Price for a given fuel is the same globally and does not
represent the full capitalized costs DLA Energy incurs to deliver the fuel out to the point
of sale. For purposes of calculating ADP, the Standard Price shall be used, referencing
the most recent price update from DLA Energy. The Standard Price should then be
inflated, using the most recent Office of Management and Budget inflation factors for
fuel prices, to the year in which the AoA scenarios in the analysis are set (e.g., 2018 or
some future year at or after Initial Operational Capability).
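The escalation step described above might be sketched as follows. The base price, base year, and annual factor are invented for illustration; actual analyses would use the current DLA Energy Standard Price and the applicable Office of Management and Budget inflation factors.

```python
def inflate_standard_price(standard_price, base_year, scenario_year, annual_factor):
    """Escalate the DLA Energy Standard Price ($/gallon) from the year of the
    most recent price update to the year in which the AoA scenarios are set,
    using a constant annual fuel-price inflation factor."""
    return standard_price * (1.0 + annual_factor) ** (scenario_year - base_year)

# e.g., a 2013 price escalated to a 2018 scenario at 2% per year (all hypothetical)
price_2018 = inflate_standard_price(3.73, 2013, 2018, 0.02)
```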
The second price element captures the burdens associated with the tactical delivery
assets used by the Services to deliver fuel from the point of sale to the system that will
consume it. It includes the Operating and Support (O&S) costs, the cost of depreciation
of the actual delivery assets, and any significant infrastructure costs needed to operate
these assets.
Once the Services take possession of fuel from DLA Energy at the point of sale,
they must employ Service-owned delivery assets. For the purposes of ADP estimates,
fuel delivery assets means major items of fuel delivery equipment, such as Navy oilers
(T-AOs), aerial refueling aircraft (KC aircraft) for fixed-wing and rotary-wing aircraft, and
tanker trucks and trailers for ground vehicles. It also includes C-130s airdropping
palletized fuel and rotary-wing aircraft carrying fuel by sling load for delivery.
The O&S cost for the fuel delivery assets is measured in $/gallon and consists of the
costs of operations and maintenance (O&M) of the vehicles and equipment and the
costs for military and civilian manpower dedicated to the fuel delivery mission divided by
the gallons of fuel delivered. For fuel delivery systems that are major systems in their
own right, such as oilers or aerial refueling aircraft, actual O&S cost history is collected
and made available to registered users of the Air Force's and Navy's Visibility and
Management of Operating and Support Cost data systems. For other classes of
equipment, cost and manpower data is found in planning factors used to develop O&M
budgets and tables of organization and equipment associated with fuel delivery units. If
the planning scenarios/missions being used for this calculation require another
Service's assets to deliver fuel in the battlespace, the Services may need to share data.
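The O&S burden described above is a simple per-gallon ratio. In this sketch the cost and throughput figures are hypothetical placeholders:

```python
def delivery_os_cost_per_gallon(om_cost, manpower_cost, gallons_delivered):
    """O&S burden of the fuel delivery assets ($/gallon): O&M of the vehicles
    and equipment plus dedicated military and civilian manpower, divided by
    the gallons of fuel delivered."""
    return (om_cost + manpower_cost) / gallons_delivered

per_gal = delivery_os_cost_per_gallon(
    om_cost=12_000_000,       # annual O&M for the delivery unit ($, placeholder)
    manpower_cost=8_000_000,  # annual dedicated manpower cost ($, placeholder)
    gallons_delivered=5_000_000,
)  # 4.00 $/gallon
```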
The cost of depreciation of the primary fuel delivery assets is also part of the second
price element. Normally, depreciation is not used in DoD analyses, since most studies
tend to deal with equipment recapitalization costs explicitly. However, in this case,
depreciation provides a measure of the decline in capital value of the fuel delivery
assets over time from use. The standard method is to use straight line depreciation over
the anticipated service life of the primary fuel delivery asset. For example, for an ADP
calculation for an aerial system that requires air-to-air refueling as part of its mission
profile/duty cycle, this step would require inclusion of a depreciation value for the
system's air refueling tanker.
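The standard straight-line method, spread over the fuel the asset delivers each year, can be sketched as follows; the asset cost, service life, and throughput are invented for illustration.

```python
def depreciation_per_gallon(acquisition_cost, service_life_years, gallons_per_year):
    """Straight-line depreciation of a primary fuel delivery asset,
    expressed per gallon of fuel it delivers each year ($/gallon)."""
    annual_depreciation = acquisition_cost / service_life_years
    return annual_depreciation / gallons_per_year

# e.g., a notional aerial refueling tanker (all values hypothetical)
dep = depreciation_per_gallon(180_000_000, 30, 6_000_000)  # 1.00 $/gallon
```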
An additional part of the cost of depreciation is the potential loss of delivery assets due
to hostile attack or other attrition. Based on the scenario chosen, there is a definable
probability that the associated logistics platforms will be interdicted and destroyed. If
destroyed, the entire remaining value of the platform is immediately amortized and this
cost is added to this price element. Depending on the quantity of fuel being carried by
the delivery asset, an adjustment to the amount of fuel obtained from the point of sale
will be required to account for this potential loss, if appropriate. Many cost and attrition
factors related to fuel resupply convoys are available through existing combat models
and historical databases.
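The attrition adjustment can be treated as an expected-value cost, with the fuel drawn at the point of sale grossed up for expected in-transit losses. The function names, loss probability, and values below are hypothetical, not factors from any combat model.

```python
def expected_attrition_cost_per_gallon(remaining_value, p_loss, gallons_delivered):
    """Expected cost of losing a delivery asset to hostile action: probability
    of loss times the asset's remaining (undepreciated) value, spread over
    the gallons actually delivered ($/gallon)."""
    return (p_loss * remaining_value) / gallons_delivered

def gallons_to_draw(gallons_required, p_fuel_lost_in_transit):
    """Fuel drawn at the point of sale must also cover the fuel expected to
    be lost with interdicted delivery assets."""
    return gallons_required / (1.0 - p_fuel_lost_in_transit)
```

For instance, a 2 percent loss probability on a platform with $50 million of remaining value, spread over 1 million delivered gallons, would add $1.00 per gallon to this price element.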
Finally, miscellaneous infrastructure costs may be added if they significantly add to the
cost of supporting the delivery assets and if the scenarios in the AoA involve energy
infrastructure. These items may include the price of O&S and recapitalization for the
facilities (such as fueling facilities and fuel storage sites) and related ground system
equipment (such as pumps, fuel storage bladders, hose lines, and other refueling
equipment to include maintenance and parts for refueling vehicles and other related
ground refueling equipment). The costs to deploy the delivery assets may also be
included, if the assets need to be transported to the theater of interest. This applies only
to infrastructure that is operated by the military Services in the theaters of interest, and
does not apply to infrastructure that is operated by DLA Energy and incorporated into
the DLA Energy capitalized cost of fuel.
For DoD infrastructure, data sources and associated cost factors are centrally managed
by the Office of the Deputy Under Secretary of Defense (Installations and Environment)
and available to authorized users. Data on all DoD facilities worldwide is stored in the
DUSD(I&E) Facilities Assessment Database. A four-digit number known as the Facility
Analysis Code (FAC) classifies each facility. For example, there is a unique code for
each facility category, such as marine fueling facility, POL pipeline, pump station, or fuel
storage facility. For each four-digit code, the DoD Facilities Pricing Guide provides cost
factors used in DoD facilities cost models. Cost factors are expressed as annual costs
per unit of measure (e.g., square foot) and are provided for facilities sustainment,
modernization, and operations.
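As a sketch of how these factors are applied, the annual cost for a facility is its size multiplied by the sum of the per-unit sustainment, modernization, and operations factors. The FAC code and dollar factors below are placeholders, not actual DoD Facilities Pricing Guide values.

```python
# Placeholder cost factors keyed by Facility Analysis Code (FAC); real values
# come from the DoD Facilities Pricing Guide, in $ per unit of measure per year.
COST_FACTORS = {
    "1234": {"sustainment": 3.10, "modernization": 1.40, "operations": 2.25},
}

def annual_facility_cost(fac_code, units):
    """Annual cost for one facility: size (e.g., square feet) times the sum
    of the sustainment, modernization, and operations cost factors."""
    factors = COST_FACTORS[fac_code]
    return units * sum(factors.values())

total = annual_facility_cost("1234", units=10_000)  # roughly $67,500 per year
```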
Security
The third and final price element includes the costs of escort protection of the fuel
supply chain in hostile environments. In the case of DoD force protection assets
allocated to the fuel delivery forces, the O&S costs, direct fuel costs and the
depreciation cost of those forces will also have to be estimated and included in the
overall calculation. In essence, all of the costs considered in the second price element
should also be considered for security assets. This includes the possibility that some
security assets will be destroyed due to hostile activity while protecting the fuel supply
chain. In some high-risk scenarios, force protection costs may be the largest factor in
the FBCE estimate.
To arrive at the FBCE, the ADP is multiplied by the apportioned amount of fuel
demanded by the system of interest. The FBCE is computed for each scenario being
considered. Programs then have the option of reporting the FBCE for each scenario
they have assessed separately, or of providing a mean or weighted average, depending
on anticipated usage of the system. To arrive at a single FBCE for the program,
average these estimates based on the relative amount of time that the system is
expected to operate in each of the chosen scenarios.
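The roll-up just described can be sketched as follows. The ADP values (in dollars per gallon), fuel demands, and time shares are hypothetical; a program's actual figures would come from its AoA scenarios.

```python
def fbce_for_scenario(adp_per_gal, fuel_demand_gal):
    """FBCE for one scenario: the assembled delivered price of fuel times
    the apportioned fuel demanded by the system of interest."""
    return adp_per_gal * fuel_demand_gal

def single_fbce(scenarios):
    """Weighted average across scenarios, where each entry is
    (adp_per_gal, fuel_demand_gal, time_share) and the time shares, the
    relative amount of time spent operating in each scenario, sum to 1."""
    return sum(fbce_for_scenario(adp, demand) * share
               for adp, demand, share in scenarios)

# Hypothetical: a benign scenario at $15/gal and a contested one at $42/gal.
fbce = single_fbce([(15.0, 1_000_000, 0.6),
                    (42.0,   400_000, 0.4)])
# roughly $15.7M for the program
```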
Other Considerations
The FBCE, which is based on a simplified activity based costing framework, is meant to
provide the acquisition process with a realistic, financial proxy for the fuel burden our
forces will incur in the future battlespace. It is not meant to capture the operational
impacts and capability gained or lost by changes in the logistical burden or in the
unrefueled range of the system due to fuel consumption. The DoD force planning
process and the analyses conducted to inform requirements development, the Joint
Capabilities Integration and Development System (JCIDS) process, are evolving to consider
these variables. Because acquisition is governed by "cost, schedule, and performance",
the requirements developer and approving authority should consider those fuel-related
variables as part of the performance tradespace relative to the capability gap they are
trying to fill.
The use of FBCE estimates does not normally identify near-term savings that can be
identified in a budget. Choices made during an acquisition program to reduce the fuel
demand will not begin to show an effect until after the system is fielded, often 10 to
20 years following an initial ICD for a major program and well beyond the FYDP. Further,
actual usage may vary considerably from the planning scenarios used in the AoA.
Readers interested in this subject should periodically check this section of the
Guidebook for future updates to this framework.
3.2. Affordability
Department's full-funding policy.
The Milestone Decision Authority (MDA) considers affordability at all major decision
points of an acquisition program. Consideration and subsequent enforcement of
affordability constraints help to ensure sufficient resources will be available to support
the procurement and operation and support (O&S) of the system throughout its life
cycle. The MDA also examines the realism of projected funding over the program's life
cycle, given likely DoD Component resource constraints.
Affordability analysis and constraints are not intended to produce rigid, long-term plans.
Rather, they are tools to promote responsible and sustainable investment decisions by
examining the likely long-range implications of today’s requirements choices and
investment decisions based on reasonable projections of future force structure
equipment needs, before substantial resources are committed to a program.
Even before a program is approved for formal initiation into the acquisition process,
affordability plays a key role in identifying capability needs as part of the Joint
Capabilities Integration and Development System (JCIDS), which balances cost versus
performance in establishing requirements for new acquisitions.
Moreover, all elements of life-cycle cost (or total ownership cost, if applicable) are
documented as part of the Capability Development Document and the Capability
Production Document (section 16 in both documents). To ensure the program is
affordable, cost constraints are established to drive early consideration of potential
tradeoffs.
Affordability is the ability to allocate resources out of a future total budget projection to
individual activities. It is determined by Component leadership given priorities, values,
and total resource limitations against all competing fiscal demands on the Component.
Affordability goals set early cost objectives and highlight the potential need for tradeoffs
within a program, and affordability caps set the level beyond which actions must be
taken, such as reducing costs.
Affordability analysis and constraints are not synonymous with cost estimation and
approaches for reducing costs. Constraints are determined in a top-down manner by the
resources a Component can allocate for a system given inventory objectives and all
other fiscal demands on the Component. Constraints then provide a threshold for
procurement and sustainment costs that cannot be exceeded by the Program Manager
(PM) without advance permission of the MDA and Component leadership. On the
other hand, cost estimates are generated in a bottom-up manner and forecast whether
the system can be acquired under those constraints and at what level of risk. Thus,
constraints are not set based on cost estimates but rather on a different calculus of
whether a Component can afford the estimated costs of a system. The difference
between the affordability constraints and the cost estimates indicates the level of risk at
the current requirements and quantity levels, and whether actions must be taken to
prevent exceeding the constraints.
Cost control and cost reduction approaches are central to maximizing the buying power
of the Department and should be considered in all phases and aspects of program
management as ways to meet or beat affordability constraints. Reducing the cost of
program management, RDT&E, procurement, or sustainment of a product that meets
validated requirements is always of importance, independent of achieving affordability
constraints; however, if those constraints cannot be met, even with aggressive cost
control and reduction approaches, then technical requirements, schedule, and planned
quantities are revisited, with support from the Component's Configuration Steering
Board, and any requirements trades are proposed to the validation authority. If constraints
still cannot be met and the Component cannot afford to raise the constraint level by
lowering constraints elsewhere in their analysis and obtaining MDA approval, then the
program may be cancelled.
Affordability analysis is the cornerstone process for the Component leadership to set
priorities and determine what it can afford for each acquisition. Each DoD Component
develops life-cycle affordability constraints for its ACAT I and IA acquisition programs
for procurement unit cost and sustainment costs by conducting portfolio affordability
analyses that contain a product life-cycle funding projection and supporting analysis.
The basic procurement unit cost calculation is the annual estimated procurement
budget divided by the number of items that should be procured each year to sustain the
desired inventory.
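That calculation can be sketched as follows, assuming a simple steady-state model in which the annual quantity needed to sustain the inventory is the desired inventory divided by the item's service life; all figures are illustrative.

```python
def annual_quantity(desired_inventory, service_life_years):
    # Steady-state replacement rate needed to sustain the desired inventory.
    return desired_inventory / service_life_years

def procurement_unit_cost(annual_procurement_budget, quantity_per_year):
    # The annual estimated procurement budget divided by the number of items
    # that should be procured each year to sustain the desired inventory.
    return annual_procurement_budget / quantity_per_year

qty = annual_quantity(desired_inventory=2_400, service_life_years=30)  # 80 per year
unit_cost = procurement_unit_cost(1_200_000_000, qty)                  # $15M per unit
```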
Similar calculations are made to derive sustainment affordability constraints.
Components standardize the portfolios they use for their analysis; portfolios can be
based on mission areas or commodity types. These portfolios provide a collection of products that
can be managed together for investment planning and oversight purposes. Components
normally make trade-offs within portfolios, but if necessary, can and should make trade-
offs across portfolios to provide adequate resources for high-priority programs.
A future total budget projection for each Component for affordability analysis provides
the first-order economic reality and a basis for allocating estimated future resources to each
portfolio. This projection establishes a nominal rather than optimistic foundation for the
future and covers all fiscal demands that compete for resources in the Component,
including those outside acquisition and sustainment.
The affordability analysis examines all programs and portfolios together, extending over
enough years to reveal the life-cycle cost and inventory implications of the longest
program for the Component. The same analysis is used as individual programs come
up for review. Nominally, affordability analysis covers at least 30 to 40 years into the
future (especially for the Military Departments) but may be approximately 15 years for
Components whose acquisitions all have planned life cycles of, and reach steady-state
inventory in, 15 years or less (e.g., Components with only MAIS programs whose life
cycles are estimated to be acquisition time plus 10 years after Full Deployment
declaration).
The aggregation of portfolio cost estimates for each year, when combined with all other
fiscal demands on the Component, may not exceed the Component's reasonably
anticipated future budget levels. Absent specific Component-level guidance by Director,
Cost Assessment and Program Evaluation (DCAPE) or USD(AT&L), each Component
projects its topline budget beyond the FYDP using the average of the last two years of
the current FYDP and the OSD inflator provided by Under Secretary of Defense
(Comptroller) (USD(C)), resulting in zero real growth.
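The default projection rule can be sketched as follows; the topline figures (in $B) and the 2% inflator are illustrative assumptions, not published OSD values.

```python
def project_topline(last_two_fydp_years, years_beyond_fydp, osd_inflator):
    """Project a Component topline beyond the FYDP: average the last two
    FYDP years, then grow the result only by the OSD inflator each year,
    i.e., zero real growth in constant dollars."""
    base = sum(last_two_fydp_years) / 2.0
    return [base * (1.0 + osd_inflator) ** year
            for year in range(1, years_beyond_fydp + 1)]

projection = project_topline([98.0, 102.0], years_beyond_fydp=3,
                             osd_inflator=0.02)
# base of 100.0, then roughly 102.0, 104.0, 106.1 in then-year dollars
```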
At each program's review, the Component employs the relevant portfolio to facilitate
understanding and discussion of life-cycle costs and inventories of related acquisition
systems.
3.2.2.3. Updates
Each Component maintains and updates its affordability analysis as needed at the
Component or portfolio level to reflect significant changes such as large cost growths in
portfolios and programs, changes in defense strategy, force structure changes, or major
budgetary changes.
3.2.2.4. Presentation
Transparency ensures that the risk, cost implications, and alternatives of system
acquisitions and sustainment are sufficiently understood by the Component leadership
and the programming, resource planning, requirements, intelligence, and acquisition
communities.
3.2.2.5. Format
At each major acquisition decision meeting, the Component provides stacked area
charts ("sand charts") and underlying spreadsheet data showing the programs budget,
what portfolio it fits within, and the top-level total of all portfolios and accounts totaling at
or below the future total budget projection, equivalent to Total Obligation Authority
(TOA), using the affordability constraints (refer to Figure 3.2.2.5.F1).
           Figure 3.2.2.5.F1. Notional Example of Affordability Analysis Charts
Notional examples used by the Army, Navy, and Air Force are provided for
informational purposes.
The affordability analysis must be consistent with the data in the Cost Analysis
Requirements Description (CARD) for a program under review, including the
requirements, quantity, and schedule used in the analysis. The affordability analysis also
provides data to support the procurement and sustainment constraints that are
documented in the MDD, Milestone A, and Pre-B Acquisition Decision Memorandums
(ADMs) and in the acquisition program baselines (APBs) normally set at Milestone B
and beyond.
3.2.2.7. Timing
The best opportunity for ensuring that a program will be affordable is through requirements
tailoring that occurs before and during the AoA(s) and early development. Thus, the
Components incorporate estimated funding streams for future programs within their
affordability analyses at the earliest conceptual point and specify those estimates at
MDD and beyond to inform system design concepts and alternative selection.
Affordability constraints are established to inform the requirements authority, PM, and
AoA team of the cost limitations dictated by the Component's affordability analysis.
Affordability goals are key objectives set to inform requirements and design tradeoffs
during early research and development. Affordability caps are fixed requirements that
are functionally equivalent to Key Performance Parameters (KPPs). Based on the
Component's affordability analysis and recommendations, the MDA sets and enforces
affordability constraints as follows:
    •    At MDD: tentative affordability cost goals (e.g., total funding, annual funding
         profiles, unit procurement and/or sustainment costs, as appropriate) and
         inventory goals to help scope the AoA and provide targets around which to
         consider alternatives;
    •    At Milestone A: affordability goals for unit procurement and sustainment costs;
         and
    •    At the Pre-B Decision Review, Milestone B, and Beyond: binding affordability
         caps.
These constraints are documented in the ADMs for these decision points. At Milestone
B, the affordability caps are documented in the program's APB. Any programs that skip
earlier reviews, or have baselines set before Milestone B, receive goals or constraints
commensurate with their position in the acquisition cycle and their levels of maturity.
The MDA enforces affordability constraints throughout the life cycle of the program. If a
PM concludes that, despite efforts to control costs and reduce requirements, an
affordability constraint will be exceeded, then the PM notifies the Component
Acquisition Executive and the MDA to request assistance and resolution. The PM also
reports progress relative to affordability constraints at Defense Acquisition Executive
Summary (DAES) reviews.
As noted above, the affordability constraints are not based on cost estimates. Rather,
the constraints are what the Component can afford to spend on the program under
review relative to all other fiscal demands.
Once affordability is established, cost estimates can help inform the feasibility and risk
of a set of proposed requirements given the affordable level of investment. Thus, at the
point of establishing an APB, the affordability caps should be at least as high as the
APB values (otherwise, the program will already require action to address cost and/or
requirements). In practical terms, Components will likely want to propose caps above
the APB values to allow for some flexibility in dealing with unforeseen issues. The
amount by which the proposed caps exceed the APB values is at the Component's
discretion, as long as the life-cycle cost at those caps, along with all other Component
fiscal demands, can be shown to fit within the Component's future total budget
projection.
The caps set the level at which the program may be de-scoped or cancelled, not what
the cost estimates say a specified set of program requirements will cost.
Components are responsible for developing and issuing similar guidance to ensure life-
cycle affordability for lower ACAT programs that have resource implications beyond the
FYDP, and PMs should ensure they are familiar with that guidance.
It has been a long-standing DoD policy to seek full funding of acquisition programs,
based on the most likely cost, in the budget year and out-year program years. DoD
Directive 5000.01 affirms this full funding policy. Moreover, DoD Instruction 5000.02
requires full funding, defined as inclusion of the dollars and manpower needed for all
current and future efforts to carry out the acquisition strategy in the budget and out-year
program, as part of the entrance criteria for the transition into engineering and
manufacturing development.
For MDAPs at MS B, the MDA must certify in writing to Congress that the program is
fully funded through the period covered by the FYDP, relative to reasonable cost and
schedule estimates that meet DCAPE concurrence. Other certification requirements are
listed under section 2366b of title 10, United States Code. For all acquisition programs,
the MDA normally assesses full funding at all major decision points. As part of this
assessment, the MDA reviews the actual funding (in the most recent FYDP position) in
comparison to the (time-phased) DoD Component Cost Estimate. In addition, the MDA
considers the funding recommendations made by DCAPE (for Acquisition Category ID
and IAM programs), or the DoD Component Cost Analysis team (for Acquisition
Category IC and IAC programs). If the MDA concludes that the current funding does not
support the acquisition program, then the ADM may direct a funding adjustment and/or
program restructure in the next FYDP update.
While full funding focuses on the FYDP, the long-range aspects of affordability analysis
and constraints are meant to consider the implications beyond the FYDP of decisions
made today.
3.3. Analysis of Alternatives
3.3.1. Introduction
major funding commitment to the acquisition program). The update to the AoA is used
to refine the proposed materiel solution, as well as reaffirm the rationale, in terms of
cost-effectiveness, for initiation of the program into the formal systems acquisition
process. For Major Defense Acquisition Programs at Milestone A, the Milestone
Decision Authority (MDA) must certify in writing to the Congress that the Department
has completed an AoA consistent with study guidance developed by the Director, Cost
Assessment and Program Evaluation (DCAPE), in addition to meeting other certification
criteria ( 10 U.S.C. 2366a). For Major Defense Acquisition Programs at Milestone B, the
Milestone Decision Authority (MDA) must certify in writing to the Congress that the
Department has completed an AoA with respect to the program in addition to meeting
other certification criteria ( 10 U.S.C. 2366b). Pursuant to DoDI 5000.02, the AoA is
updated as needed at Milestone C.
In practice, AoA issues vary somewhat between AoAs for weapon and other tactical
systems and AoAs for major automated information systems. Sections 3.3.2 , 3.3.3 ,
and 3.3.4 provide discussion about AoAs that may be of general interest, although
much of the discussion is focused on weapon systems. Section 3.3.5 discusses the AoA
process for major automated information systems.
3.3.2. Role of the Analysis of Alternatives (AoA) as Part of the Materiel Solution
Analysis
The analysis of alternatives process is expected to play a key role in support of the
Materiel Solution Analysis Phase. After a program has an approved Materiel
Development Decision, the analysis of alternatives process is expected to contribute to
the selection of a preferred materiel solution that satisfies the capability need
documented in the approved Initial Capabilities Document (ICD).
The Director, Cost Assessment and Program Evaluation (DCAPE), develops and
approves study guidance for the AoA. The guidance is developed with the input of other
DoD officials. Prior to the MDD review, DCAPE provides the AoA study guidance to the
DoD Component designated by the MDA. Following receipt of the AoA study guidance,
the DoD Component prepares an AoA study plan that describes the intended
methodology for the management and execution of the AoA. The AoA study plan is
coordinated with the MDA and approved by DCAPE prior to the MDD review. A
suggested template for the AoA study plan is provided in section 3.3.3.
The study guidance shall require, at minimum, full consideration of possible trade-offs
among cost, schedule, and performance objectives for each alternative considered. The
study guidance shall also require an assessment of whether or not the joint military
requirement can be met in a manner that is consistent with the cost and schedule
objectives recommended by the JROC. The AoA study guidance and resulting AoA plan
should build upon the prior analyses conducted as part of the Joint Capabilities
Integration and Development System (JCIDS). The JCIDS process is briefly described
in section 1.3, and is fully described in CJCS Instruction 3170.01. The JCIDS analysis
process that leads to an approved Initial Capabilities Document (ICD) is built upon the
analysis known as the Capabilities-Based Assessment (CBA). The CBA provides
recommendations (documented in the ICD) to pursue a materiel solution to an identified
capability gap that meets an established capability need. The CBA does not provide
specific recommendations as to a particular materiel solution, but rather provides a
more general recommendation as to the type of materiel solution (such as Information
Technology system, incremental improvement to an existing capability, or an entirely
new "breakout" or other transformational capability). In this way, the ICD can be used to
establish boundary conditions for the scope of alternatives to be considered in the
subsequent AoA. The AoA study guidance should be crafted to provide a fair balance
between focusing the AoA and ensuring that the AoA considers a robust set of novel
and imaginative alternatives.
The final AoA supporting a Milestone A decision is provided to the DCAPE not later than
60 days prior to the milestone decision review meeting. The evaluation criteria to be
addressed in this assessment are provided in DoD Instruction 5000.02, Enclosure 7,
paragraph 5, and are discussed further in section 3.3.4.1.
The AoA is used to identify the most promising end-state materiel solution, but the AoA
also can play a supporting role in crafting a cost-effective and balanced evolutionary
acquisition strategy. The alternatives considered in the AoA may include alternative
evolutionary paths, each path consisting of intermediate nodes leading to the proposed
end-state solution. In this way, the Materiel Solution Analysis can help determine the
best path to the end-state solution, based on a balanced assessment of technology
maturity and risk, and cost, performance, and schedule considerations (as shown in
Figure 3.3.2.1.F1). The rationale for the proposed evolutionary acquisition strategy
would be documented as part of the Technology Development Strategy.
        Figure 3.3.2.1.F1. Establishment of an Evolutionary Acquisition Strategy
The first major step leading to a successful AoA is the creation and coordination of a
well-considered analysis plan. The study plan should establish a roadmap of how the
analysis will proceed, and who is responsible for doing what. At minimum, the study
plan should facilitate full consideration of possible trade-offs among cost, schedule, and
performance objectives for each alternative considered, as well as an assessment of
whether or not the joint military requirement can be met in a manner that is consistent
with the cost and schedule objectives recommended by the JROC.
A recommended outline for the AoA plan would resemble the following:
    •    Introduction
             o Background
             o Purpose
             o Scope
    •    Ground Rules
             o Scenarios
             o Threats
             o Environment
             o Constraints and Assumptions
             o Timeframe
              o Excursions
    •    Alternatives
             o Description of Alternatives
             o Nonviable Alternatives
             o Operations Concepts
             o Sustainment Concepts
    •    Determination of Effectiveness Measures
             o Mission Tasks
             o Measures of Effectiveness
             o Measures of Performance
    •    Effectiveness Analysis
             o Effectiveness Methodology
             o Models, Simulations, and Data
             o Effectiveness Sensitivity Analysis
    •    Cost Analysis
             o Life-Cycle Cost Methodology
             o Additional Total Ownership Cost Considerations (if applicable)
             o Fully Burdened Cost of Delivered Energy (if applicable)
             o Models and Data
             o Cost Sensitivity and/or Risk Analysis
    •    Cost-Effectiveness Comparisons
             o Cost-Effectiveness Methodology
             o Displays or Presentation Formats
             o Criteria for Screening Alternatives
    •    Organization and Management
             o Study Team/Organization
             o AoA Review Process
             o Schedule
Of course, every AoA is unique, and the above outline may need to be tailored or
streamlined to support a given situation. Each point in the above outline is discussed
further in the next several sections.
The introduction to the AoA plan describes the developments that led to the AoA,
including prior relevant analyses (such as the Capabilities-Based Assessment). It should
reference the applicable capability needs document(s) and other pertinent documents,
and highlight the capability gaps being addressed through the applicable capability
needs. The introduction should describe the applicable AoA study guidance and any
other terms of reference. It also should provide a broad overview of the planned AoA
that describes in general terms the level of detail of the study, and the scope (breadth
and depth) of the analysis necessary to support the specific milestone decision.
3.3.3.2. Analysis of Alternatives (AoA) Study Plan-Ground Rules
The ground rules described in the analysis plan include the scenarios and threats, as
well as the assumed physical environment and any constraints or additional
assumptions. The scenarios are typically derived from defense planning scenarios and
associated joint operational plans, augmented by more detailed intelligence products
such as target information and enemy and friendly orders of battle. Environmental
factors that impact operations (e.g., climate, weather, or terrain) are important as well.
In addition, environment, safety, and occupational health factors associated with the use
of chemical and/or biological weapons may need to be considered as excursions to the
baseline scenario(s).
The study plan should describe what future timeframe, or timeframes, will be considered
in the analysis. Often, the time period(s) selected will be determined by the time
period(s) assumed in the DoD-approved planning scenario. However, there is some
flexibility on this point, especially if something significant, such as the deployment of a
new capability or the retirement of a legacy system, is projected to occur one or two
years after one of the time periods in the scenario. A common and desirable practice is
to consider two time periods of interest, say "near-term" and "far-term," separated by a
decade or so.
The AoA study plan should describe the planned analytic excursions to the baseline
scenarios and other major ground rules. Such excursions are strongly encouraged in
order to explore the impact of changing threat levels, warning times, involvement of
allied forces, or political constraints on basing or overflights, to name a few. These
excursions can be used to see whether there are any major issues that are critical to the
relative cost-effectiveness of the alternatives considered in the AoA.
The analysis plan also should document the range of alternatives to be addressed in the
analysis. In many cases, there will be a minimum set of alternatives required by the
initial analysis guidance. Additional direction during subsequent AoA reviews may insert
yet other alternatives. Practically, the range of alternatives should be kept manageable.
Selecting too few and selecting too many are both possible, but experience has shown
that selecting too many, which exceeds the available resources of the AoA study team, is
the greater concern. The number of alternatives can be controlled by avoiding similar but
slightly different alternatives and by eliminating alternatives early (due to factors such
as unacceptable life-cycle cost or inability to meet Key Performance Parameters). In
many studies, the first alternative (base case) is to retain one or more existing systems,
representing a benchmark of current capabilities. An additional alternative based on
major upgrades and/or service-life extensions to existing systems also may be
considered.
For each alternative, evaluating its effectiveness and estimating its life-cycle cost (or
total ownership cost, if applicable) requires a significant level of understanding of its
operations and support concepts. The operations concept describes the details of the
peacetime, contingency, and wartime employment of the alternative within projected
military units or organizations. It also may be necessary to describe the planned basing
and deployment concepts (contingency and wartime) for each alternative. The
sustainment concept for each alternative describes the plans and resources for system
training, maintenance, and other logistics support.
The alternatives considered in the AoA should address alternative concepts for
maintenance, training, supply chain management, and other major sustainment
elements. In this way, the AoA can identify the preferred materiel solution not only in
terms of traditional performance and design criteria (e.g., speed, range, lethality), but
also in terms of support strategy and sustainment performance. In
other words, the AoA should describe and include the results of the supportability
analyses and trade-offs conducted to determine the most cost-effective support concept
as part of the proposed system concept.
The analysis plan should describe how the AoA will establish metrics associated with
the military worth of each alternative. Military worth often is portrayed in AoAs as a
hierarchy of mission tasks, measures of effectiveness, and measures of performance.
Military worth is fundamentally the ability to perform mission tasks, which are derived
from the identified capability needs. Mission tasks are usually expressed in terms of
general tasks to be performed to correct the gaps in needed capabilities (e.g., hold
targets at risk, or communicate in a jamming environment). Mission tasks should not be
stated in solution-specific language. Measures of effectiveness are more refined and
they provide the details that allow the proficiency of each alternative in performing the
mission tasks to be quantified. Each mission task should have at least one measure of
effectiveness supporting it, and each measure of effectiveness should support at least
one mission task. A measure of performance typically is a quantitative measure of a
system characteristic (e.g., range, weapon load-out, logistics footprint, etc.) chosen to
enable calculation of one or more measures of effectiveness. Measures of performance
are often linked to Key Performance Parameters or other parameters contained in the
approved capability needs document(s). Also, measures of performance are usually the
measures most directly related to test and evaluation criteria.
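The coverage rule just described (each mission task supported by at least one measure of effectiveness, and each measure of effectiveness tied to at least one mission task) can be checked mechanically once the hierarchy is tabulated. The following sketch illustrates the idea; the task and measure names are hypothetical and are not drawn from any particular AoA:

```python
# Illustrative check of the two-way coverage rule between mission tasks and
# measures of effectiveness (MOEs). All task and MOE names are hypothetical.

mission_tasks = {"hold targets at risk", "communicate in a jamming environment"}

# Map each MOE to the mission task(s) it supports.
moe_to_tasks = {
    "probability of target kill": {"hold targets at risk"},
    "message completion rate": {"communicate in a jamming environment"},
}

def check_coverage(tasks, moe_map):
    """Return (tasks lacking any MOE, MOEs supporting no listed task)."""
    covered = set().union(*moe_map.values()) if moe_map else set()
    uncovered_tasks = tasks - covered
    orphan_moes = {moe for moe, supported in moe_map.items()
                   if not (supported & tasks)}
    return uncovered_tasks, orphan_moes

uncovered, orphans = check_coverage(mission_tasks, moe_to_tasks)
# Both sets empty means the hierarchy satisfies the coverage rule.
```

A hierarchy that fails either check signals a gap in the analysis plan: an unmeasurable mission task, or a measure with no analytic purpose.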
The analysis plan spells out the analytic approach to the effectiveness analysis, which is
built upon the hierarchy of military worth, the assumed scenarios and threats, and the
nature of the selected alternatives. The analytic approach describes the level of detail at
various points of the effectiveness analysis. In many AoAs involving combat operations,
the levels of effectiveness analysis can be characterized by the numbers and types of
alternative and threat elements being modeled. A typical classification would consist of
four levels: (1) system performance, based on analyses of individual components of
each alternative or threat system, (2) engagement, based on analyses of the interaction
of a single alternative and a single threat system, and possibly the interactions of a few
alternative systems with a few threat systems, (3) mission, based on assessments of
how well alternative systems perform military missions in the context of many-on-many
engagements, and (4) campaign, based on how well alternative systems contribute to
the overall military campaign, often in a joint context. For AoAs involving combat
support operations, the characterization would need to be modified to reflect the nature of the
support. Nevertheless, most AoAs involve analyses at different levels of detail, where
the outputs of the more specialized analysis are used as inputs to more aggregate
analyses. At each level, establishing the effectiveness methodology often involves the
identification of suitable models (simulation or otherwise), other analytic techniques, and
data. This identification primarily should be based on the earlier selection of measures
of effectiveness. The modeling effort should be focused on the computation of the
specific measures of effectiveness established for the purpose of the particular study.
Models are seldom good or bad per se; rather, models are either suitable or not suitable
for a particular purpose.
It also is important to address excursions and other sensitivity analyses in the overall
effectiveness analysis. Typically, a few critical assumptions drive the
results of the analysis, and it is important to understand and point out how variations in
these assumptions affect the results. As one example, in many cases the assumed
performance of a future system is based on engineering estimates that have not been
tested or validated. In such cases, the effectiveness analysis should describe how
sensitive the mission or campaign outcomes are to the assumed performance
estimates.
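A sensitivity excursion of this kind can be as simple as recomputing a mission outcome across a band of values for the untested performance assumption. The sketch below illustrates the pattern; the outcome model and all numbers are purely notional:

```python
# Notional sensitivity sweep over an unvalidated performance assumption.
# The outcome model and the baseline value are hypothetical.

def mission_success_rate(detection_range_km):
    """Notional outcome model: success saturates as sensor range grows."""
    return min(1.0, detection_range_km / 200.0)

baseline = 150.0   # engineering estimate (km), not yet tested or validated
for factor in (0.8, 0.9, 1.0, 1.1, 1.2):
    r = baseline * factor
    print(f"assumed range {r:6.1f} km -> mission success {mission_success_rate(r):.2f}")
```

If the outcome changes sharply across the band, the assumption is a critical driver and should be highlighted to decision makers.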
The AoA plan also describes the approach to the life-cycle cost (or total ownership cost;
see section 3.1.5, if applicable) analysis. The cost analysis normally is performed in
parallel with the operational effectiveness analysis and is equally important to the
overall AoA process. It estimates the total life-cycle cost (or total ownership cost) of
each alternative, and its results are later combined with the operational effectiveness
analysis to portray cost-effectiveness comparisons. What is important to emphasize is
that the cost analysis will be a major effort that will demand the attention of experienced,
professional cost analysts.
The principles of economic analysis apply to the cost analysis in an AoA. Although the
cost estimates used in an AoA originally are estimated in constant dollars, they should
be adjusted for discounting (time value of money), accounting for the distribution of the
costs over the study time period of interest. In addition, the cost estimates should
account for any residual values associated with capital assets that have remaining
useful value at the end of the period of analysis. Further guidance on economic analysis
is provided in DoD Instruction 7041.3, "Economic Analysis for Decisionmaking."
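The discounting and residual-value adjustments described above can be illustrated with a short present-value calculation. The discount rate, cost stream, and residual value below are hypothetical; actual rates and procedures should be taken from the guidance cited above:

```python
# Notional present-value adjustment of a constant-dollar cost stream, with a
# residual-value credit at the end of the period. All numbers are hypothetical.

def present_value(costs, rate, residual_value=0.0):
    """Discount a list of annual constant-dollar costs (year 0 first) and
    subtract the discounted residual value remaining at end of period."""
    pv = sum(c / (1.0 + rate) ** t for t, c in enumerate(costs))
    pv -= residual_value / (1.0 + rate) ** len(costs)
    return pv

annual_costs = [100.0, 50.0, 50.0, 50.0]   # $M, constant dollars
pv = present_value(annual_costs, rate=0.03, residual_value=20.0)
print(round(pv, 1))
```

Because discounting weights near-term costs more heavily, two alternatives with the same constant-dollar total can have different present values if their cost streams are distributed differently over time.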
The cost analysis should also describe the planned approach for addressing the Fully
Burdened Cost of Energy, for those AoAs where this issue is applicable. See section
3.3.4.1 for further information on this topic.
Typically, the next analytical section of the AoA plan deals with the planned approach
for the cost-effectiveness comparisons of the study alternatives. In most AoAs, these
comparisons involve alternatives that have both different effectiveness and cost, which
leads to the question of how to judge when additional effectiveness is worth additional
cost. In theory, cost-effectiveness comparisons would be cleanest if the analysis
structured the alternatives so that they all have equal effectiveness (the best alternative
is then the one with the lowest cost) or equal cost (the best alternative is then the one
with the greatest effectiveness). In actual practice, however, the ideal of equal-
effectiveness or equal-cost alternatives is often difficult or impossible to achieve due to
the complexity of AoA issues. A common method for dealing with
such situations is to provide a scatter plot of effectiveness versus cost. Figure
3.3.3.7.F1 presents a notional example of such a plot.
            Figure 3.3.3.7.F1. Sample Scatter Plot of Effectiveness versus Cost
Note that the notional sample display shown in Figure 3.3.3.7.F1 does not make use of
ratios (of effectiveness to cost) for comparing alternatives. Usually, ratios are regarded
as potentially misleading because they mask important information. The advantage to
the approach in the figure above is that it reduces the original set of alternatives to a
small set of viable alternatives for decision makers to consider.
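The reduction to a small set of viable alternatives is commonly done by screening out dominated alternatives: any alternative that costs more and is less effective than some other alternative can be set aside. A notional sketch, with hypothetical names, costs, and effectiveness scores:

```python
# Notional screen of a cost/effectiveness scatter down to the non-dominated
# ("viable") alternatives. All names and data points are hypothetical.

alternatives = {               # name: (life-cycle cost $B, effectiveness score)
    "Base case":    (2.0, 0.40),
    "Upgrade":      (2.5, 0.55),
    "New system A": (4.0, 0.50),   # costs more, less effective than Upgrade
    "New system B": (5.0, 0.80),
}

def non_dominated(alts):
    """Keep alternatives not dominated by any other (another alternative with
    lower-or-equal cost AND higher-or-equal effectiveness, strictly better in one)."""
    keep = {}
    for name, (c, e) in alts.items():
        dominated = any(
            (c2 <= c and e2 >= e) and (c2 < c or e2 > e)
            for n2, (c2, e2) in alts.items() if n2 != name
        )
        if not dominated:
            keep[name] = (c, e)
    return keep

viable = non_dominated(alternatives)
```

The surviving set traces the efficient frontier of the scatter plot; the choice among these remaining alternatives is a decision-maker judgment about whether the added effectiveness is worth the added cost.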
Finally, the AoA plan should address the AoA study organization and management.
Often, the AoA is conducted by a working group (study team) led by a study director
and staffed appropriately with a diverse mix of military, civilian, and contractor
personnel. Program offices or similar organizations may provide assistance or data to
the AoA study team, but (per DoD Instruction 5000.02, Enclosure 7) the responsibility
for the AoA may not be assigned to a program manager, and the study team members
should not reside in a program office. In some cases, the AoA may be assigned to an
in-house analytic organization, a federally funded research and development center, or
some other similar organization.
The AoA study team is usually organized along functional lines into panels, with a chair
for each panel. Typical functional areas for the panels could be threats and scenarios,
technology and alternatives (responsible for defining the alternatives), operations and
support concepts (for each alternative), effectiveness analysis, and cost analysis. In
many cases, the effectiveness panel occupies the central position and integrates the
work of the other panels. The study plan also should describe the planned oversight and
review process for the AoA. It is important to obtain guidance and direction from senior
reviewers with a variety of perspectives (operational, technical, and cost) throughout the
entire AoA process.
Normally, the final results of the AoA are first presented as a series of briefings. For
potential and designated major defense acquisition programs (Acquisition Category
(ACAT) I) and major automated information systems (ACAT IA), the final AoA results
are provided to the Office of the Director, Cost Assessment and Program Evaluation
(CAPE), no later than 60 days prior to the milestone decision meeting (Defense
Acquisition Board or Information Technology Acquisition Board review). Providing
emerging results to CAPE prior to the final briefing is wise to ensure that there are no
unexpected problems or issues. For other programs, the AoA results should be
provided to the DoD Component entity equivalent to CAPE, if applicable. In any case,
the final AoA presentation should address all of the important aspects of the study plan
and should support the AoA findings. In particular, all of the stated AoA
conclusions and findings should follow logically from the supporting analysis.
Having received the final AoA briefing(s), the CAPE evaluates the AoA and provides an
independent assessment to the Head of the DoD Component (or the Principal Staff
Assistant) and to the Milestone Decision Authority. DoD Instruction 5000.02, Enclosure
7, provides the evaluation criteria for this assessment. According to the Instruction, the
CAPE, in collaboration with the OSD and Joint Staff, shall assess the extent to which
the AoA:
The recommended template for the AoA study plan provided in Section 3.3.3 provides
considerable guidance for conducting an AoA that would be responsive to the first five
assessment criteria.
For the issue of technology risk and maturity, Section 3.3.2.1 provides a suggested
approach where the AoA can help craft a cost-effective evolutionary acquisition strategy
that is based on a balanced assessment of technology maturity and risk, as well as
cost, performance, and schedule considerations.
For the issue of energy efficiency (applicable to tactical systems with end items that
create a demand for delivered fuel or other forms of energy), Section 3.1.6 describes
the analytic construct known as the Fully Burdened Cost of Delivered Energy; the
Department now intends for this construct to play a major role in applicable AoAs.
For the issue of system training, the AoA should consider alternatives that provide for
the individual, collective, and joint training for system operators, maintainers, and
support personnel. The training system includes simulators and other training
equipment, as well as supporting material such as computer-based interactive
courseware or interactive electronic technical manuals. Where possible, the alternatives
should consider options to exploit the use of new learning techniques, simulation
technology, embedded training (i.e., training capabilities built into, strapped onto, or
plugged into operational systems) and/or distributed learning to promote the goals of
enhancing user capabilities, maintaining skill proficiencies, and reducing individual and
collective training costs. Further information on system training is provided in Section
6.3.3. In addition to addressing the assessment criteria explicitly identified in DoD
Instruction 5000.02, Enclosure 7, the AoA should also address alternative concepts for
maintenance, supply chain management, and other sustainment elements (see Chapter
5 of this Guidebook).
3.3.4.2. Analysis of Alternatives (AoA) Final Report
Usually, in addition to a final briefing, the AoA process and results are documented in a
written final report. The report typically is not published formally by the time of the
program milestone decision review, due to schedule constraints. However, the report
nevertheless may be important to the historical record of the program, since the report
serves as the principal supporting documentation for the AoA. The report also may
serve as a reference source for analysts conducting future AoAs. The final report can
follow the same format as the study plan, with the addition of these sections:
    •    Effectiveness Analysis
            o Effectiveness Results
    •    Cost Analysis
            o Life-Cycle Cost (or Total Ownership Cost, if applicable) Results
    •    Cost-Effectiveness Comparisons
            o Cost-Effectiveness Results
            o Assessment of Preferred Alternative(s)
By following the same format, much of the material from the (updated) study plan can
be used in the final report.
DoD Instruction 5000.02, Enclosure 4, Table 2-1 and Table 3, requires an AoA for MAIS
programs at milestone decisions. Much of the discussion on AoAs provided in the
earlier sections of the Guidebook is more applicable to weapon systems, and needs to
be modified somewhat for MAIS programs. This section discusses AoA issues for MAIS
programs. The AoA should include a discussion of whether the proposed program (1)
supports a core/priority mission or function performed by the DoD Component, (2)
needs to be undertaken because no alternative private sector or governmental source
can better support the function, and (3) supports improved work processes that have
been simplified or otherwise redesigned to reduce costs, improve effectiveness, and
make maximum use of commercial off-the-shelf technology. The analysis should be tied
to benchmarking and business process reengineering studies (such as analyses of
simplified or streamlined work processes, or outsourcing of non-core functions).
For all MAIS program AoAs, one alternative should be the status quo alternative as
used in the Economic Analysis, and one alternative should be associated with the
proposed MAIS program. Other possible alternatives could be different system,
network, and/or data architectures, or they might involve different options for the
purchase and integration of commercial-off-the-shelf products, modifications, and
upgrades of existing assets, or major in-house development.
Most likely, the effectiveness analysis in a MAIS program AoA will not involve scenario-
based analysis as is common for the weapon system AoAs. The effectiveness analysis
for a MAIS program should be tied to the organizational missions, functions, and
objectives that are directly supported by the implementation of the system being
considered. The results of the AoA should provide insight into how well the various
alternatives support the business outcomes that have been identified as the business
goals or capabilities sought. In some cases, it may be possible to express the
assessment of effectiveness across the alternatives in monetary terms, and so
effectiveness could be assessed as benefits in the framework for the Economic
Analysis. In other cases, the effectiveness might be related to measurable
improvements to business capabilities, or to better or timelier management information
leading to improved decision-making, where it can be difficult or impossible to quantify
the benefits. In these cases, a common approach is to portray effectiveness by the use
of one or more surrogate metrics. Examples of such metrics might be report generation
timeliness, customer satisfaction, or supplier responsiveness. In addition to
management information, the effectiveness analysis also should consider information
assurance and interoperability issues.
The cost analysis supporting the AoA should follow the framework of the Economic
Analysis. The life-cycle cost estimates of the alternatives considered in the AoA should
be consistent with and clearly linked to the alternatives addressed in the Economic
Analysis. Both the effectiveness analysis and the cost analysis should address the risks
and uncertainties for the alternatives, and present appropriate sensitivity analysis that
describes how such uncertainties can influence the cost-effectiveness comparison of
the alternatives.
The appropriate sponsor or domain owner should lead the development of the AoA for a
MAIS program. Experience has shown that the MAIS programs for which the sponsor or
domain owner engages with the Office of the Director, Cost Assessment and Program
Evaluation (CAPE) early in the process are much more likely to be successful than
those that select a preferred alternative before contacting CAPE or before completing
the AoA.
The DoD Component performing the AoA should develop a study plan that addresses
the AoA study guidance, as applicable. At a minimum, the study plan should address
the following topics:
    g. Cost-Effectiveness Comparisons
    h. Risk & Sensitivity Analysis
            1. Mission
            2. Technology
            3. Programmatic, to include funding
    i.   Study Organization and Management
    j.   Schedule, with associated deliverables
Milestone A, certification at Milestone B, before any decision to enter into low-rate initial
production or full-rate production, and in advance of certification following critical cost
growth. An ICE is required for MAIS programs that have experienced a critical change.
An ICE may be conducted by DCAPE for MDAPs and MAIS programs for which
USD(AT&L) is the MDA at any time considered appropriate by DCAPE or upon the
request of the USD(AT&L).
For ACAT ID programs, DCAPE conducts the ICE (as described in Section 3.4.3), and
for ACAT IC programs, the appropriate Service Cost Center or Defense Agency
equivalent conducts the ICE. The Service Cost Centers are in the financial management
organizations of their respective military departments, and are outside of their
department's acquisition chain-of-command.
DCAPE and the Secretary of the Military Department concerned are required by
Congress to report certain elements of program cost risk for MDAP and MAIS
programs. For such programs, DCAPE and the Secretary of the Military Department
concerned (or the head of the Defense Agency concerned) must state the confidence
level used in establishing a cost estimate, the rationale for selecting the confidence
level, and ensure that the confidence level provides a high degree of confidence that the
program can be completed without the need for significant adjustment to program
budgets.
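The confidence level of a cost estimate is commonly read from the program's cost-risk distribution: a point estimate at, say, the 80th percentile implies an 80 percent chance, under the model's assumptions, that the final cost will not exceed the estimate. A notional sketch using a simulated cost distribution (the distribution and the point estimate below are hypothetical):

```python
import random

# Notional reading of a confidence level from a simulated cost-risk
# distribution. The distribution parameters and estimate are hypothetical.

random.seed(1)
# 10,000 simulated total-cost outcomes ($M) from a notional risk model
outcomes = [random.gauss(1000.0, 100.0) for _ in range(10_000)]

def confidence_level(point_estimate, simulated_costs):
    """Fraction of simulated outcomes at or below the point estimate."""
    return sum(c <= point_estimate for c in simulated_costs) / len(simulated_costs)

estimate = 1084.0   # hypothetical estimate near the 80th percentile
level = confidence_level(estimate, outcomes)
print(f"confidence level: {level:.0%}")
```

In practice the distribution comes from a formal cost-risk analysis rather than a toy simulation, but the percentile reading works the same way.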
The confidence level disclosure shall be included in the ADM approving the APB; in any
other cost estimates for MDAPs or MAIS programs prepared in association with the
estimates prepared in accordance with Section 3.4.1, above; and for MDAPs, in the
next Selected Acquisition Report prepared in accordance with 10 U.S.C. 2432, or for
MAIS programs, in the next quarterly report prepared in accordance with 10 U.S.C.
2445c.
DCAPE reviews all cost estimates and cost analyses conducted in conjunction with
MDAPs and MAIS programs. In order to accomplish this, 10 U.S.C. 2334(b) requires
that DCAPE promptly receive the results of all cost estimates and analyses conducted
by military departments and Defense Agencies (together, "DoD Component Cost
Estimates").
Each DoD Component establishes a DoD Component-level cost position for all MDAPs
and MAIS programs at milestone reviews. To support the Department's full funding
policy for acquisition programs (see section 3.2.3), as well as statutory certifications and
regulatory requirements, the DoD Component is expected to fully fund the program to
this cost position in the current President's Budget Future Years Defense Program
(FYDP), or commit to full funding of the cost position in the next President's Budget
FYDP, with identification of specific offsets to address any funding shortfalls that may
exist in the current FYDP. In addition, the appropriate Deputy Assistant Secretary of the
Military Department for Cost and Economics (or defense agency equivalent) signs for
the DoD Component-level cost position, and the DoD Component Acquisition Executive
and the Component Chief Financial Officer endorse and certify that the FYDP fully
funds the program consistent with the DoD Component-level cost position. This policy
was promulgated in the OSD Memorandum, "Required Signed and Documented
Component-level Cost Position for Milestone Reviews," dated March 12, 2009.
The Office of Cost Assessment (CA), within the Office of Cost Assessment and
Program Evaluation (CAPE), receives the results of and reviews all cost estimates and
cost analyses and associated studies conducted by the DoD Components for major
defense acquisition programs (MDAPs) and major automated information system
(MAIS) programs and has timely access to any records and data in the Department.
During the CA review process, CA staff may engage in discussion with the DoD
Components regarding any discrepancies related to the cost estimates and comment on
deficiencies regarding the methodology or execution of cost estimates. Furthermore, the
Director, CAPE, is authorized to concur with the choice of a cost estimate used to
support the acquisition program baseline (APB).
Although CA will provide periodic reviews, certain reviews are regular and required. For
programs subject to CAPE review (normally Acquisition Category ID) that are
approaching Milestone decisions or the Full-Rate Production Decision Review, CA staff
conducts a comprehensive review, establishes a formal position on a program's life-
cycle cost, and advises the Milestone Decision Authority accordingly. The CA review
consists of preparation of an independent life-cycle cost estimate as well as an
assessment of the DoD Component Cost Estimate. This section provides a brief
summary of the major events associated with the CA review and provides additional
information on the procedures for each event. A more comprehensive description of the
Cost Assessment review process is found in DoD 5000.04-M, "DoD Cost Analysis
Guidance and Procedures," Section 2.
Table 3.4.3.1.T1 provides a brief summary of the major events and timelines associated
with a Cost Assessment review leading to a Defense Acquisition Board milestone
decision review:
     Event                                                    Date
     •   Cost Assessment Review Kick-off Meeting              180 days before
             o Draft Cost Analysis Requirements               Overarching Integrated
               Description (CARD) Delivered by                Product Team (OIPT)
               DoD Component                                  meeting
3.4.3.1.1. Cost Assessment Review Events - 180 Days before Overarching
Integrated Product Team (OIPT) Meeting
The Cost Assessment (CA) review process begins roughly six months before the
planned Defense Acquisition Board milestone review. At that time, the draft Cost
Analysis Requirements Description (CARD) is provided to CA for review. The CARD is
used to describe formally the acquisition program for purposes of preparing both the
DoD Component Cost Estimate and the CA independent cost estimate. CA staff
promptly evaluates the CARD for completeness and consistency with other program
documents (such as capability needs documents, acquisition strategy, etc.). As part of
this evaluation, CA staff may require access to privileged information such as contractor
proposals that are proprietary or source selection sensitive. CA staff will follow all
necessary procedures to ensure that the integrity of the privileged information is
protected.
At roughly the same time that the draft CARD is submitted, CA staff announces its
upcoming review in a formal memo. The memo initiates a working-level kick-off meeting
that is held with representatives from the program office cost estimating team, the CA
independent cost estimate team, and other interested parties (typically DoD Component
or OSD staff members). The purpose of the meeting is to discuss requirements and
issues for the upcoming milestone review, the scope of the cost estimates, and ground
rules and assumptions on which the estimates will be based. Much of the discussion will
focus on material provided in the draft CARD. This ensures that both cost teams have a
common understanding of the program to be costed. In addition, ground rules are
established for CA interactions with the program office. CA staff also coordinates any
travel or visit requirements with appropriate DoD Component points of contact.
3.4.3.1.2. Cost Assessment Review Events-45 Days before Overarching
Integrated Product Team (OIPT) Meeting
Per DoD Instruction 5000.02, Enclosure 7, section 4, Cost Assessment (CA) staff will
brief the preliminary independent Life-Cycle Cost Estimate (LCCE) to the program
manager (PM) 45 days before the OIPT meeting. In a similar timeframe, the program
office should provide draft documentation of its estimate to the CA staff, and if
applicable, the DoD Component should provide draft documentation of the DoD
Component Cost Position. The CA report eventually submitted to the OIPT and to the
Defense Acquisition Board membership provides not only the CA independent cost
estimate but also an evaluation of the DoD Component Cost Estimate. It is therefore
important for the DoD Components to submit well-documented cost estimates that are
ready for review.
The specific standards for the cost documentation are described in DoD 5000.04-M,
"DoD Cost Analysis Guidance and Procedures," Sections 1 and 2. In general, the
documentation should be sufficiently complete and well organized that a cost
professional could replicate the estimate, given the documentation. Along with the draft
documentation of the program office cost estimate, the DoD Component provides an
updated (and final) Cost Analysis Requirements Description to CA staff. At the same
time that the documents are provided, CA staff will provide feedback and identify any
emerging cost issues to the program manager and DoD Component staff, in part based
on CA work to date on its independent cost estimate.
3.4.3.1.3. Cost Assessment Review Events-21 Days before Overarching
Integrated Product Team (OIPT) Meeting
Per DoD Instruction 5000.02, Enclosure 7, section 4, CA staff will brief the results of the
independent cost estimate to the program manager 21 days before the OIPT meeting.
This is normally handled as part of the CA review meeting. At this time, the program
office should provide its final estimate to the Cost Assessment staff, and the DoD
Component should provide the final DoD Component Cost Position. Other invited OSD
and Joint Staff representatives may attend these reviews/exchanges. A typical
presentation format for the Cost Assessment review meeting would include:
In addition, at the CA meeting, CA staff provides any further feedback to the program
office and DoD Component staff. If appropriate, CA staff will provide a presentation of
the major areas of difference between its independent cost estimate and the program
office cost estimate and/or DoD Component cost position.
3.4.3.1.4. Cost Assessment Review Events-10 Days before Overarching
Integrated Product Team (OIPT) Meeting
At least 10 days before the OIPT meeting, the DoD Component provides final
documentation of its cost estimate (program office cost estimate, or DoD Component
Cost Position where applicable).
3.4.3.1.5. Cost Assessment Review Events-3 Days before Overarching Integrated
Product Team (OIPT) Meeting
Cost Assessment (CA) staff's final report is delivered to the OIPT leader at least three
days before the OIPT meeting. Immediately thereafter, it is distributed to the OIPT
members and is available to the DoD Component staff. The expectation is that any
issues have already emerged in prior discussions and that the final CA report should not
contain any surprises. The report normally is two to three pages and typically includes
the following:
Per DoD Instruction 5000.02, Enclosure 2, section 5.c.(5), the DoD Component at
Milestone A submits a cost estimate for the proposed materiel solution(s). Also, per 10
U.S.C. 2334, the Director of Cost Assessment and Program Evaluation (DCAPE)
conducts an independent cost estimate in advance of Milestone A certification. In order
to facilitate these estimates, the cost estimating procedures at Milestone A will track
those at the other milestone decision points. This includes the required preparation of
a Cost Analysis Requirements Description (CARD) (see below), although the early stage
of the program development will necessitate less specificity in many of the required
elements within the CARD.
The actual process and timing leading to the DoD Component estimate may vary
among programs, and therefore, a tailored approach should be developed and
proposed. Early in the Materiel Solution Analysis Phase, the Program Manager and
DoD Component staff should work with the OSD Office of Cost Assessment (CA) and
Acquisition Resources & Analysis staffs to develop a plan and schedule for delivery of
the cost estimate to support the upcoming Milestone A review. The plan is subject to
approval of the Milestone Decision Authority (MDA).
The DoD Component Cost Estimate, in addition to the DCAPE independent cost
estimate, is used to support the MDA certification requirements for Milestone A (10
U.S.C. 2366a). The emphasis for the Milestone A cost estimate is to provide costing
adequate to support the selection of the preferred materiel solution(s) identified by the
Analysis of Alternatives, and to support a determination by the MDA that current funding
for the Technology Development Phase (required technology development, competitive
prototyping, and possibly preliminary design of the end-item system) is adequate. The
Milestone A cost estimate is a complete estimate of the system life-cycle cost. However,
for the costs associated with the acquisition phases beyond Technology Development
(i.e., Engineering and Manufacturing Development, Production and Deployment, and
Operations and Support), the Milestone A cost estimate typically would not have the
same level of rigor or fidelity as will later cost estimates (prepared for milestones B and
beyond). Although the cost estimate addresses the complete life-cycle cost (since it
must support the Analysis of Alternatives process), only the program development and
procurement costs are subject to certification.
Note that if the certified cost estimate grows by at least 25 percent during the
Technology Development Phase, then the Program Manager must notify the MDA of the increase.
The MDA in turn consults with the Joint Requirements Oversight Council to reassess
program requirements and the military need(s) for the system. See DoD Instruction
5000.02, Enclosure 2, section 5.c.(3) for further guidance.
The Cost Analysis Requirements Description (CARD) formally describes the acquisition
program for purposes of preparing both the DoD Component Cost Estimate and the
Cost Assessment independent cost estimate. DoD Instruction 5000.02 specifies that for
major defense acquisition programs, the CARD will be provided in support of major
milestone decision points (Milestone B, Milestone C, or the full-rate production decision
review). In addition, for Major Automated Information Systems, the CARD is prepared
whenever an Economic Analysis is required. For other acquisition programs, the
preparation of a CARD, or an abbreviated CARD-like document with appropriate
tailoring, is strongly encouraged to provide a written program description suitable to
support a credible life-cycle cost estimate.
The CARD is prepared by the program office and approved by the DoD Component
Program Executive Officer. For joint programs, the CARD includes the common
program agreed to by all participating DoD Components as well as all unique program
requirements of the participating DoD Components. DoD 5000.04-M, "DoD Cost Analysis
Guidance and Procedures," Chapter 1, provides further guidelines for CARD content.
    •    Summary of security or program protection features
    •    Summary of environment, safety, and occupational health considerations
    •    System milestone schedule
    •    Summary of acquisition plan or strategy
    •    Plans for system disposal
    •    Track to prior CARD
    •    Approved or proposed CSDR plan
For each topic listed in the suggested outline, the CARD should provide information and
data for the program to be costed. In addition, the CARD should include quantitative
comparisons between the proposed system and a predecessor and/or reference system
for the major topics, as much as possible. A reference system is a currently operational
or pre-existing system with a mission similar to that of the proposed system. It is often
the system being replaced or augmented by the new acquisition. For a program that is a
major upgrade to an existing weapon platform, such as an avionics replacement for an
operational aircraft, the new system would be the platform as equipped with the
upgrade, and the reference system would be the platform as equipped prior to the
upgrade. For Major Automated Information System programs, the CARD format
described above may need to be tailored.
The level of detail provided in the CARD will depend on the maturity of the program.
Programs at the Pre-Engineering and Manufacturing Development (Pre-EMD) Review are less
well-defined than programs at Milestone C or at full-rate production. In cases where
there are gaps or uncertainties in the various program descriptions, these uncertainties
should be acknowledged as such in the CARD. This applies to uncertainties in either
general program concepts or specific program data. For uncertainties in program
concepts, nominal assumptions should be specified for cost-estimating purposes. For
example, if the future depot maintenance concept were not yet determined, it would be
necessary for the CARD to provide nominal (but specific) assumptions about the
maintenance concept. For uncertainties in numerical data, ranges that bound the likely
values (such as low, most likely and high estimates) should be included. In general,
values that are "to be determined" are not adequate for cost estimating. Dealing with
program uncertainty in the CARD greatly facilitates subsequent sensitivity or
quantitative risk analyses in the life-cycle cost estimate.
The last section of the CARD should contain a copy of the approved Cost and Software
Data Reporting plan (see section 3.4.4.2), if available. If the plan has not yet been
approved, then the proposed plan should be included as part of the CARD.
Clearly, much of the information needed for the CARD is often available in other
program documents. The CARD should stand alone as a readable document but can
make liberal use of appropriate references to the source documents to minimize
redundancy and effort. In such cases, the CARD should briefly summarize the
information pertinent to cost in the appropriate section of the CARD and provide a
reference to the source document. DoD Instruction 5000.02, Enclosure 7, paragraph 2,
states that the program manager shall synchronize preparation of the CARD with other
program documentation so that the final CARD is consistent with other final program
documentation. The source documents should be readily available to the program office
and independent cost estimating teams or can be provided as an appendix to the
CARD. Many program offices provide controlled access to source documents through a
web site (perhaps at a ".mil" web address or on the Secret Internet Protocol Router
Network).
The CARD should be consistent with any contractual solicitations, such as a Request
for Proposal or any accompanying document (e.g., System Requirements Document).
Another issue for the CARD at the Pre-EMD Review can occur when the Technology
Development Phase maintains two or more competing contractor teams (that are
producing prototypes of the system) up to and through the Preliminary Design Review
(PDR). In this situation, there
are two possible approaches for the preparation of the CARD. If the competing teams
are using similar technologies and designs, then a single generic CARD, based on a
nominal Government design, may be used to prepare a single independent cost
estimate (ICE) for the nominal
design. If the competing teams have significantly different technologies or designs, then
it may be necessary to prepare offeror-specific CARDs, which in turn may be used to
prepare multiple ICEs. For programs with competing prototype teams approaching a
Pre-EMD Review, the DoD Component should discuss its proposed use of a single
generic CARD, or use of multiple offeror-specific CARDs, with the Cost Assessment
staff at the Kick-Off Review meeting (see section 3.4.3.1.1), if not earlier.
The Cost and Software Data Reporting (CSDR) system is the primary means that DoD
uses to collect, and program managers use to report, actual cost, software, and related
business data on Acquisition
Category (ACAT) I, ACAT IA, pre-MDAP, pre-MAIS, and sustainment defense
contracts. The repository of collected data serves as the primary contract cost and
software data repository for most DoD resource analysis efforts, including cost database
development, applied cost estimating, cost research, program reviews, analysis of
alternatives, and life cycle cost estimates. The two principal components of CSDR are
contractor cost data reporting (CCDR) and software resources data reporting (SRDR).
The Deputy Director, Cost Assessment establishes procedural guidance and reporting
formats for the CSDR system and monitors implementation throughout the Department
of Defense. DoD 5000.04-M-1, "Cost and Software Data Reporting (CSDR) Manual,"
establishes the policies and procedures for CSDR and provides report formats and
definitions, specific report examples, and other related information. The CSDR Manual
is available on the Defense Cost and Resource Center (DCARC) web site. Access to
CSDR data is readily provided by DCARC to registered DoD government cost analysts
and sponsored support contractors possessing Non-Disclosure Agreements.
The CCDR system collects data on the development, production, and sustainment costs
incurred by contractors in performing DoD ACAT I, ACAT IA, pre-MDAP, pre-MAIS, and
sustainment program contracts. DoD Instruction 5000.02, Enclosure 4, Table 4,
establishes the CCDR requirements for Acquisition Category I and IA contracts and
sub-contracts, regardless of contract type. Detailed procedures and other
implementation guidance are found in DoD 5000.04-M-1, "Cost and Software Data
Reporting (CSDR) Manual."
CCDR focuses on the collection of actual total contract costs that are subdivided into
standard categories for cost estimating purposes by Work Breakdown Structure (WBS),
functional categories, and resource elements. CCDR reports provide a display of
incurred costs to date and estimated incurred costs at completion by elements of the
WBS, with nonrecurring costs and recurring costs separately identified. In some cases,
CCDR reports can display incurred costs to date and estimated incurred costs at
completion by functional category (manufacturing labor, engineering, etc.). Where
appropriate, a functional category is broken out by direct labor hours, direct material,
overhead, and other indirect costs.
CCDR reports are required on all major contracts and subcontracts, regardless of
contract type, for Acquisition Category I and IA programs and pre-Major Defense
Acquisition Program and pre-Major Automated Information System programs
subsequent to Milestone A approval, valued at more than $50 million in then-year
dollars. CCDRs are not required for contracts priced below $20 million in then-year
dollars. The
CCDR requirement on high-risk or high-technical-interest contracts priced between $20
and $50 million is left to the discretion of the DoD Program Manager (PM) based upon
the advice of the Cost Working-level Integrated Product Team (CWIPT). These
requirements must also be approved by the Deputy Director, Cost Assessment. CCDRs
are not required for procurement of commercial systems provided the DoD PM requests
and obtains approval for a reporting waiver from the Deputy Director, Cost Assessment.
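The CCDR dollar thresholds above reduce to a small decision rule. The sketch below is illustrative only (the function name is hypothetical, and a contract in the discretionary band still requires CWIPT advice and Deputy Director, Cost Assessment approval, as stated above):

```python
def ccdr_requirement(contract_value_millions):
    """Classify a contract's CCDR reporting requirement by then-year value.

    Thresholds follow the text above: more than $50M required, below $20M
    not required, and $20M to $50M at the PM's discretion for high-risk or
    high-technical-interest contracts.
    """
    if contract_value_millions > 50:
        return "required"
    if contract_value_millions < 20:
        return "not required"
    return "PM discretion (CWIPT advice; Deputy Director, CA approval)"

print(ccdr_requirement(75))   # prints "required"
print(ccdr_requirement(10))   # prints "not required"
```

This does not capture the commercial-system waiver, which is handled case by case rather than by dollar value.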
CCDR reporting shall normally be at level 3 (level 4 for space contracts) of the Contract Work
Breakdown Structure (WBS) and determined separately for each prime contractor and
subcontractor that meets the reporting thresholds. Reporting at levels 4 and below shall
be required on prime contracts or subcontracts containing WBS elements that address
high-risk, high-value, or high-technical-interest areas of a program. Such reporting
applies only if the CWIPT proposes and the Deputy Director, Cost Assessment
approves.
Initial reports, if required, are due within 60 days following the completion of the
integrated baseline review when a pre-award or post-award conference is held. If a
conference is not held, the initial report, if required, is due within 180 days of contract
award. For subsequent reporting on development contracts, reporting contractors
typically shall submit CCDR reports after such major events as first flight or completion
of prototype, before major milestones, and upon contract completion. Annual reporting
is allowed if requested and approved by the Deputy Director, Cost Assessment. For
production, reporting contractors normally shall submit CCDR reports upon the delivery
of each annual lot for all weapon systems. Due to the extended construction process for
ships, CCDR reports are also required for the total number of ships in each buy and for
each individual ship within that buy at three intervals: an initial report (total buy and
individual ships); the mid-point of first ship construction (individual ships only), or
another relevant timeframe as the CWIPT determines; and after final delivery (total buy
and individual ships).
The related instructions are included in the DIDs for these forms as follows:
The forms, including the Microsoft Excel templates, and the link to the official DIDs are
shown on the DCARC web site. The DCARC also provides software that will produce
the forms from an Excel flat file.
The SRDR system collects software metrics data to supplement the actual Contractor
Cost Data Reporting (CCDR) data in order to provide a better understanding and
improved estimating of software intensive programs. DoD Instruction 5000.02,
Enclosure 4, Table 4, establishes SRDR requirements for Acquisition Category I and IA
contracts and sub-contracts, regardless of contract type. Detailed procedures and other
implementation guidance are found in DoD 5000.04-M-1, "Cost and Software Data
Reporting (CSDR) Manual."
SRDRs are required on all major contracts and subcontracts, regardless of contract
type, for contractors developing/producing software elements within Acquisition
Category I and IA programs and pre-Major Defense Acquisition Program and pre-Major
Automated Information System programs subsequent to Milestone A approval for any
software development element with a projected software effort greater than $20 million
in then-year dollars. The SRDR requirement on high-risk or high-technical-interest contracts
priced below $20 million is left to the discretion of the DoD Program Manager (PM)
based upon the advice of the Cost Working-level Integrated Product Team (CWIPT).
These requirements must also be approved by the Deputy Director, Cost Assessment.
The program office, in coordination with the CWIPT, may choose to combine a set of
smaller releases within a contract into a single release for reporting purposes. Separate
software element developments within a single contract may be reported on separately
or may be aggregated at the discretion of the DoD PM based upon the advice of the
CWIPT.
Within 60 days of contract award, the software developer shall submit an SRDR Initial
Developer Report for the entire software product, customized as agreed to by the DoD
PM in coordination with the CWIPT. The software developer also shall submit an SRDR
Initial Developer Report for each deliverable software release or element within 60 days
of the beginning of its development. In addition, the software developer shall submit an
"as built" SRDR Final Developer Report, customized as agreed to by the CWIPT, within
60 days after delivery of each software release or element to the U.S. Government.
SRDR reports consist of the sample SRDR formats, which are contained within the
report instructions as follows:
The instructions for the Initial Government Report can be found on the DCARC web
site. The instructions for the other two reports are contained in DIDs DI-MGMT-81739
and DI-MGMT-81740, respectively. Links to the official DIDs and the Microsoft Excel
templates are also found on the DCARC web site. Note that SRDR formats should be
tailored to the way the software developer performs its activities and the
related metrics it uses. The three sample SRDR formats are intended as the starting
point for developing tailored reports that capture the developer’s unique software
process.
CSDR data is collected and stored in a central repository, the Defense Automated Cost
Information Management System (DACIMS), maintained by the DCARC. DACIMS has
more than thirty-five years of contractor cost data. DACIMS access is easy and quick for
all authorized DoD users. The DCARC web site and Chapter 5 of the CSDR Manual,
DoD 5000.04-M-1, contain specific registration instructions.
DACIMS may be used to obtain cost data to estimate total program acquisition costs,
including work by both contractors and the U.S. Government; total program contract
costs, awarded and future, for a particular contractor; and individual contract costs.
Reporting Formats and Instructions. The CSDR system includes two formats and
instructions that apply to both CCDRs and SRDRs, four unique CCDRs, and three
unique SRDRs. The two common formats are shown in this section, while the unique reports are
covered in the separate CCDR and SRDR sections. The DD Form 2794, "Cost and
Software Data Reporting Plan" (commonly referred to as the "CSDR Plan") describes
the proposed collection of data by individual report, by work breakdown structure (WBS)
and reporting frequency. The plan must be approved by the Deputy Director, Cost
Assessment prior to issuance of a contract solicitation. The Deputy Director, Cost
Assessment, may waive the information requirements prescribed in Table 4 in
Enclosure 4 of DoDI 5000.02. The format for the Contract Work Breakdown Structure is
contained within the Data Item Description (DID) (DI-MGMT-81334, current edition).
The CSDR Plan format and instructions and the link to the official DID can be found at
the DCARC web site.
Training. The DCARC provides periodic CSDR training at various sites throughout
CONUS for both government and contractor personnel. DCARC strongly encourages
stakeholders to attend these training sessions and schedules classes to meet
stakeholder requirements. The training schedule and various training materials can also
be found at the DCARC web site.
Historical O&S cost data for currently fielded systems are available from the Visibility
and Management of Operating and Support Costs (VAMOSC) data system managed by
each DoD military service. The data can be displayed in several different formats,
including the Office of Cost Assessment standard cost element structure described
previously. Data can be obtained for entire systems, or at lower levels of detail.
VAMOSC provides not only cost data, but related non-cost data (such as operating
tempo or maintenance man-hours) as well. This type of data is useful for analogy
estimates (between proposed systems and appropriate predecessor or reference
systems) and for "bottom-up" engineering estimates (for fielded systems or
components, possibly adjusted for projected reliability and maintainability growth).
VAMOSC data should always be carefully examined before use in a cost estimate. The
data should be displayed over a period of a few years (not just a single year), and
stratified by different sources (such as major command or base). This should be done
so that abnormal outliers in the data can be identified, investigated, and resolved as
necessary.
To achieve visibility into the Operating and Support (O&S) costs of major fielded
weapon systems, DoD requires that each military service maintain a historical data
collection system that collects O&S data in a standard presentation format. The Office
of Cost Assessment provides policy guidance on this requirement, known as the
VAMOSC program, and monitors its implementation by each of the military services.
Each service has its own unique VAMOSC data system that tracks actual O&S cost
experience for major weapon systems. The data can be displayed by time frame, at
various levels of detail, and by functional elements of cost (such as depot maintenance,
fuel, consumable items, and so forth). Each VAMOSC system provides not only cost
data, but related non-cost data (such as system quantities, operating tempo, or
maintenance man-hours) as well. VAMOSC data can be used to analyze trends in O&S
cost experience for each major system, as well as to identify and assess major cost
drivers. In addition, VAMOSC data are important as a data source for cost estimates of
future systems, since cost estimates for future systems are often made by analogy to
appropriate predecessor systems. DoD 5000.04-M, "DoD Cost Analysis Guidance and
Procedures," Section 8, provides additional direction for VAMOSC.
3.5. Manpower Estimates
Manpower estimates serve as the authoritative source for out-year projections of active-
duty and reserve end-strength, civilian full-time equivalents, and contractor support
work-years. As such, references to manpower in other program documentation should
be consistent with the manpower estimate once it is finalized. In particular, the
manpower estimates should be consistent with the manpower levels assumed in the
final Affordability Analysis and the Cost Analysis Requirements Description (CARD).
The exact content of the manpower estimate is tailored to fit the particular program
under review. A sample format for the manpower estimate is displayed in Table
3.5.T1 below. In addition, the estimate should identify whether there are any resource shortfalls
(i.e., discrepancies between manpower requirements and authorizations) in any fiscal
year addressed by the estimate. Where appropriate, the manpower estimate should
compare manpower levels for the new system with those required for similar legacy
systems, if any. The manpower estimate also should include a narrative that describes
the scope of each functional area (operations, maintenance, support, and training), and
the methods, factors, and assumptions used to estimate the manpower for each
functional area. See section 6.3.1.2 and section 6.3.1.3 for further information
concerning manpower.
Table 3.5.T1. Sample Format for the Manpower Estimate (only the closing row
labels and the table notes were recoverable; the fiscal-year columns and the
earlier functional-area rows were lost in extraction):

             [earlier functional-area rows not recovered]
             Contractor
             Sub-Total (4)
             SUPPORT: Military
             Officers
             Enlisted
             Civilian
             Contractor
             Sub-Total (4)
             TRAIN: Military
             Officers
             Enlisted
             Civilian
             Contractor
             Sub-Total
             TOTAL

Notes:
1. Provide separate estimates for Active and Reserve Components for each Service.
2. Report manpower by fiscal year (FY), starting with initial fielding and continuing through retirement and disposal of the system (to include environmental clean-up).
3. Until fielding is completed.
4. Provide estimates for manpower requirements and authorizations. Provide deltas between requirements and authorizations for each fiscal year.
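The requirement-versus-authorization comparison called for in note 4 can be sketched with hypothetical fiscal-year numbers (none drawn from an actual program):

```python
# Illustrative sketch of the delta between manpower requirements and
# authorizations by fiscal year. All figures are invented placeholders.

requirements   = {"FY25": 1200, "FY26": 1250, "FY27": 1300}
authorizations = {"FY25": 1150, "FY26": 1250, "FY27": 1260}

# A resource shortfall exists in any year where authorizations fall
# below requirements.
for fy in requirements:
    delta = authorizations[fy] - requirements[fy]
    status = "shortfall" if delta < 0 else "met"
    print(f"{fy}: delta {delta:+d} ({status})")
```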
3.6. Major Automated Information Systems Economic Analysis
3.6.1. Introduction
3.6.2.2. Use of the Cost Analysis Requirements Description (CARD) for Major
Automated Information System (MAIS) Programs
3.6.2.3. Office of Cost Assessment and Program Evaluations CARD Review and
Assessment
3.6.1. Introduction
DoD Instruction 5000.02, Enclosure 4, Table 2-1, requires that an Economic Analysis be
performed in support of the Milestone A, Milestone B, and full-rate production decision
(or equivalent) reviews. The purpose of the Economic Analysis is to determine the best
MAIS program acquisition alternative by assessing the net costs and benefits of the
proposed MAIS program relative to the status quo. In general, the best alternative will
be the one that meets validated capability needs at the lowest life-cycle cost (measured
in net present value terms), and/or provides the most favorable return on investment.
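The NPV and ROI comparison criteria described above can be sketched numerically. This is a minimal illustration, not a prescribed method: the cash flows and the 3 percent discount rate are invented placeholders, and actual analyses should follow DCAPE advice on discounting.

```python
# Sketch of comparing a proposed MAIS alternative to the status quo using
# net present value (NPV) and return on investment (ROI). All dollar
# amounts and the discount rate are hypothetical assumptions.

def npv(cash_flows, rate):
    """Discount a list of annual amounts (year 0 first) to present value."""
    return sum(amount / (1 + rate) ** year
               for year, amount in enumerate(cash_flows))

def roi(benefits, costs, rate):
    """ROI as the NPV of benefits relative to the NPV of costs."""
    return npv(benefits, rate) / npv(costs, rate)

rate = 0.03  # assumed real discount rate

# Hypothetical life-cycle costs ($M) for the status quo and the proposed MAIS.
status_quo_costs = [10, 10, 10, 10, 10]
proposed_costs   = [25,  8,  5,  5,  5]

# Net benefits of the proposed program are its avoided status quo costs.
net_benefits = [sq - p for sq, p in zip(status_quo_costs, proposed_costs)]

print(f"Status quo NPV: {npv(status_quo_costs, rate):.1f}")
print(f"Proposed NPV:   {npv(proposed_costs, rate):.1f}")
```

With these made-up numbers the proposed alternative has the lower life-cycle cost in NPV terms despite its higher up-front investment, which is exactly the trade the Economic Analysis is meant to expose.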
Whenever an Economic Analysis is required, the DoD Component responsible for the
program also may be required to provide a DoD Component Cost Analysis, which is an
independent estimate of program life-cycle costs. Normally, the Economic Analysis is
prepared by the MAIS program office, and the DoD Component Cost Analysis is
prepared by an office or entity not associated with the program office or its immediate
chain of command. The need for a DoD Component Cost Analysis at Milestone A is
evaluated for each program in tailoring the oversight process.
For Acquisition Category IAM programs, both the Economic Analysis and the DoD
Component Cost Analysis are subject to independent review and assessment by the
Director, Cost Assessment and Program Evaluation (DCAPE).
The purpose of DCAPE's assessment is to provide the Milestone Decision Authority
with an independent determination that (1) the estimates of life-cycle costs and benefits
are reasonable, traceable, and reflect DoD policy and DCAPE guidance on the
consideration of life-cycle costs, (2) the return on investment calculation is valid, and (3)
the cost estimates are built on realistic program and schedule assumptions.
During the review process, DCAPE staff may engage in discussion with the DoD
Components regarding any discrepancies related to MAIS cost estimates and comment
on deficiencies in the methodology or execution of cost estimates. Furthermore,
DCAPE staff are authorized to concur with the choice of a cost estimate used to support
the acquisition program baseline (APB), as well as with the selection of a proper
confidence interval for the MAIS program.
DCAPE and the Secretary of the Military Department concerned are required by
Congress to report certain elements of program cost risk for MAIS programs. For such
programs, DCAPE and the Secretary of the Military Department concerned (or the head
of the Defense Agency concerned) must state the confidence level used in establishing
a cost estimate, the rationale for selecting the confidence level, and ensure that the
confidence level provides a high degree of confidence that the program can be
completed without the need for significant adjustment to program budgets.
The confidence level disclosure shall be included in the Acquisition Decision Memorandum (ADM) approving the APB and in
any other cost estimates for MAIS programs prepared in association with this section.
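The confidence-level idea above can be illustrated with a small sketch: if program cost risk has been simulated (here with a made-up triangular distribution), the cost budgeted at a given confidence level is simply that percentile of the simulated outcomes. The distribution parameters are hypothetical, not drawn from any program or from DCAPE guidance.

```python
# Hedged sketch: mapping a stated confidence level onto a simulated
# cost-risk distribution. The low/most-likely/high values are assumptions.
import random

random.seed(1)

# Hypothetical cost-risk simulation: $90M low, $100M most likely, $130M high.
samples = sorted(random.triangular(90, 130, 100) for _ in range(10_000))

def cost_at_confidence(sorted_samples, level):
    """Cost such that `level` fraction of outcomes fall at or below it."""
    index = min(int(level * len(sorted_samples)), len(sorted_samples) - 1)
    return sorted_samples[index]

for level in (0.50, 0.80):
    print(f"{level:.0%} confidence level: ${cost_at_confidence(samples, level):.1f}M")
```

Budgeting at a higher confidence level yields a higher cost position, which is the sense in which the chosen level provides "a high degree of confidence" that budgets will not need significant adjustment.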
The review process normally begins with a kick-off meeting held with DCAPE staff,
representatives from the Major Automated Information System (MAIS) program office,
the DoD Component Cost Analysis Team, and any DoD Component functional or
headquarters sponsors. The purpose of the meeting is to reach a common
understanding on the expectations for the upcoming activities and events leading to the
Information Technology Acquisition Board milestone review. As a starting point, the
DoD Component staff and/or sponsors' representatives should review the contents of
the most recently approved capability needs documents, and explain any prior analysis
(such as a Capabilities-Based Assessment) used to justify the need for a materiel
solution (that will be met by the MAIS program).
At the kick-off meeting, the DoD Component staff and/or sponsors' representatives also
should be prepared to explain the planned approach for the upcoming Economic
Analysis. To facilitate this dialogue, the MAIS program office should prepare and
provide a brief Economic Analysis development plan. The development plan should
document the organizational responsibilities, analytic approach, ground rules and
assumptions, and schedule for the economic analysis. The development plan should
identify the specific alternatives that will be compared in the Economic Analysis.
Normally, at least one alternative should be associated with the proposed MAIS
program, and one alternative should be associated with the status quo (no
modernization investment). It may well be the case that the status quo alternative
represents an unacceptable mission posture: it may cost too much to sustain, be unable
to meet critical capability needs, or be unsupportable due to technological
obsolescence. Nevertheless, the status quo concept, applied over the same time frame
(Life Cycle) as the proposed MAIS program, is used for comparative purposes in the
Economic Analysis. The Economic Analysis development plan should document the
DoD Component Cost Analysis approach and schedule as well.
As part of the Economic Analysis development plan, the program office should propose
the cost element structure that will be used to organize and categorize cost estimates in
the Economic Analysis. The cost element structure provides a hierarchical framework of
defined cost elements that in total comprise the program life-cycle cost. The cost
element structure should include phase-out costs associated with the status quo (legacy
or predecessor) system. These costs would be incurred in managing, preserving, and
maintaining the operations of the status quo system as it runs parallel to the phasing in
of the new system. The status quo phase-out cost elements are not used in the
estimate of the status quo alternative. A sample of a generic cost element structure is
available from DCAPE staff. DCAPE can also provide advice on a consistent approach
to net present value and return on investment computations.
3.6.2.2. Use of the Cost Analysis Requirements Description (CARD) for Major
Automated Information System (MAIS) Programs
As soon as possible after the kick-off meeting, the draft Cost Analysis Requirements
Description (CARD) is provided to DCAPE staff for review. The CARD is used to define
and describe the MAIS program for purposes of preparing both the Economic Analysis
and the DoD Component Cost Analysis. For a MAIS program, the CARD typically would
address the following elements:
    •    Program description;
    •    Program operational concept;
    •    Program data management requirements;
    •    Program quantity requirements;
    •    Program manpower requirements;
    •    Program fielding strategy;
    •    Program milestone schedule; and
    •    Program acquisition plan or strategy.
Procedures for the preparation of the CARD are described in DoD Instruction 5000.02,
Enclosure 7, paragraph 2. Additional guidelines on CARD preparation are found in DoD
5000.04-M, "DoD Cost Analysis Guidance and Procedures," Section 1. However, these
guidelines are for the most part oriented toward weapon systems and may need to be
tailored somewhat for automated information systems. The system description in the
CARD should address both hardware and software elements. The CARD should
describe each major hardware item (computers, servers, etc.), noting those items that
are to be developed, and those items that are off-the-shelf. The CARD also should
describe each software configuration item (including applications as well as support
software) and identify those items that are to be developed. For software items to be
developed, the CARD should provide (1) some type of sizing information suitable for
cost estimating (such as counts of source lines of code, function points, or Reports,
Interfaces, Conversions, and Enhancements-Forms and Workflows (RICE-FW) objects),
and (2) information about the programming language and environment.
In addition, the CARD should describe any special (physical, information, or operations)
system security requirements, if applicable.
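The sizing information called for above feeds the software cost estimate. The following is a deliberately simplified sketch of that connection; the productivity and labor-rate figures are illustrative assumptions only, and real estimates use calibrated parametric models rather than single constants.

```python
# Simplified sketch: software cost estimated from CARD sizing data (SLOC).
# The hours-per-SLOC productivity and the labor rate are assumed values.

def software_cost(new_sloc, hours_per_sloc=0.5, labor_rate=120.0):
    """Estimate cost: effort (hours) from size, then cost from a labor rate."""
    effort_hours = new_sloc * hours_per_sloc
    return effort_hours * labor_rate

# Hypothetical configuration items from a CARD software inventory.
config_items = {"application": 40_000, "support_software": 15_000}

total = sum(software_cost(sloc) for sloc in config_items.values())
print(f"Estimated software development cost: ${total:,.0f}")
```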
Much of the information needed for the CARD is often available in other
program documents. The CARD should stand alone as a readable document, but can
make liberal use of appropriate references to the source documents to minimize
redundancy and effort. In such cases, the CARD should briefly summarize the
information pertinent to the Economic Analysis in the appropriate section of the CARD,
and provide a reference to the source document.
3.6.2.3. Office of Cost Assessment and Program Evaluations CARD Review and
Assessment
To facilitate the DCAPE review and assessment, the DoD Component's Economic
Analysis and Cost Analysis teams should provide written documentation early enough
to permit a timely report to the Overarching Integrated Product Team (OIPT) and
Information Technology Acquisition Board. The timeline for document submission is the
same as the timeline set forth in Section 3.4.3.1 for major defense acquisition programs.
The documentation serves as an audit trail of source data, methods, and results. The
documentation should be easy to read, complete and well organized to allow any
reviewer to understand the estimate fully. The documentation also serves as a valuable
reference for future cost analysts, as the program moves from one acquisition milestone
to the next.
After review of the documentation, DCAPE staff provides feedback to the program office
and DoD Component staff. Subsequently, DCAPE staff prepares a written report
containing the findings of their independent assessment to the Milestone Decision
Authority. Depending on the circumstances, the report may contain recommended cost
and benefits positions, and it may raise funding or schedule issues. The expectation is
that any issues raised have already emerged in prior discussions and that the final
DCAPE report should not contain any surprises.
3.7.2.3. Estimate Costs
3.7.3. Coordination
Section 3.4.3 of this Guidebook primarily focused on procedures associated with life-
cycle cost estimates which are subject to review by the Office of Cost Assessment for
major defense acquisition programs. The estimate is prepared in support of major
milestone or other program reviews held by the Defense Acquisition Board. This section
is intended to be more generally applicable and somewhat more analytic in nature. It
describes a recommended analytic approach for planning, conducting, and
documenting a life-cycle cost estimate for a defense acquisition program (whether or
not the estimate is subject to Office of Cost Assessment review). Much of the discussion
in this section was written with the less experienced cost analyst in mind.
The recommended analytic approach for preparing a life-cycle cost estimate is shown in
Figure 3.7.F1:
 Figure 3.7.F1. A Recommended Analytic Approach for Life-Cycle Cost Estimates
The first step in preparing a credible cost estimate is to begin with the development of a
sound analytic approach. During this planning phase, critical ground rules and
assumptions are established, the scope of the estimate is determined, and the program
to be costed is carefully defined and documented. The program definition includes not
only a technical and physical description of the system (and perhaps major
subsystems), but also a description of the system's program schedule, acquisition
strategy, and operating and support concepts. In some cases, it is necessary to state
explicitly the costs to be included, and the costs to be excluded. For example, when
systems have complex interfaces with other systems or programs (that are outside the
scope of the system being costed), the interfaces should be carefully defined.
For programs that will be reviewed by the Office of Cost Assessment, the program office
is required to define its program in a comprehensive formal written document known as
a Cost Analysis Requirements Description (CARD). The format for this document is
briefly summarized in section 3.4.4.1 of this Guidebook, and is completely described in
DoD 5000.04-M, "DoD Cost Analysis Guidance and Procedures," Section 1. Much of the
necessary information to prepare a written program description can be extracted and
synthesized from common program source documents and contract specifications. The
written program description should stand alone as a readable document, but can make
liberal use of suitable references to the source documents to minimize redundancy and
effort.
It is important that the analytic approach to the cost estimate be documented and
reviewed by all potentially interested parties, before the actual work on preparing the
cost estimate begins. This helps ensure that there are no false starts or
misunderstandings later in the process.
Part of the system definition typically includes the program work breakdown structure
(WBS). The program WBS is a hierarchy of product-oriented elements (hardware, deliverable
software, data, and services) that collectively comprise the system to be developed or
produced. The program WBS relates the elements of work to each other and to the end
product. The program WBS is extended to a contract WBS that defines the logical
relationship between the elements of the program and corresponding elements of the
contract work statement. The WBS provides the framework for program and technical
planning, cost estimating, resource allocation, performance measurement, technical
assessment, and status reporting. In particular, the contract WBS provides the reporting
structure used in contract management reports or reports in the Contractor Cost Data
Reporting system. Further information about the WBS can be found in MIL-STD-881C,
Work Breakdown Structures for Defense Materiel Items, which is available at the
Defense Cost and Resource Center web site.
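The product-oriented WBS hierarchy described above can be sketched as a simple tree in which cost rolls up from child elements to their parents. The element names and values below are hypothetical, loosely patterned on the missile example; they are not taken from MIL-STD-881C.

```python
# Sketch of a WBS as a nested structure with recursive cost roll-up.
# All element names and costs are illustrative assumptions.

def rollup(element):
    """Return an element's total cost: its own cost plus all descendants'."""
    return element.get("cost", 0) + sum(rollup(c)
                                        for c in element.get("children", []))

wbs = {
    "name": "Missile System",
    "children": [
        {"name": "Air Vehicle", "children": [
            {"name": "Airframe", "cost": 120},
            {"name": "Guidance and Control", "cost": 200},
            {"name": "Propulsion", "cost": 80},
        ]},
        {"name": "Systems Engineering/Program Management", "cost": 60},
        {"name": "Initial Spares and Repair Parts", "cost": 40},
    ],
}

print(f"Program total: {rollup(wbs)}")  # sums every leaf element
```

The same tree can anchor cost estimating, resource allocation, and status reporting because each element's total is always the sum of its parts.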
A sample of the WBS for an air-to-air tactical missile is provided in Figure 3.7.1.1.F1
                       Figure 3.7.1.1.F1. Sample Work Breakdown Structure
In most cost estimates, selected WBS elements (usually the high-cost elements) are further
broken down into functional categories. A typical structure for the functional categories
is provided in Figure 3.7.1.2.F1. In the tactical missile example discussed in the last
section, most likely the cost estimate for the Airframe WBS element would be broken
down by functional category, whereas the cost estimate for the Initial Spares and Repair
Parts WBS element most likely would be estimated at the level of total cost, and not by
functional category.
Standard terms and definitions for the various functional categories were developed to
support the Cost and Software Data Reporting system (see section 3.4.4.2). The terms
and definitions used in Figure 3.7.1.2.F1 can be found in the following:
All of these are available at the Defense Cost and Resource Center web site.
                 Figure 3.7.1.2.F1. Functional Categories for Cost Estimating
Another step in developing the analytic approach to the cost estimate is establishing the
cost element structure that will be used as the format for the O&S cost estimate. The
cost element structure describes and defines the specific elements to be included in the
O&S cost estimate in a disciplined hierarchy. Using a formal cost element structure
(prepared and coordinated in advance of the actual estimating) identifies all of the costs
to be considered, and organizes the estimate results. The cost element structure is
used to organize an O&S cost estimate similar to the way that a work breakdown
structure is used to organize a development or procurement cost estimate. The intent is
to capture all costs of operating, maintaining, and supporting a fielded system (and its
associated manpower and facilities). A notional portrayal of these costs, organized into
a cost element structure format, is provided in Figure 3.7.1.3.F1. Note that the use of a
cost element structure provides considerably more detail than simply using budget
appropriation categories (operations and maintenance, military personnel).
          Figure 3.7.1.3.F1. O&S Costs Organized by a Cost Element Structure
A standard cost element structure used by the Office of Cost Assessment was
introduced in section 3.1.3.3. Details can be found in the OSD CAPE O&S Cost-
Estimating Guide. Although each DoD Component (military department or defense
agency) may have its own preferred cost element structure, it is expected that each
DoD Component will have a cross walk or mapping so that any presentation to the
Office of Cost Assessment can be made using the standard structure.
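The "cross walk" idea just described can be sketched as a simple mapping from a Component's own cost element labels onto a standard structure, so that the same totals can be presented in either format. Both category sets below are hypothetical stand-ins, not the actual CAPE or Component element names.

```python
# Sketch of a cost element structure crosswalk. Element names on both
# sides are invented for illustration.

component_estimate = {
    "Crew Pay": 50, "Fuel": 20, "Depot Repair": 30, "Unit Consumables": 10,
}

crosswalk = {  # Component element -> assumed standard-style element
    "Crew Pay": "Unit-Level Manpower",
    "Fuel": "Unit Operations",
    "Unit Consumables": "Unit Operations",
    "Depot Repair": "Maintenance",
}

# Re-bin the Component estimate into the standard structure; totals are
# preserved, only the organization changes.
standard_view = {}
for element, cost in component_estimate.items():
    target = crosswalk[element]
    standard_view[target] = standard_view.get(target, 0) + cost

print(standard_view)
```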
This section describes the typical steps in preparing a life-cycle cost estimate. The
discussion summarizes the steps entailed in selecting estimating techniques or models,
collecting data, estimating costs, and conducting sensitivity or risk analysis.
The suitability of a specific approach will depend to a large degree on the maturity of the
program and the level of detail of the available data. Most cost estimates are
accomplished using a combination of the following estimating techniques:
    •    Engineering Estimate. With this technique, the system being costed is broken
         down into lower-level components (such as parts or assemblies), each of which
         is costed separately for direct labor, direct material, and other costs. Engineering
         estimates for direct labor hours may be based on analyses of engineering
         drawings and contractor or industry-wide standards. Engineering estimates for
         direct material may be based on discrete raw material and purchase part
         requirements. The remaining elements of cost (such as quality control or various
         overhead charges) may be factored from the direct labor and material costs. The
         various discrete cost estimates are aggregated by simple algebraic equations
         (hence the common name "bottoms-up" estimate). The use of engineering
         estimates requires extensive knowledge of a system's (and its components')
characteristics, and a large amount of detailed data.
    •    Actual Costs. With this technique, actual cost experience or trends (from
         prototypes, engineering development models, and/or early production items) are
         used to project estimates of future costs for the same system. These projections
         may be made at various levels of detail, depending on the availability of data.
         Cost estimates that support a full-rate production milestone decision should be
         based on actual cost data to the greatest extent possible. A common mistake is
         to use contract prices as a substitute for actual cost experience. Contract prices
         should not be used to project future costs (even when firm-fixed price) unless it is
         known that the contract prices are associated with profitable ventures, and that it
         is reasonable to assume that similar price experience will be obtained for
         subsequent contracts.
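The engineering ("bottoms-up") technique in the first bullet can be sketched in a few lines: each low-level component is costed for direct labor and material, overhead is factored from those direct costs, and the pieces are summed. All hours, rates, and factors below are illustrative assumptions, not standards.

```python
# Stripped-down sketch of a bottoms-up engineering estimate.
# Overhead is factored from direct costs, per the technique described above.

OVERHEAD_FACTOR = 0.35  # assumed factor applied to direct labor + material

def component_cost(labor_hours, labor_rate, material):
    """Direct labor and material, plus factored overhead."""
    direct = labor_hours * labor_rate + material
    return direct * (1 + OVERHEAD_FACTOR)

components = [  # hypothetical parts list: (name, labor hours, $/hour, material $)
    ("housing",      40, 95.0,  1_200),
    ("circuit card", 25, 110.0, 3_400),
    ("harness",      10, 85.0,    450),
]

total = sum(component_cost(h, r, m) for _, h, r, m in components)
print(f"Bottoms-up estimate: ${total:,.2f}")
```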
It is common practice to employ more than one cost estimating method, so that a
second method can serve as a cross-check on the preferred method.
Analogy estimates are often used as cross-checks, even for estimates of mature
systems based on actual costs.
The next two sections provide two illustrative examples of common cost estimating
techniques.
3.7.2.1.2. Example #2-Analogy
There are many possible sources of data that can be used in cost estimates.
Regardless of the source, the validation of the data (relative to the purpose of its
intended use) always remains the responsibility of the cost analyst. In some cases, the
data will need to be adjusted or normalized. For example, in analogy estimates, the
reference system cost should be adjusted to account for any differences in system
characteristics (technical, physical, complexity, or hardware cost) or operating
environment between the reference system and the proposed system being costed.
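The adjustment just described, scaling a reference system's known cost by factors that reflect differences in characteristics, can be sketched as follows. The reference cost and the factors are hypothetical; in practice analysts derive them from engineering judgment and historical data.

```python
# Sketch of an analogy estimate with multiplicative adjustment factors.
# All values are illustrative assumptions.

def analogy_estimate(reference_cost, adjustment_factors):
    """Apply a multiplicative adjustment for each characteristic difference."""
    estimate = reference_cost
    for factor in adjustment_factors.values():
        estimate *= factor
    return estimate

reference_cost = 10.0  # $M, known cost of the analogous legacy system

adjustments = {        # assumed relative differences, new system vs. reference
    "complexity": 1.20,   # judged 20% more complex
    "weight":     0.90,   # judged 10% lighter
}

print(f"Analogy estimate: ${analogy_estimate(reference_cost, adjustments):.2f}M")
```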
Actual cost experience on past and current acquisition programs often forms the basis
of estimates of future systems. The Cost and Software Data Reporting (CSDR) system
is the primary means within the Department of Defense to systematically collect data on
the development and production costs and other resource usage incurred by
contractors in performing DoD acquisition program contracts associated with major
defense acquisition programs. DoD Instruction 5000.02 makes CSDR reporting
mandatory for all major contracts and subcontracts, regardless of contract type,
valued at more than $50 million (then-year dollars). Program managers use the CSDR system
to report data on contractor development, production, and sustainment costs and
resource usage incurred in performing DoD programs. Further, the Defense Federal
Acquisition Regulation Supplement (DFARS) establishes requirements for CSDR
Reporting to be included in the proposals and contract performance for major
defense acquisition programs (MDAPs) and Major Automated Information Systems (MAIS).
Additional information on cost data reporting is found in section 3.4.4.2. of this
Guidebook.
With the completion of the steps described earlier in this chapter, the actual
computations of the cost estimate can begin. It is important to assess critically the
outputs from the estimating methods and models, drawing conclusions about
reasonableness and validity. Peer review is often helpful at this point. For complex cost
estimates, with many elements provided from different sources, considerable effort and
care are needed to deconflict and synthesize the various elements.
For any system, estimates of future life-cycle costs are subject to varying degrees of
uncertainty. The overall uncertainty is not only due to uncertainty in cost estimating
methods, but also due to uncertainties in program or system definition or in technical
performance. Although these uncertainties cannot be eliminated, it is useful to identify
associated risk issues and to attempt to quantify the degree of uncertainty as much as
possible. This bounding of the cost estimate may be attempted through sensitivity
analyses or through a formal quantitative risk analysis.
Sensitivity analysis attempts to demonstrate how cost estimates would change if one or
more assumptions change. Typically, for the high-cost elements, the analyst identifies
the relevant cost-drivers, and then examines how costs vary with changes in the cost-
driver values. For example, a sensitivity analysis might examine how maintenance
manning varies with different assumptions about system reliability and maintainability
values, or how system manufacturing labor and material costs vary with system weight
growth. In good sensitivity analyses, the cost-drivers are not changed by arbitrary
plus/minus percentages, but rather by a careful assessment of the underlying risks.
Sensitivity analysis is useful for identifying critical estimating assumptions, but has
limited utility in providing a comprehensive sense of overall uncertainty.
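The maintenance-manning example above can be sketched concretely: vary one cost driver (an assumed reliability value) across an assessed range and observe the effect on maintenance labor cost. The model and every number in it are hypothetical.

```python
# Sketch of a one-driver sensitivity analysis on maintenance cost.
# All parameters are illustrative assumptions.

def maintenance_cost(failures_per_year, hours_per_repair=8.0,
                     labor_rate=75.0, fleet_size=100):
    """Annual maintenance labor cost driven by the system's failure rate."""
    return failures_per_year * hours_per_repair * labor_rate * fleet_size

# Assessed range for the reliability driver (not arbitrary +/- percentages):
for failures in (2.0, 3.0, 4.5):  # optimistic / point estimate / pessimistic
    print(f"{failures} failures/yr -> ${maintenance_cost(failures):,.0f}")
```

As the text cautions, the driver values should come from an assessment of the underlying risk, and a single-driver excursion like this identifies critical assumptions without characterizing overall uncertainty.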
Sensitivity and risk analyses also have uses beyond addressing the uncertainty in cost
estimates. They also can be used to help better understand what can go wrong with a
program, and to focus appropriate management attention on risk areas of concern.
The history of DoD weapon system acquisition would indicate that cost growth and
schedule delays can occur as a direct result of one or more of the following concerns:
The documentation should address all aspects of the cost estimate: all ground rules and
assumptions; the description of the system and its operating and support concepts; the
selection of cost estimating methods; data sources; the actual estimate computations;
and the results of any sensitivity or risk analyses. The documentation for the ground
rules and assumptions, and the system description, should be written as an updated
(final) version of the Cost Analysis Requirements Description (CARD) or CARD-like
document described earlier. The documentation for the portion of the cost estimate
dealing with data, methods, and results often is published separately from the CARD or
CARD-like document, but if that is the case, the two documents should be completely
consistent.
3.7.3. Coordination
For independent cost estimates, the team may be smaller and less formal, but the basic
principle of complete and continual coordination of the cost estimate with all interested
parties still applies.
In addition, the Defense Acquisition University offers the following courses in residence:
    •    CLM016 - Cost Estimating
    •    CLB024 - Cost Risk Analysis Introduction
In addition, each year the Cost Assessment Office sponsors a Department of Defense Cost
Analysis Symposium. This symposium includes presentations from government and support
contractor cost analysts concerning best practices and state-of-the-art in cost estimating. The
Symposium also features senior distinguished speakers and panelists from government,
industry, and academia. Further information may be found at the DoD Cost Analysis
Symposium web site.
DEFENSE ACQUISITION GUIDEBOOK
Chapter 4 -- Systems Engineering
4.0. Overview
4.1. Introduction
4.0. Overview
4.0.1. Purpose
4.0.2. Contents
Section 4.2 Systems Engineering Activities in the Life Cycle provides a by-phase
description of key activities and the SE technical reviews and audits.
4.1. Introduction
Systems engineering (SE) establishes the technical framework for delivering materiel
capabilities to the warfighter. SE provides the foundation upon which everything else is
built and supports program success.
SE ensures the effective development and delivery of capability through the
implementation of a balanced approach with respect to cost, schedule, performance,
and risk using integrated, disciplined, and consistent SE activities and processes
regardless of when a program enters the acquisition life cycle. SE also enables the
development of engineered resilient systems that are trusted, assured, and easily
modified (agile).
    •    The "Systems Engineer" refers to the Program Lead Systems Engineer, the
         Chief Engineer or Lead Engineer with SE responsibility, and the SE staff
         responsible for SE processes and who plan, conduct, and/or manage SE
         activities in the program.
    •    The "end user" includes the warfighter and other operational users, including
         support personnel, maintainers, and trainers who use or support the system.
    •    The "developer" refers to the system prime contractor (including associated
         subcontractors) or the Government agency responsible for designing and
         building the system.
Definition of Systems Engineering
The Systems Engineer balances the conflicting design constraints of cost, schedule,
and performance while maintaining an acceptable level of risk. SE solves systems
acquisition problems using a multi-disciplined approach. The Systems Engineer should
possess the skills, instincts, and critical thinking ability to identify and focus efforts on
the activities needed to enhance the overall system effectiveness, suitability,
survivability, and sustainability.
SE activities begin before a program is officially established and are applied throughout
the acquisition life cycle. Any effective SE approach should support and be integrated
with sound program management. Prior to program initiation, the Program Manager, or
Service lead if no Program Manager has been assigned, should perform development
planning to lay the technical foundation for successful acquisition. Development
planning encompasses the engineering analyses and technical planning activities that
provide the foundation for informed investment decisions on which path a materiel
development decision takes. Development planning effectively addresses the capability
gap(s), desired operational attributes, and associated dependencies of the desired
capability. In addition, development planning ensures that there exists a range of
technically feasible solutions generated from across the entire solution space and that
consideration has been given to near-term opportunities to provide a more rapid interim
response to the capability need. Development planning is initiated prior to the Materiel
Development Decision review, continues throughout the Materiel Solution Analysis
phase, and transitions the knowledge (documents, tools, and related data) to the
designated program.
Affordability
The Program Manager controls requirements growth and should use affordability goals
early to guide design trades and program decisions. The Systems Engineer assists in
managing affordability by working closely with the program cost estimator/analyst team
when developing common cost and technical models and aligning baselines. See DAG
Chapter 3 Affordability and Life-Cycle Resource Estimates for more information on
affordability.
Throughout the acquisition life cycle, the Program Manager and Systems Engineer
should monitor the system affordability, seek out cost saving opportunities, and identify
any associated cost, schedule, and performance risks. The Program Manager’s
emphasis prior to Milestone B should be on defining and achieving affordability targets
and desired capabilities. During the Technology Development (TD) phase, the Program
Manager and Systems Engineer work to reduce technical risk and develop a sufficient
understanding of the materiel solution development to validate design approaches and
cost estimates, to refine requirements, and to ensure affordability is designed into the
desired capability. After Milestone B, the emphasis shifts to defining and achieving
should cost estimates.
Should cost focuses on controlling the cost of both current and planned work. To have
an impact, these activities should inform contract negotiations leading up to Engineering
and Manufacturing Development (EMD) and Production and Deployment (P&D) phases.
Should cost management does not mean trading away the long-term value of sound
design practices and disciplined SE activities for short-term gain; it does mean
eliminating non-value-added activities and unessential reports. For guidance on
implementing should cost management, see the
Better Buying Power website.
Program Managers address affordability requirements and begin to apply should cost
management early in the acquisition life cycle. This includes applying SE to define an
affordable system design while also working to eliminate inefficiencies and duplication
where applicable and to drive productivity improvements into their programs.
performance with cost, schedule, risk, and design constraints.
The eight technical management processes are implemented across the acquisition life
cycle and provide insight and control to assist the Program Manager and Systems
Engineer to meet performance, schedule, and cost goals. The eight technical processes
closely align with the acquisition life-cycle phases and include the top-down design
processes and bottom-up realization processes that support transformation of
operational needs into operational capabilities.
The ultimate purpose of the SE processes is to provide a framework that allows the SE
team to efficiently and effectively deliver a capability to satisfy a validated operational
need. To fulfill that purpose, a program implements the SE technical processes in an
integrated and overlapping manner to support the iterative maturation of the system
solution. The level of SE effort required to support these processes declines as a program
progresses into the later phases of the acquisition life cycle. Implementation of the SE
processes begins with the identification of a validated operational need as shown in the
top left corner of the V-diagram (see Figure 4.1.F2). The technical processes enable the
SE team to ensure that the delivered capability accurately reflects the operational needs
of the stakeholders. The key activities that are accomplished by the execution of the
technical processes are described below:
The technical management processes, listed at the bottom of Figure 4.1.F2, provide a
consistent approach to managing the program’s technical activities and controlling
information and events that are critical to the success of the program. Taken together,
these 16 processes are a systematic approach focused on providing operational
capability to the warfighter while reducing technical and programmatic risk.
All organizations performing SE should scale their application and use of the processes
in DAG section 4.3. Systems Engineering Processes to reflect the unique needs of the
program and the type of product or system being developed. This scaling should reflect
the system’s maturity and complexity, size and scope, life-cycle phase, and other
relevant considerations. For example, lower-risk, less-complex programs may scale the
processes to ensure key activities are effective but not overly cumbersome (e.g.,
simpler and less-expensive tools, less-frequent reporting, and activities adjusted to fit
smaller organizations with fewer personnel).
4.1.1. Systems Engineering Policy and Guidance
Policy and guidance related to systems engineering (SE) are intended to minimize the
burden and cost on programs while maintaining technical integrity through the planning
and execution of SE activities across the acquisition life cycle. Program Managers and
Systems Engineers should know and understand the statutory and regulatory SE
mandates. Table 4.1.1.T1 identifies top-level SE-related policy and guidance.
Compliance with mandated DoD SE policy is required for program approval and
completion of successful milestone decisions. DoD policy and guidance provide a
framework for structuring the program and help define the areas available for tailoring to
effectively and efficiently deliver capability to the warfighter.
Within this policy and guidance framework, tailoring the acquisition effort to meet
program cost, schedule, and performance goals is not only desired but mandated in
accordance with DoDD 5000.01. In July 2012, USD(AT&L) emphasized there is no one-
size-fits-all optimal program structure. Every program has its own optimal structure, and
that structure is dependent on many variables that contribute to program success or
failure. Areas that should be considered for tailoring include:
The requirements of DoD SE policy that are identified for tailoring by the Program
Manager are submitted to the Milestone Decision Authority (MDA) for approval.
The structuring of every program should start with a deep understanding of the nature of
the capability intended to be acquired and the effort needed to realize that capability.
Critical thinking during early program formulation is important to clearly identify the
internal and external stakeholders, system interdependencies, technological
opportunities, contractual and budgetary constraints, and policy mandates. The optimal
program structure includes the set of technical activities, events, and management
mechanisms that best address the unique circumstances and risks of the program.
All program strategy and planning documents depend on SE activities to define and
balance requirements against cost, schedule, and risks; identify potential solutions;
assess the maturity and feasibility of available technologies; develop a realistic
schedule; and allow for multiple other considerations affecting the final cost and delivery
of capability to the warfighter. Therefore, the Program Manager should build a program
office structure that ensures the Systems Engineer is an integrated part of the program
planning and execution activities.
The Systems Engineer leads or is a key enabler in the planning and execution of the
program's technical approach. To aid this planning, the Systems Engineer should
proactively seek experience from similar past and current programs and map this
learning as applicable into the SE planning of the program (see also DAG section
4.3.19.4. Lessons Learned, Best Practices, Case Studies).
The purpose of the Systems Engineering Plan (SEP) is to help Program Managers
develop, communicate, and manage the overall systems engineering (SE) approach
that guides all technical activities of the program. The SEP documents key technical
risks, processes, resources, metrics, SE products, and completed and scheduled SE
activities. The SEP is a living document that should be updated as needed to reflect the
program’s evolving SE approach and/or plans and current status. The PDUSD(AT&L)
memorandum, "Program Strategies and Systems Engineering Plan" requires programs
to use the SEP Outline to guide SEP preparation. The SEP Outline identifies the
minimum expected content to be addressed in the SEP. The SEP should be consistent
with and complementary to the Acquisition Program Baseline (APB), Technology
Development Strategy (TDS), Acquisition Strategy (AS), Test and Evaluation Strategy
(TES), Test and Evaluation Master Plan (TEMP), Program Protection Plan (PPP), Life-
Cycle Sustainment Plan (LCSP), and other program plans as appropriate. The SEP
should be written in a common language to clearly communicate what the program
plans to do in each phase of the acquisition life cycle and should be written to avoid
redundancy and maintain consistency with other planning documents.
For Major Defense Acquisition Programs (MDAPs), the Program Manager should
formally charter a SE Working-Level Integrated Product Team (WIPT), led by the
Systems Engineer, to assist in developing and monitoring SE activities as documented
in the program SEP. DoDI 5000.02, Public Law 111-23 (Weapon Systems Acquisition
Reform Act), and DoDI 5134.16 require a formal SEP to be approved by the Deputy
Assistant Secretary of Defense for Systems Engineering (DASD(SE)) for all Acquisition
Category level 1 (ACAT I) and potential ACAT I programs prior to Milestones A, B, and
C and program restructures. The PDUSD(AT&L) memo on "Improving Milestone
Process Effectiveness" requires that a draft formal SEP be available for the pre-
Engineering and Manufacturing Development (pre-EMD) review. For all lower ACAT
programs, the Component Acquisition Executive or delegated authority approves the
SEP. As a best practice, SEP updates should be approved by the Program Executive
Office (PEO) prior to each technical review and when the program changes in a way
that has an impact on the technical strategy. The Program Manager may approve other
periodic updates to the SEP.
The SEP describes the integration of SE activities with other program management and
control efforts, including the Integrated Master Plan (IMP), Work Breakdown Structure
(WBS), Integrated Master Schedule (IMS), Risk Management Plan (RMP), Technical
Performance Measures (TPMs), and other documentation fundamental to successful
program execution. The SEP also describes the program’s technical requirements,
engineering resources and management, and technical activities and products as well
as the planning, timing, conduct, and success criteria of event-driven SE technical
reviews throughout the acquisition life cycle. As a best practice, the Government SEP
should accompany the Request for Proposal (RFP) as guidance to the offerors. The
developer’s Systems Engineering Management Plan (SEMP), which is the contractor-
developed plan for the conduct, management, and control of the integrated engineering
effort, should be consistent with the Government SEP to ensure that Government and
contractor technical plans are aligned. The SEMP should define the contractor's technical
planning and how it is accomplished from the contractor's perspective, and should
articulate the details of the contractor's processes, tools, and organization.
As the program’s blueprint for the conduct, management, and control of all technical
activities, the SEP captures decisions made during the technical planning process and
communicates objectives and guidance to program personnel and other stakeholders.
The SEP should define the "who, what, when, why, and how" of the SE approach, for
example:
A system should not be acquired in isolation from the other systems with which it interacts
in the operational environment. The Program Manager and Systems Engineer should
understand how their system fills the needs for which it was designed and the enterprise
context within which it operates. This includes understanding the diverse or dissimilar
mix of other systems (hardware, software, and human) with which the system needs to
exchange information. To that end, the Program Manager and Systems Engineer
should define intersystem interfaces using a systems engineering (SE) document, i.e.,
the interface control document(s). In addition to interface control documents, the
Program Manager and Systems Engineer should also actively pursue Memoranda of
Agreement or Memoranda of Understanding (MOA/MOU) with companion programs
regarding interfaces, data exchanges, and advance notice of changes to
interdependencies and schedule (timing) that may affect either program. These
agreements are a professional courtesy and a means of mitigating the inherent risk in
planning to deliver a capability to an anticipated future technical baseline when there is
uncertainty that the other programs are able to maintain schedule and have adequate
resources to deploy the capabilities as planned.
Each DoD Service and Agency, and the Department itself, are examples of enterprises
as systems. Such organizations have the challenge of integrating and evolving multiple
portfolios of systems often with conflicting sets of objectives, constraints, stakeholders,
and demands for resources.
The Systems Engineer should be cognizant of the enterprise context and constraints for
the system in development and should factor these enterprise considerations into
acquisition technical decisions from the outset. Mission areas, for example, can be
viewed as cross-organizational enterprises and also provide critical context for system
acquisition. Controlled interfaces with enabling systems in the SoS architecture drive
system design. In some cases, enterprise considerations have been articulated as
standards and certification requirements. In other cases, system decisions need to be
made in the context of the larger Service portfolio of systems and mission area needs.
Most DoD capabilities today are provided by an aggregation of systems often referred to
as systems of systems (SoS). A SoS is described as a set or arrangement of systems
that results when independent and useful systems are integrated into a larger system
that delivers unique capabilities. For complex SoS, the interdependencies that exist or
are developed between and/or among the individual systems being integrated are
significant and need to be tracked. Each SoS may consist of varying
technologies that matured decades apart, designed for different purposes but now used
to meet new objectives that may not have been defined at the time the systems were
fielded.
Both individual systems and SoS conform to the accepted definition of a system in that
each consists of parts, relationships, and a whole that is greater than the sum of its
parts; however, not all systems are SoS. There are distinct differences between
systems and SoS that should be taken into account in the application of SE to SoS (see
Table 4.1.3.T1, adapted from DoD Systems Engineering Guide for Systems of Systems,
page 11).
Application of Systems Engineering to Systems of Systems
Systems of systems (SoS) systems engineering (SE) deals with planning, analyzing,
organizing, and integrating the capabilities of new and existing systems into a SoS
capability greater than the sum of the capabilities of its constituent parts. Consistent
with the DoD transformation vision and enabling net-centric operations, SoS may deliver
capabilities by combining multiple collaborative and independent-yet-interacting
systems. The mix of systems may include existing, partially developed, and yet-to-be-
designed independent systems.
The DoD Guide to Systems Engineering for Systems of Systems addresses the
application of SE to SoS. The guide defines four types of SoS (see Table 4.1.3.T2).
When a SoS is recognized as a "directed," "acknowledged," or "collaborative" SoS, SE
is applied across the constituent systems and is tailored to the characteristics and
context of the SoS. Due to increased efforts to network systems to facilitate information
sharing across the battlespace, most DoD systems also may be viewed as components
of a "virtual" SoS. For virtual SoS, DoD net-centric policies and strategies, such as the
Department of Defense Net-Centric Services Strategy, provide SE guidance regarding
SoS contexts where there is an absence of explicit shared objectives or central
management.
    •    Directed: Directed SoS are those in which the SoS is engineered and managed to
         fulfill specific purposes. It is centrally managed during long-term operation to
         continue to fulfill those purposes as well as any new ones the system owners
         might wish to address. The component systems maintain an ability to operate
         independently, but their normal operational mode is subordinated to the
         centrally managed purpose.
    •    Acknowledged: Acknowledged SoS have recognized objectives, a designated
         manager, and resources for the SoS; however, the constituent systems retain their
         independent ownership, objectives, funding, development, and sustainment
         approaches. Changes in the systems are based on cooperative agreements
         between the SoS and the system.
    •    Collaborative: In collaborative SoS, the component systems interact more or less
         voluntarily to fulfill agreed-upon central purposes.
    •    Virtual: Virtual SoS lack a central management authority and a centrally agreed-
         upon purpose for the system of systems. Large-scale behavior emerges, and may
         be desirable, but this type of SoS relies upon relatively invisible, self-organizing
         mechanisms to maintain it.
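The four SoS types differ mainly in whether a central management authority exists, whether there is a centrally agreed-upon purpose, and whether constituent systems retain independent ownership. As an illustration only, a minimal Python sketch of that taxonomy (the attribute names and decision rule are this example's simplification, not the guide's):

```python
from dataclasses import dataclass

@dataclass
class SoSContext:
    central_management: bool        # designated SoS manager/authority?
    agreed_purpose: bool            # centrally agreed-upon SoS objectives?
    constituents_independent: bool  # systems keep own ownership and funding?

def classify(ctx: SoSContext) -> str:
    """Map a SoS management context onto the four DAG types."""
    if not ctx.central_management and not ctx.agreed_purpose:
        return "Virtual"        # emergent, self-organizing behavior
    if ctx.central_management and not ctx.constituents_independent:
        return "Directed"       # operation subordinated to central purpose
    if ctx.central_management and ctx.constituents_independent:
        return "Acknowledged"   # cooperative agreements with constituents
    return "Collaborative"      # voluntary interaction toward agreed purposes

print(classify(SoSContext(True, True, True)))   # Acknowledged
```

The sketch makes the key distinction concrete: an acknowledged SoS has a designated manager yet its constituents remain independently owned and funded, which is why changes must be negotiated through cooperative agreements.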
4.1.3.1. Software
Software is a key enabler for almost every system, making possible the achievement
and sustainment of advanced warfighting capabilities. Development and sustainment of
software is frequently the major portion of the total system life-cycle cost, and factors
such as safety, security, reliability, interoperability, and insertion of new technology are
considered at every decision point in the acquisition life cycle.
Establish the software acquisition strategy as early as possible to address function and
component allocation to software and determine what is to be developed, what is
provided as Government off-the-shelf (GOTS) software, commercial-off-the-shelf
(COTS) software, or open source software (OSS), and what is a mix or hybrid. The
strategy also incorporates plans for associated data and intellectual property rights for
GOTS, COTS, and OSS.
M&S activities are most valuable early in program planning as decision support tools
and may be used iteratively to assess evolving functional architectures. The cost of
M&S is allocated during initial program planning; the cost basis is the rationale that
balances M&S cost against the degree of risk reduction needed. M&S used by a
Program Manager to make decisions should be verified and validated for its intended
use before the assessment is needed. Data used by M&S to support
assessments should have a known pedigree and should be adequate to the level of
assessment. See DAG section 4.3.19.1. Modeling and Simulation for more information.
interfaces in one component to reduce risk of vulnerability, aggregating functions having
higher mutual interaction, determining components that have well-defined interfaces to
other components as candidates for technology insertion or OSS in support of OSA,
and allocation of functionality to maximize use of COTS given acceptable risk.
When employing COTS software, criteria for selecting among competitive alternatives
may not include details of commercial design or performance but should require ample
evidence that the software is adequate for its intended use. Code-scanning tools should
be used to help ensure that COTS software does not pose a security or software
assurance risk. (See DAG Chapter 7 Acquiring Information Technology, Including
National Security Systems and NIST-SP-800 series publications for additional
information.) In addition, mitigation of the security and information assurance risks
associated with COTS software goes beyond code-scanning techniques.
Those risk mitigation efforts should be expanded to make use of activities identified in
DAG section 4.3.18.24. System Security Engineering, as well as the activities discussed
in DAG Chapter 13 Program Protection.
As a best practice, the Systems Engineer for a software-intensive system, defined as "a
system in which software represents the largest segment in one or more of the following
criteria: system development cost, system development risk, system functionality, or
development time" (DAU Glossary of Acronyms and Terms), should be well versed in
the technical and management activities of computer programming, software project
planning, and software configuration management, including defining computer software
configuration items. The SE approach should include software engineers early in the
acquisition life cycle to ensure software considerations are included in defining and
allocating software-related requirements and generating cost and schedule estimates,
especially for software-intensive systems. Software engineers are also needed to
evaluate the developer's software architecture; the functional, allocated, and
product baselines; and documents, plans, and estimates, including M&S capabilities
and facilities. Program-independent software engineers should support validation activities.
For software-related acquisitions the Systems Engineering Plan (SEP) should consider
the following, as a minimum, for software SE planning:
    •    Software-unique risks
    •    Inclusion of software in technical reviews, with the addition of the Software
         Specification Review (SSR) as a precursor to the Preliminary Design Review
         (PDR) for software-intensive systems
    •    Software organization, integrated product teams, and relationships to
         interdependent organizations
    •    Software technical performance, process, progress, and quality metrics (see
         DAG section 4.3.4. Technical Assessment Process)
    •    Software safety, security, protection, and similar requirements, including
         processes, architecture, and interfacing systems
    •    Configuration management, verification, and validation of software integration
         labs/facilities used as tools for software development
    •    Open systems architecture, associated data rights, and sustainment
         considerations
    •    Automated test plans, development tools, and pedigreed data to support
         modeling of requirements, design, and environmental interfaces
    •    Software problem reporting and assessment, code development, build
         generation, and regression testing
    •    Software independent verification and validation (IV&V) to be accomplished,
         especially as it relates to contractor proprietary software
    •    Versioning, data control, and testing, especially for GOTS
    •    Verification of documentation, configuration management, test relevancy, and
         other considerations for legacy versus new software
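The metrics consideration in the list above can be grounded with a small sketch. The formulas below (defect density per KSLOC and requirements volatility) are common industry formulations, not DAG-mandated definitions, and the input values are invented for illustration.

```python
# Common software metrics formulations (not DAG-mandated definitions);
# the input values below are invented for illustration.
def defect_density(defects_found, ksloc):
    """Defects per thousand source lines of code (KSLOC)."""
    return defects_found / ksloc

def requirements_volatility(added, changed, deleted, baseline_total):
    """Fraction of the baselined requirements set that churned."""
    return (added + changed + deleted) / baseline_total

density = defect_density(defects_found=42, ksloc=120)
volatility = requirements_volatility(added=5, changed=12, deleted=3,
                                     baseline_total=400)
print(f"defect density: {density:.2f}/KSLOC, volatility: {volatility:.1%}")
```

Tracking such values build over build gives the Systems Engineer the trend data that the Technical Assessment Process (DAG section 4.3.4) calls for.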
Each of the Services provides additional guidance to assist the Program Manager,
Systems Engineer, and Software Engineers on software-intensive systems:
    •    The Department of the Army provides software metrics recommendations within
         DA-PAM-70-3, Army Acquisition Procedures and DA-PAM-73-1, Test and
         Evaluation in Support of Systems Acquisition.
Software considerations occur and vary throughout the acquisition life cycle, with
specific activities associated with each acquisition phase described in Table 4.1.3.1.T1.
Phase: Operations and Support

Software Considerations: The In-Service Review (ISR) assesses user acceptance and potential upgrades on delivered software systems. A block change or follow-on incremental development may be defined that delivers maintenance, safety, or urgent builds and upgrades to the field in a controlled manner. Procedures for updating and maintaining software on fielded systems can require operators to download new builds or to install them from physical media, and may require more training. Procedures should be in place to support effective configuration management and control. There are inherent risks involved in modifying software on fielded systems upon which warfighters depend while engaged in frontline activities. Another aspect of the hardware-software interaction is that maliciously altered devices or inserted software can infect the supply chain, creating unexpected changes to systems. Vigilance is needed as part of supply chain risk management (see DAG Chapter 5 Life-Cycle Logistics and Chapter 13 Program Protection). Upon completion of development, the problem report tracking system can be used with other factors as legacy information to inform system and component upgrades. Software problem reporting continues during the Operations and Support phase.
Software Development Plan (SDP): The SDP as a best practice provides details
below the level of the Systems Engineering Plan (SEP) and the contractor’s Systems
Engineering Management Plan (SEMP) for managing software development and
integration. The SDP Data Item Description (DID) DI-IPSC-81427A is a tailorable
template and a useful starting point in defining a software development plan. The SDP
provides the Systems Engineer with insight into, and a tool for monitoring, the
processes being followed by the developer for each activity, the project schedules, the
developer’s software organization, and resource allocations.
Data Protection and Software Assurance: These factors are defined as the level of
confidence that software functions as intended and is free of vulnerabilities, either
intentionally or unintentionally designed or inserted as part of the software code,
throughout the acquisition life cycle. The Program Manager is responsible for protecting
system data and software, whether the data are stored and managed by the program
office or by the developer (see DAG Chapter 13 Program Protection).
Software Data Management and Technical Data Rights: Rights associated with
commercial products can be highly restrictive and are defined in licenses that may
restrict the number of copies made and ability to alter the product. Often there is no
assurance of suitability for intended purposes and no recourse to the vendor. Open
source, sometimes referred to as "freeware," may not be free and may also have
restrictions or carry embedded modules that are more restrictive than the overall
package. The Program Manager, Systems Engineer, software engineer, and contracting
officer should be familiar with the restrictions placed on each software item used in the
contract or deliverable to the Government. The Program Office should determine the
necessary intellectual property rights to computer software in advance of the RFP
and contract award and should ensure those rights are acquired as needed.
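One hedged way to operationalize this familiarity with per-item restrictions is a simple software-license inventory that the program office can query before contract award. The item names, license categories, and copies-allowed policy below are hypothetical.

```python
# Hypothetical software-license inventory; item names, categories, and the
# copies-allowed policy are invented for the example.
INVENTORY = [
    {"item": "math-lib", "license": "commercial", "copies_allowed": 5},
    {"item": "log-util", "license": "open-source", "copies_allowed": None},
]

def restricted_items(inventory, copies_needed):
    """Flag items whose license cannot cover the copies the program needs."""
    flagged = []
    for entry in inventory:
        limit = entry["copies_allowed"]
        if limit is not None and copies_needed > limit:
            flagged.append(entry["item"])
    return flagged

print(restricted_items(INVENTORY, copies_needed=10))  # ['math-lib']
```

A real inventory would also record modification rights and embedded-module terms, since, as noted above, an open source package can carry modules more restrictive than the overall package.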
Software Reuse: The reuse of any system, hardware, firmware, or software should be
addressed in multiple plans and processes throughout the acquisition life cycle,
including the SEP, SDP, firmware development plan, configuration management plan,
test plans (Test and Evaluation Master Plan, Software Test Plan, Independent
Verification and Validation Plan), and quality assurance plans (system and software).
(Note: Software reuse has traditionally been overestimated in the beginning of
programs, and software reuse has often proven to be more costly than new software
development. Software reuse plans should be monitored as a potential risk.) For more
discussion of the reuse of software, see DAG section 4.3.18.15. Open Systems
Architecture.
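The caution that reuse is often overestimated can be illustrated with a toy adaptation-cost model, loosely in the spirit of published reuse models; the coefficients are assumptions for illustration, not an approved estimating method.

```python
# Toy reuse-cost model; the 0.3 "understanding overhead" coefficient is an
# assumption for illustration, not an approved estimating parameter.
def adapted_cost(new_dev_cost, fraction_modified, understanding_overhead=0.3):
    """Cost to reuse: overhead to understand the code plus the modified
    fraction priced at new-development rates."""
    return new_dev_cost * (understanding_overhead + fraction_modified)

new_cost = 100.0  # cost units to build the component from scratch
# A "90% reuse" plan still pays roughly 40% of new-development cost here.
print(round(adapted_cost(new_cost, fraction_modified=0.10), 1))  # 40.0
```

The point of the sketch is the shape of the curve, not the numbers: even small modification fractions carry a fixed cost of understanding the inherited code, which is why reuse plans should be monitored as a potential risk.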
Government and Industry Teaming: The Government needs to team with industry as a
partner to acquire software-reliant systems successfully. Through the teaming
agreement, the Government can draw on the experience and expertise of its industry
partner. Extensive teaming with industry makes it incumbent on the Government to
maintain current and applicable software engineering
expertise.
Software Safety: Software safety applies to most DoD systems because of the
ubiquitous nature of software-driven functions, network connectivity, and systems
of systems (SoS). Specific mandatory certifications such as airworthiness certification
require attention early in the development cycle to ensure adequate documentation and
testing are planned and executed to meet certification criteria. Systems Engineers are
encouraged to check with certification authorities frequently because rules can change
during development.
Organizing and staffing the systems engineering (SE) organization and providing
supporting resources and tools are critical tasks that merit attention from both the
Program Manager and Systems Engineer because these tasks influence the effective
implementation and control of the SE approach. The Program Manager is responsible
for developing a tailored strategy that enables a cost-effective program to deliver a
required capability within the needed delivery time. However, any program tailoring
should be based on SE assessments of maturity and risk in order to determine the
appropriate entry point into the acquisition life cycle and to identify opportunities to
streamline the acquisition strategy. Therefore, the Program Manager should create a
program office structure ensuring the Systems Engineer is an integrated part of the
program planning and execution activities.
Building an integrated SE team with the expertise and knowledge to implement and
execute an effective program is a key to success. The structure and size of the SE
organization should reflect both the risk and complexity of the system under
development and its life-cycle phase. The Systems Engineering Plan (SEP) describes
the SE organizations of both the Government program office and, when available, the
developer organization.
To provide the required capabilities in the most efficient and effective manner, the
Program Manager should ensure completion of the following activities that affect the
technical approach:
         Strategy (AS)), program plans (e.g., SEP, Program Protection Plan (PPP), Test
         and Evaluation Master Plan (TEMP), and Life-Cycle Sustainment Plan (LCSP)),
         and cost and budget documents
    •    Establishing program office organization (roles, responsibilities, authorities, and
         accountabilities) and staffing the program office and Government technical team
         with qualified (trained and experienced) Systems Engineers and other relevant
         technical professionals
    •    Integrating all aspects of the program office, including business processes
         relating to program management, SE, test, and program control
    •    Ensuring all necessary memoranda of understanding and agreement
         (MOU/MOAs) are in place and sufficiently detailed
    •    Resourcing the managers of all functional areas such as administration,
         engineering, logistics, test, etc.
    •    Managing program risks by developing, resourcing, and implementing realistic
         mitigation strategies
    •    Approving the configuration management plan and ensuring adequate resources
         are allocated for implementing configuration management throughout the life
         cycle
    •    Reviewing/approving Engineering Change Proposal (ECP) requests and
         determining the path forward required by any baseline changes
    •    Ensuring contracting activities are coordinated with the program systems
         engineering team
The Systems Engineer is responsible for planning and overseeing all technical activity
within the program office and for managing effective SE processes. The Systems
Engineer should ensure the Program Manager has sufficient and clear information for
scheduling and resource-allocation decisions. In addition, the Systems Engineer
implements and controls the technical effort by:
    •    Working closely with developer’s SE teams to ensure integrated and effective
         processes
    •    Planning and executing the formal technical reviews and audits
    •    Tracking and reporting baseline changes and recommending a path forward, as
         a part of configuration management
    •    Supporting the Program Manager in configuration management activities
    •    Identifying and mitigating the program’s technical risks which include
            o Integration risks
            o Engineering risks
            o Critical technology risks assessed in the Technology Readiness
                 Assessment (TRA)
    •    Measuring and tracking program maturity using technical performance measures,
         requirements stability, and integrated schedules
    •    Updating the PPP
    •    Staffing the engineering team with qualified and appropriate engineers
    •    Supporting updates to the TEMP and LCSP
    •    Supporting test and evaluation activities as documented in the TEMP (see Chief
         Developmental Tester responsibilities in DAG Chapter 9 Test and Evaluation)
    •    Reviewing requirements traceability matrix and cross reference matrix
         (verification)
    •    Managing root cause and corrective action (RCCA) efforts along with supporting
         the risk boards
    •    Ensuring selection of qualified vendors for parts, materiel, and processes (for
         hardware and software)
    •    Reviewing deliverables on the contract to ensure compliance and utility, and to
         ensure appropriate format and content
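The traceability-review responsibility in the list above can be sketched as a minimal coverage check over a requirements traceability matrix; the requirement and test identifiers are hypothetical.

```python
# Hypothetical requirements traceability matrix: requirement -> verifying tests.
rtm = {
    "SYS-001": ["TST-010", "TST-011"],
    "SYS-002": ["TST-020"],
    "SYS-003": [],  # no verification method identified yet
}

def untraced(rtm):
    """Requirements with no verification linkage -- candidate review findings."""
    return sorted(req for req, tests in rtm.items() if not tests)

print(untraced(rtm))  # ['SYS-003']
```

In practice the matrix also traces downward to design elements and upward to source capability documents, but even this minimal check surfaces requirements that would otherwise reach test with no verification path.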
The Program Manager and Systems Engineer focus on the transformation of required
operational and sustainment needs into a system design capability. As the design
solution evolves through the application of the eight technical processes, the verification
component or test organization provides confidence that the design solution that
evolved from the requirements analysis, functional allocation, and design synthesis
properly addresses the desired capabilities. The Test Engineer, working in tandem with
the Systems Engineer, accomplishes the verification loop of the SE process. Together
the Systems Engineer and Test Engineer generate and analyze data from the
integrated tests. The developer uses the test results to improve system performance,
the SE team uses the test results for risk assessments, and the acquisition community
and operational evaluators use the test results for operational assessments of the
evolving system. This test and evaluation strategy should be consistent with and
complementary to the SEP. The Program Manager and the Systems Engineer work
closely with the Test Engineer to facilitate coordinated verification and validation
activities.
Stakeholders
The Program Manager has the critical role of approving a systems engineering (SE)
approach that includes all stakeholders. The Systems Engineer coordinates with all
participants to translate the operational needs and capabilities into technically feasible,
affordable, testable, measurable, sustainable, achievable (within scheduled need
dates), and operationally effective and suitable system requirements. The Systems
Engineer is responsible for planning and overseeing all technical activity within the
program office and for managing stakeholder expectations. Early and frequent
involvement with stakeholders by both the Program Manager and the Systems Engineer
facilitates the successful execution of SE activities throughout the acquisition life cycle.
Most program personnel are involved in one or more of the 16 SE processes. Personnel
from non-SE organizations or from outside the program office (e.g., end users,
requirements sponsors, maintainers, testers, planners) should be integrated within the
program’s technical management activities so they have the ability to actively participate
throughout the life cycle in support of SE-related activities.
The following is a partial list of the stakeholders that contribute to and benefit from SE
activities and processes:
    •    Companion programs
IPTs provide both the Government and developer stakeholders with the opportunity to
maintain continuous engagement. This continuous engagement is necessary to ensure
a common understanding of program goals, objectives, and activities. These
Government/developer IPTs should further maintain effective communication as they
manage and execute those activities and trade-off decisions. The program’s SE
processes should include all stakeholders in order to ensure the success of program
efforts throughout the acquisition life cycle.
For Major Defense Acquisition Programs, the Program Manager ensures that the
program office is structured to interface with the SE Working-Level Integrated Product
Team (SE WIPT) (a multidisciplinary team responsible for the planning and execution of
SE) to address DoD leadership concerns and interests. The SE WIPT is chartered by
the Program Manager and is usually chaired by the Systems Engineer. The SE WIPT
includes representatives from OUSD(AT&L) and the component acquisition executive’s
organization, both Government and developer IPT leads from the program, the Program
Executive Office Systems Engineer, the SoS Systems Engineer, and the developer
Systems Engineer. A generic SE WIPT charter is available on the ODASD(SE) Policy
and Guidance website under "Guidance and Tools."
4.1.5. Certifications
of the system. Certain specific certifications are required before additional design,
integration, network access, or testing can take place. For example, airworthiness
certifications need to be in place before an aircraft can begin flight testing.
Programs often plan insufficiently for the number of required certifications, and
that insufficient planning can negatively affect program cost and schedule.
Obtaining the various certifications can be a lengthy process. As a result, the Program
Manager should ensure that the time necessary to obtain any required certification is
factored into technical planning. By planning for the activities required to achieve the
necessary certifications, the Program Manager and Systems Engineer can ensure that
development of the system continues uninterrupted while the program meets all system
certification requirements. Early planning allows the Systems Engineer and technical
team to begin interacting with certification authorities, which sets the foundation for
communication throughout the development of the system.
The SEP Outline requires programs to provide a certification matrix that identifies
applicable technical certifications and when they are required during the acquisition life
cycle. Programs should include certification activities and events in the Integrated
Master Schedule (IMS) and the Integrated Master Plan (IMP).
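A certification matrix of the kind the SEP Outline requires can be represented minimally as a mapping from each certification to the phase by which it must be complete; the certification names and phase labels below are illustrative, not a prescribed set.

```python
# Illustrative certification matrix; the phase labels and certification
# names are examples, not a prescribed set.
PHASES = ["MSA", "TD", "EMD", "P&D", "O&S"]  # acquisition phases, in order

cert_matrix = {
    "Airworthiness": "EMD",
    "Interoperability": "P&D",
    "Safety release": "EMD",
}

def due_by(cert_matrix, phase):
    """Certifications that must be complete at or before the given phase."""
    cutoff = PHASES.index(phase)
    return sorted(c for c, p in cert_matrix.items() if PHASES.index(p) <= cutoff)

print(due_by(cert_matrix, "EMD"))  # ['Airworthiness', 'Safety release']
```

Each entry in such a matrix would then appear as an event in the IMS, with the certification authority's lead time backed out from the phase deadline.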
The Systems Engineer should actively participate in developing program contract tasks
to ensure that the appropriate technical activities are contained and properly scoped in
the final contract. Proper scoping of the technical tasks in the Statement of Work
(SOW), Statement of Objectives (SOO), or Performance Work Statement (PWS) is
necessary to ensure that the final system meets end user’s needs. Often contracting
activities may appear to be primarily programmatic in nature (e.g., acquisition strategy
development, writing requests for proposal, performing market research, developing the
Contract Data Requirements List (CDRL)) but, in fact, they reflect technical planning
and should be influenced by the desired technical content. For example, technical
understanding of data rights can be a key element in planning for modularity and open
systems design, or the decision to choose an incremental acquisition strategy depends
on generic functionality groupings that may not be appropriate for every system.
If that is the case, the developer may have a strong incentive to call the review complete
as soon as possible. The Systems Engineer and Program Manager exercise best
judgment in an objective and informed manner to ensure the reviews are not
prematurely declared completed in order for the developer to qualify for the contract
incentive. Another area to which incentives are tied is the Earned Value Management
System (EVMS). The Program Manager should ensure that the EVMS tied to any
incentive measures the quality and technical maturity of technical work products instead
of just the quantity of work. If contracts include earned value (EV) incentives, the criteria
should be stated clearly, should be based on technical performance, and should be
linked quantitatively with it.
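The quantitative linkage called for here rests on the standard earned value quantities; the sketch below applies the conventional definitions (CPI = BCWP/ACWP, SPI = BCWP/BCWS) to invented budget figures.

```python
# Conventional earned value indices; the budget figures are invented.
def ev_indices(bcws, bcwp, acwp):
    """Cost Performance Index and Schedule Performance Index."""
    cpi = bcwp / acwp  # > 1 means work is costing less than budgeted
    spi = bcwp / bcws  # > 1 means work is ahead of schedule
    return cpi, spi

cpi, spi = ev_indices(bcws=100.0, bcwp=90.0, acwp=120.0)
print(f"CPI={cpi:.2f}, SPI={spi:.2f}")  # CPI=0.75, SPI=0.90
```

The Program Manager's concern in the text is that BCWP credited for an incentive reflect technically mature work products; the arithmetic is simple, so the quality of the index depends entirely on how earned value is taken.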
The program office uses a Request for Information (RFI) to communicate expectations
and plans, including the expected business rhythm for contract execution. This
communication ensures the offerors have an opportunity to provide a tight linkage
across the Integrated Master Plan (IMP), Work Breakdown Structure (WBS), Integrated
Master Schedule (IMS), risk management, and cost in their proposals. Early industry
engagement opportunities include pre-solicitation notices, industry days, and other
market research venues.
Before releasing the RFP, the program office needs to allow enough time to develop
and mature the performance and functional specifications that need to be included in
the RFP. The RFP and supporting technical documentation clearly define the
Government’s expectations in terms of the performance and functional specifications,
program planning, program process, risks, and assumptions. The RFP also should
direct potential offerors to integrate their approach to reflect the Government’s
expectations.
Although there are many opportunities for contract-related interactions between the
Government and potential offerors prior to contract award, the RFP remains the primary
tool for shaping the contract, the program, and ultimately the system. See the "Guide for
Integrating Systems Engineering into DoD Acquisition Contracts, Version 1.0, 2006" for
additional guidance on the content and format of RFPs.
Within the RFP development team, the Systems Engineer should be responsible for the
technical aspects of the RFP and should perform the following actions:
    •    Lead or support the technical evaluation during source selection, to include
         providing inputs to the development of source selection criteria
    •    Perform schedule risk assessments as part of the source selection evaluation
         process
    •    Support the Independent Management Review (Peer Review) of the RFP before
         release
    •    Identify external or SoS interfaces and ensure the technical interface requirement
         and task scope are unambiguous to the offerors
Table 4.1.6.T1 contains the typical technical contents of the RFP and the associated
Systems Engineer’s responsibilities, and should not be considered an exhaustive or
mandatory list.
Section K (Representations, Certifications, and Other Statements)

Typical Technical Contents:
    •    Data rights

SE Responsibilities:
    •    Identify provisions that require representations, certifications, or the submission of other information by offerors
    •    Consider including a provision requiring offerors to identify any technical data or computer software the offeror proposes to deliver to the Government after award with less than unlimited rights
Section M (Source Selection Evaluation Factors)

Typical Technical Contents:
    •    Technical: technical solution, supporting data, performance specification
    •    Management: SOW, Contractor Systems Engineering Management Plan (SEMP), IMS, risk plans
    •    Environmental objectives (when appropriate)
    •    Quality or product assurance
    •    Past performance
    •    Price or cost to the Government
    •    Extent offeror's rights in the data rights attachment meet Government's needs

SE Responsibilities:
    •    Define technical evaluation factors and provide SE-specific evaluation criteria used to assess proposals
    •    Participate on or lead the technical evaluation team
    •    Provide technical personnel to participate on each evaluation factor team (e.g., management, past performance, cost)
    •    Provide consistency across the SOW and system specifications
    •    Evaluate RFP responses against technical requirements, threshold requirements, management (e.g., SEMP, WBS, and program schedule), and consistency across the proposal (e.g., link between WBS, program schedule, risks, and cost)
    •    Identify and assess the technical risks for each proposal, including schedule risks and related risk mitigation plans
    •    DAG sections 4.2.1 - 4.2.7 provide introductory material and describe the
         Systems Engineer’s role in each phase of the weapon system acquisition life
         cycle. The notional technical reviews and audits in each phase are identified, but
         the details are left for the second major area of 4.2.
    •    DAG section 4.2.8 provides an overview of technical reviews and audits, followed
         by DAG sections 4.2.9 - 4.2.17 which address each specific technical review and
         audit. This arrangement accommodates the planning and conducting of the
         technical reviews and audits in accordance with a program’s specific needs.
         Some large and complex programs may require each technical review and audit;
         others may combine technical reviews and audits or get permission to tailor them
         out.
4.2.1. Life-Cycle Expectations
Systems engineering (SE) provides the technical foundation for all acquisition activities
regardless of acquisition category (ACAT) or acquisition model (e.g., weapon system or
information system). The SE framework described in this chapter spans the entire
acquisition life cycle and is based on DoDD 5000.01 and DoDI 5000.02. Framework
content should be tailored and structured to fit the technology maturity, risks,
interdependencies, related characteristics, and context for the program or the system of
interest. The succeeding sections identify the SE activities, processes, inputs, outputs,
and expectations during each acquisition phase and for each technical review and audit.
Acquisition milestones and SE technical reviews and audits serve as key points
throughout the life cycle to evaluate significant achievements and assess technical
maturity and risk. Table 4.2.1.T1 identifies the objectives of each SE assessment and
the technical maturity point marked by each review. The Materiel Development Decision
(MDD) review is the formal entry point into the acquisition process and is mandatory for
all programs in accordance with DoDI 5000.02. Depending on the maturity of the
preferred materiel solution, the Milestone Decision Authority (MDA) designates the initial
review milestone. This would normally be the MDD, but it can be A, B, or C. In any case
the decision is documented in the Acquisition Decision Memorandum (ADM) published
immediately after an MDD event. Since the review milestone is chosen consistent with
the maturity of the preferred materiel solution, entry at any milestone requires evidence
of the associated solution maturity as summarized in Table 4.2.1.T1 Technical Maturity
Points.
Department experience (e.g., GAO Report 12-400SP) has found that successful
programs use knowledge-based product development practices which include steps to
gather knowledge to confirm the program’s technologies are mature, their designs are
stable, and their production processes are in control. Successful product developers
ensure a high level of knowledge is achieved at key junctures in development. Table
4.2.1.T1 summarizes the concept of technical maturity points.
Table 4.2.1.T1. Technical Maturity Points

Materiel Development Decision (MDD)
     • Objective: Decision to assess potential materiel solutions and the appropriate phase for entry into the acquisition life cycle.
     • Technical Maturity Point: Capability gap met by acquiring a materiel solution.
     • Additional Information: Technically feasible solutions have the potential to effectively address a validated capability need. Technical risks understood.

Alternative Systems Review (ASR)
     • Objective: Recommendation that the preferred materiel solution can affordably meet user needs with acceptable risk.
     • Technical Maturity Point: System parameters defined; balanced with cost, schedule, and risk.
     • Additional Information: Initial system performance established and plan for further analyses supports Milestone A criteria.

Milestone A
     • Objective: Decision to invest in technology maturation and preliminary design.
     • Technical Maturity Point: Affordable solution found for identified need with acceptable technology risk, scope, and complexity.
     • Additional Information: Affordability targets identified, and technology development plans, time, funding, and other resources match customer needs. Prototyping and end-item development strategy for Technology Development (TD) phase focused on key technical risk areas.

System Requirements Review (SRR)
     • Objective: Recommendation to proceed into development with acceptable risk.
     • Technical Maturity Point: Level of understanding of top-level system requirements is adequate to support further requirements analysis and design activities.
     • Additional Information: Government and contractor mutually understand system requirements, including (1) the preferred materiel solution (including its support concept) from the Materiel Solution Analysis (MSA) phase, (2) available technologies resulting from the prototyping efforts, and (3) maturity of interdependent systems.

System Functional Review (SFR)
     • Objective: Recommendation that the functional baseline fully satisfies performance requirements and to begin preliminary design with acceptable risk.
     • Technical Maturity Point: Functional baseline established and under formal configuration control. System’s functions decomposed and defined to lower levels in order to start preliminary design.
     • Additional Information: Functional requirements and verification methods support achievement of performance requirements. Acceptable technical risk of achieving allocated baseline.

Preliminary Design Review (PDR)
     • Objective: Recommendation that the allocated baseline fully satisfies user requirements and the developer is ready to begin detailed design with acceptable risk.
     • Technical Maturity Point: Allocated baseline established such that design provides sufficient confidence to support 2366b certification.
     • Additional Information: Preliminary design and basic system architecture support capability need and affordability target achievement.

Pre-Engineering and Manufacturing Development (EMD) Review
     • Objective: Determination that program plans are affordable and executable and that the program is ready to proceed to EMD phase source selection.
     • Technical Maturity Point: Systems engineering trades completed and have informed program requirements. Competitive prototyping and the development of the preliminary design have influenced risk management plans and should-cost initiatives.
     • Additional Information: The Request for Proposal (RFP) reflects the program’s plans articulated in the draft Acquisition Strategy and other key draft planning documents such as the Systems Engineering Plan (SEP), Program Protection Plan (PPP), Test and Evaluation Master Plan (TEMP), and Life-Cycle Sustainment Plan (LCSP).

Milestone B
     • Objective: Decision to invest in product development, integration, and verification as well as manufacturing process development.
     • Technical Maturity Point: Critical technologies assessed able to meet required performance and ready for further development. Resources and requirements match.
     • Additional Information: Maturity, integration, and producibility of the preliminary design (including critical technologies) and availability of key resources (time, funding, other) match customer needs. Should-cost goals defined.

Critical Design Review (CDR)
     • Objective: Recommendation to start fabricating, integrating, and testing test articles with acceptable risk.
     • Technical Maturity Point: Product design is stable. Initial product baseline established.
     • Additional Information: Design is stable and performs as expected. Initial product baseline established by the system detailed design documentation; confirms affordability/should-cost goals. Government control of Class I changes as appropriate.

System Verification Review (SVR)
     • Objective: Recommendation that the system as tested has been verified (i.e., product baseline is compliant with the functional baseline) and is ready for validation (operational assessment) with acceptable risk.
     • Technical Maturity Point: System design verified to conform to functional baseline.
     • Additional Information: Actual system (which represents the production configuration) has been verified through required analysis, demonstration, examination, and/or testing. Synonymous with system-level Functional Configuration Audit (FCA).

Production Readiness Review (PRR)
     • Objective: Recommendation that production processes are mature enough to begin limited production with acceptable risk.
     • Technical Maturity Point: Design and manufacturing are ready to begin production.
     • Additional Information: Production engineering problems resolved and ready to enter production phase.

Milestone C
     • Objective: Decision to produce production-representative units for operational test and evaluation (OT&E).
     • Technical Maturity Point: Manufacturing processes are mature enough to support Low-Rate Initial Production (LRIP) and generate production-representative articles for OT&E.
     • Additional Information: Production readiness meets cost, schedule, and quality targets. Begin initial deployment as appropriate.

Physical Configuration Audit (PCA)
     • Objective: Recommendation to start full-rate production and/or full deployment with acceptable risk.
     • Technical Maturity Point: Final product baseline established. Verifies the design and manufacturing documentation matches the item to be fielded, following update of the product baseline to account for resolved OT&E issues.
     • Additional Information: Confirmation that the system to be fielded matches the product baseline. Product configuration finalized and system meets user’s needs. Conducted after OT&E issues are resolved.

Full-Rate Production Decision Review (FRP DR) or Full Deployment Decision Review (FDDR)
     • Objective: Decision to begin full-rate production and/or decision to begin full deployment.
     • Technical Maturity Point: Manufacturing processes are mature and support full-rate production, and/or capability demonstrated in operational environment supports full deployment (i.e., system validated through OT&E).
     • Additional Information: Delivers fully funded quantity of systems and supporting materiel and services for the program or increment to the users.
Figure 4.2.1.F1 provides the end-to-end perspective and the integration of SE technical
reviews and audits across the system life cycle.
                    Figure 4.2.1.F1. Weapon System Development Life Cycle
The acquisition model captured in this version of DAG Chapter 4 is based on the
weapon system model described in DoDI 5000.02, dated December 8, 2008. Other
models used in the Department of Defense are variations of this model. The
anticipated update to DoDI 5000.02 is expected to address these other models;
when it is issued, DAG Chapter 4 will be updated accordingly.
as shown in the SoS SE Implementers’ View in Figure 4.2.1.2.F1. The backbone of SoS
SE implementation is continuous analysis that considers changes from the broader
environment as well as feedback from the ongoing engineering process. The results of
that analysis provide the basis for developing and evolving the SoS architecture,
identifying or negotiating changes to the constituent systems that impact the SoS, and
working with the constituent systems to implement and integrate those changes. This
view of SoS SE implementation provides structure to the evolution of the SoS through
changes in constituent systems that are typically on different life-cycle timelines,
adapting as systems come in and move out and as concepts of operations (CONOPS)
change. Hence the need to continually update the SoS analysis, adapt the
architecture, and update constituent systems on an ongoing basis.
Therefore, SoS SE planning and implementation should consider and leverage the
development plans of the individual systems in order to balance SoS needs with
individual system needs. Finally, SoS SE should address the end-to-end behavior of the
ensemble of systems, addressing the key issues that affect this end-to-end behavior
with particular emphasis on integration and interoperability. Effective application of SoS
SE addresses organizational as well as technical issues in making SE trades and
decisions. The Systems Engineer has different roles and authorities at the system
versus the SoS level. The SoS-level Systems Engineer can provide the technical
foundation for effective user capabilities by conducting balanced technical management
of the SoS, using an SoS architecture based on open systems and loose coupling, and
focusing on the design strategy and trades (both at establishment and through
evolution). The SoS-level Systems Engineer should collaborate with the Systems
Engineers of the multiple constituent systems; each of those Systems Engineers has
the authority for their own system implementation.
These waves of implementations and upgrades taken as a whole provide the SoS
capability.
Consideration of SoS in SE for Individual Systems
Both from an individual system perspective and the SoS perspective, Program
Managers and Systems Engineers have found it difficult to coordinate and balance the
acquisition objectives and strategies for a given system with those of the SoS and other
constituent systems. A senior governance body is useful to provide a forum for
discussion and decision. This forum should address technical plans, configuration
management, and strategies with respect to interfaces, interdependences, risks, and
risk mitigation. It is critical to address all equities and make collective decisions that can
be implemented in changes to a system’s configuration.
One SoS best practice is to closely monitor interdependent programs, with
checkpoints at scheduled design reviews to assess program progress, assess related
risks, and determine actions to mitigate potentially negative impacts.
Table 4.2.1.2.T1 lists SoS considerations for systems at each stage of acquisition. At
each phase, the SE approach to addressing SoS-related dependencies should be
documented in the Systems Engineering Plan (SEP).
Table 4.2.1.2.T1. SoS Considerations by Acquisition Phase

Focus
     • Pre-MDD: Define the role of the system in supporting a mission capability, including its relationship to other systems in the SoS which support that capability.
     • MSA: In the Analysis of Alternatives (AoA), consider the alternatives in the context of the larger SoS supporting the capability. In the operational analysis and concept engineering for the preferred materiel solution, consider the new system in the SoS context; identify dependencies and relationships with other systems, including key interfaces and technical risks based on SoS considerations, to be addressed in Technology Development (TD). Identify the nature of the dependencies and interfaces, including the parties involved, and an initial plan for addressing these, including initial memoranda of agreement (MOAs).
     • TD: Assess the technical approaches and risks for addressing system requirements, including considerations for the system as a component operating in a SoS context (including dependencies, interoperability, and interfaces). Address changes needed in other systems for the systems in acquisition to meet capability objectives.
     • EMD: Develop, verify, and validate the detailed design that addresses system requirements, considering the SoS context, including recognized dependencies and interfaces.
     • P&D and O&S: Verify the as-built interfaces meet specs and support operational needs. Support effective system operation in a SoS context.

     • Pre-MDD: … currently supporting capability. Identification of stakeholders.
     • MSA: … key system dependencies or interfaces that influence system requirements. Initial management plans with supporting MOAs, including draft Interface Control Agreements (ICAs) for collaborations with other systems in a SoS. Risks associated with SoS dependencies (both programmatic and technical) and interoperability requirements, including environment, safety, and occupational health (ESOH) and security risks to be accepted by Joint Authorities.
     • TD: … ICAs. Risks associated with SoS dependencies and interoperability requirements.
     • EMD: … dependencies. Risks associated with SoS dependencies and interoperability requirements.
     • P&D and O&S: … expected performance.
For a more detailed discussion of SE for SoS, refer to the SoS Initiatives page on the
DASD(SE) website.
The objectives of the pre-Materiel Development Decision (MDD) efforts are to obtain a
clear understanding of user needs, identify a range of technically feasible candidate
materiel solution approaches, consider near-term opportunities to provide a more rapid
interim response, and develop a plan for the next acquisition phase, including the
required resources. This knowledge supports the Milestone Decision Authority’s (MDA)
decision to authorize entry into the acquisition life cycle and pursue a materiel solution.
An additional objective is to characterize trade space, risks, and mission
interdependencies to support the start of the Analysis of Alternatives (AoA).
Policy in this area comes from two perspectives: the Joint Capabilities Integration and
Development System (JCIDS) defined in CJCSI 3170.01 and the Defense Acquisition
System (DAS) defined in DoDD 5000.01.
planning activities that provide the foundation for informed investment decisions on the
path a materiel development follows to effectively, affordably, and sustainably meet
operational needs. Development planning activities are initiated prior to the Materiel
Development Decision, continue throughout the Materiel Solution Analysis phase, and
eventually transition to the program environment.
An important aspect of the pre-MDD effort is narrowing the field of possible solutions to
a reasonable set that is analyzed in the AoA. Early recognition of constraints, combined
with analysis of technical feasibility, can eliminate many initial ideas because they lack
the potential to meet the need in a timely, sustainable, and cost-effective manner.
Conversely, the range of alternatives analyzed in the AoA is chosen from a sufficiently
broad solution space. Whenever possible, the Systems Engineer should engage with the
end user before the Initial Capabilities Document (ICD) and associated operational
architecture are validated by the Joint Requirements Oversight Council (JROC) (see
DAG section 4.3.10. Stakeholder Requirements Definition Process).
Studies have found that "programs that considered a broad range of alternatives tended
to have better cost and schedule outcomes than the programs that looked at a narrow
scope of alternatives." (Reference GAO-09-665 Analysis of Alternatives, page 6.)
The work performed in this time frame should be well documented so the Program
Manager and Systems Engineer, when assigned, can benefit from the mutual
understanding of the basis of need (requirements) and the art of the possible
(concepts/materiel solutions). To achieve these benefits, the Systems Engineer should
proactively collaborate with the Science and Technology (S&T) and user communities.
Often there is no assigned Program Manager or Systems Engineer at this point in the
weapon system’s life cycle. Instead, a designated Service representative orchestrates
and leads the preparations for MDD. This leader, guided by the entrance criteria for
MDD, is responsible for synthesizing the information needed to satisfactorily address
the four policy evidence needs stated in DTM 10-017. For a more
detailed discussion of development planning policy, refer to the white paper on the pre-
MDD Activities.
The designated Service representative should make use of appropriate models and
simulations (DAG section 4.3.19.1 Modeling and Simulation) to develop required MDD
evidence. The designated Service representative also should consider issuing a
Request for Information (RFI) to industry to help identify and characterize alternative
solutions.
Inputs
Table 4.2.2.T1 summarizes the primary inputs and technical outputs associated with
this part of the life cycle. Unlike the sections that follow, this pre-MDD period is the
bridge between JCIDS and the DAS. It is the period before the pre-systems acquisition
period of the DAS.
Other analyses
        •    Other prior analytic, experimental, prototyping, and/or technology demonstration efforts may
             be provided by the S&T community
The MDD review requires an ICD that represents an operational capability need
validated in accordance with CJCSI 3170.01. The Joint Staff provides this document,
which is generally the output of a Capability-Based Assessment (CBA) or other studies.
The designated Service representative should have access to both the ICD and
supporting studies. Other technical information (such as models and simulations) may
be useful for understanding both the need and its context. The S&T community can
contribute pertinent data and information on relevant technologies, prototypes,
experiments, and/or analysis. An example of how a program may provide evidence at
the MDD review to support the MDA decision is available.
Activities
Figure 4.2.2.F1 provides the end-to-end perspective and the integration of SE technical
reviews and audits across the acquisition life cycle.
                    Figure 4.2.2.F1. Weapon System Development Life Cycle
This effort ends after a successful MDD review in which the MDA approves entry into
the Defense Acquisition System. This decision is documented in a signed Acquisition
Decision Memorandum (ADM), which specifies the approved entry point, typically the
Materiel Solution Analysis (MSA) phase. Outputs of pre-MDD efforts provided in Table
4.2.2.T2 also include approved AoA Guidance and an AoA Study Plan, which should be
informed by SE.
                 Table 4.2.2.T2. Technical Outputs Associated with Pre-MDD
All potential materiel solutions pass through an MDD before entering the DAS.
However, the MDA may authorize entry at any point in the acquisition life cycle based
on the solution’s technical maturity and risk. Technical risk has several elements:
technology risk, engineering risk, and integration risk. If the Service-recommended entry
point is beyond the MSA phase, for example, partway through the Technology
Development (TD) phase, the program provides evidence that all MSA and TD phase-
specific entrance criteria and statutory requirements are met, and that the solution’s
technical maturity supports entry at the point in the phase being proposed. Emphasis
should be placed on the soundness of supporting technical information and plans in
order to inform the MDA’s decision, as opposed to which documents may or may not be
complete.
As the next section explains, the MSA phase is made up of more than an AoA; it
includes technical tasks to determine the preferred materiel solution based on the AoA
results and technical tasks to prepare for the initial milestone review. Therefore, the
technical plan and budget presented at the MDD should reflect the full range of activities
required in the next phase.
The objective of the Materiel Solution Analysis (MSA) phase is to select and adequately
describe a preferred materiel solution to satisfy the phase-specific entrance criteria for
the next program milestone designated by the Milestone Decision Authority (MDA).
Usually, but not always, the next milestone is a decision to invest in technology
maturation and preliminary design in the Technology Development (TD) phase. The
systems engineering (SE) activities in the MSA phase result in several key products.
First, a system model and/or architecture is developed that captures operational context
and envisioned concepts, describes the system boundaries and interfaces, and
addresses operational and functional requirements. Second, a preliminary system
performance specification is developed that defines the performance of the preferred
materiel solution. Third, the Systems Engineer advises the Program Manager on what is
to be prototyped, why, and how.
During the MSA phase, the program team identifies a materiel solution to address user
capability gaps partially based on an Analysis of Alternatives (AoA) (i.e., analysis of the
set of candidate materiel solutions) led by the Director, Cost Analysis and Program
Evaluation (CAPE) and conducted by an organization independent from the Program
Manager. Once the Service sponsor selects a preferred materiel solution, the program
team focuses engineering and technical analysis on this solution to ensure development
plans, schedule, funding, and other resources match both customer needs and the
complexity of the preferred materiel solution. SE activities should be integrated with
MSA phase-specific test, evaluation, logistics, and sustainment activities identified in
DAG Chapter 9 Test and Evaluation and Chapter 5 Life-Cycle Logistics.
This phase has two major blocks of activity: (1) the AoA and (2) the post-AoA
operational analysis and concept engineering to prepare for a next program milestone
designated by the MDA (see Figure 4.2.3.F1).
The AoA team considers a range of alternatives and evaluates them from multiple
perspectives as directed by the AoA Guidance and AoA Study Plan. Engineering
considerations including technical risk should be a component of the AoA Guidance and
be addressed in the AoA Study Plan.
The objective of the AoA is to analyze and characterize each alternative (or alternative
approach) relative to the others. The AoA does not result in a recommendation for a
preferred alternative; it provides information that the Service sponsor uses to select
which materiel solution to pursue. The Systems Engineer may participate in the AoA to
help analyze performance and feasibility and to optimize alternatives. Using the AoA
results, the Service sponsor may conduct additional engineering analysis to support the
selection of a preferred materiel solution from the remaining trade space of candidate
materiel solutions. After choosing the preferred materiel solution, the Service sponsor
matures the solution in preparation for the next program milestone designated by the
MDA.
After the AoA, program systems engineers establish the technical performance
requirements consistent with the draft Capability Development Document (CDD), which
is required at the next program milestone designated by the MDA, assuming that
milestone is Milestone A.
These requirements form the basis for the system performance specification placed on
contract for the TD Phase. These requirements also inform plans to mitigate risk in the
TD phase.
During MSA, several planning elements are addressed to frame the way forward for the
MDA’s decision at the next program milestone. SE is a primary source for addressing
several of these planning elements. The planning elements include:
See DAG section 4.3.2. Technical Planning Process. These planning elements are
documented in various program plans such as the Technology Development Strategy
(TDS), Test and Evaluation Strategy (TES), Program Protection Plan (PPP), next-phase
Request for Proposal (RFP), and the Systems Engineering Plan (SEP). The SEP
describes the SE efforts necessary to provide informed advice to these other planning
artifacts (see the SEP Outline).
SE provides, for example, the technical basis for TD phase planning and execution,
including identification of critical technologies, development of a competitive prototyping
strategy, and establishment of other plans that drive risk-reduction efforts. This early SE
effort lays the foundation for the TD phase contract award(s) and preliminary designs,
which confirm the system’s basic architecture.
Roles and Responsibilities
During this phase, it is the Program Manager’s responsibility to:
    •     Prepare for and support source selection activities for the upcoming phase
          solicitation and contract award
    •     Support the requirement community with development of the draft CDD,
          assuming the next phase is TD
    •     Develop the TDS, which incorporates necessary risk-reduction activities
    •     Staff the program office with qualified (trained and experienced) systems
          engineers
In addition to the general roles and responsibilities described in DAG section 4.1.4.
Engineering Resources, during this phase it is the Systems Engineer’s responsibility to:
    •     Lead and manage the execution of the technical activities in this phase
    •     Measure and track program technical maturity
    •     Identify technologies that should be included in an assessment of technical risk
    •     Perform trade studies
    •     Support preparations for the RFP package
    •     Develop the system performance specification. See DAG section 4.3.7
          Configuration Management Process. A particular program's naming convention
          for specifications should be captured in the SEP and other plans and processes
          tailored for the program.
    •     Ensure integration of key design considerations into the system performance
          specification.
    •     Develop technical approaches and plans, and document them in the SEP.
    •     Ensure the phase technical artifacts are consistent and support objectives of the
          next phase.
Inputs
Table 4.2.3.T1 summarizes the primary inputs associated with this pre-systems
acquisition part of the life cycle (see DoDI 5000.02). The table assumes the next phase
is TD, but most of the inputs would be applicable going into any follow-on phase.
 AoA Guidance and AoA Study Plan
 Acquisition Decision Memorandum (ADM) (may contain additional direction)
 Other analyses
      •    Other prior analytic, prototyping, and/or technology demonstration efforts conducted by the S&T
           community; technology insertion/transition can occur at any point in the life cycle
The ICD, AoA Guidance, and AoA Study Plan should be available prior to the start of
the MSA phase. Results of other related analyses may be available, for example from
the Capability-Based Assessment (see DAG section 4.3.10. Stakeholder Requirements
Definition Process) or other prior analytic and/or prototyping efforts conducted by the
S&T community.
Activities
The MSA phase activities begin after a favorable MDD review has been held (see DAG
section 4.2.2. Pre-Materiel Development Decision) and end when the phase-specific
entrance criteria for the next program milestone, designated by the MDA, have been
met. Figure 4.2.3.F2 provides the end-to-end perspective and the integration of SE
technical reviews and audits across the acquisition life cycle.
Referring back to Figure 4.2.3.F1, which shows the major blocks of technical activities in
the MSA phase:
    •     Conduct AoA. Includes all activities and analyses conducted by the AoA Study
          team under the direction of the Senior Advisory Group / Executive Steering
          Committee (SAG/ESC) and CAPE, or Service equivalent. Concludes with a final
          SAG/ESC review and produces the AoA Report. Systems engineers should support this activity,
         though in DoD policy the AoA is to be conducted by an organization independent
         from the Program Manager.
    •    Perform Analysis to Support Selection of a Preferred Materiel Solution.
         Includes all engineering activities and technical analysis performed to support
         Service selection of the preferred materiel solution by balancing cost,
         performance, schedule, and risk.
    •    Perform Operational Analysis on Preferred Materiel Solution. Supports the
         definition of the performance requirements in the operational context, Functional
         Capabilities Board (FCB) review, and the development of the draft CDD (see
         CJCSI 3170.01 Joint Capabilities Integration and Development System (JCIDS)
          and DAG section 4.3.10. Stakeholder Requirements Definition Process). The
         Systems Engineer should support the operational requirement/user/operational
         test community to ensure the concept of operations (CONOPS) is detailed
         enough to verify and validate system performance and operational capability.
         This activity could include the development of design reference missions/use
         cases that assist in the verification and validation process. Through analysis, the
         Systems Engineer also helps to identify key technology elements, determine
         external interfaces, establish interoperability requirements, and identify critical
         program information.
    •    Perform Engineering and Technical Analysis on Preferred Materiel
         Solution. This includes all engineering activities and technical analysis
         performed on the Service-selected preferred materiel solution in support of the
         development and maturation of a materiel solution concept, associated system
         specification, and technical plans for the next phase.
    •    Establish Program Framework and Strategies. All activities to converge on the
         overarching strategies and plans for the acquisition and sustainment of the
         system. Attention should be given to identifying and documenting agreements
         with external organizations. This documentation should include, for example, the
         contributions of S&T organizations and plans for transitioning technology into a
         program.
    •    Prepare for Initial Review Milestone and Next Phase. Includes all activities to
         compile technical and programmatic analysis and plans to meet the entrance
         criteria for the next program milestone designated by the MDA. See DoDI
         5000.02 for phase exit criteria and PDUSD(AT&L) memorandum, "Improving
         Milestone Process Effectiveness."
The technical review typically conducted in the MSA phase is the Alternative Systems
Review (ASR) (see DAG section 4.2.9. Alternative Systems Review).
For a more detailed discussion of MSA phase activities, refer to the white paper on the
MSA Activities Model.
Outputs and Products
The knowledge gained during this phase, based on both the AoA and other analyses,
should provide confidence that a technically feasible solution approach matches user
needs and is affordable with reasonable risk (see Table 4.2.3.T2). Technical outputs
associated with technical reviews in this phase are addressed later in this chapter.
                                    Technical Outputs from MSA Phase
 PPP
 Consideration of technology issues
      •    Affordability targets are established and treated as Key Performance Parameters (KPPs) at the
           next program milestone designated by the MDA
      •    Identify the likely design performance points where trade-off analyses occur during the next
           phase
      •    Value engineering results, as appropriate (see DAG section 4.3.19.3. Value Engineering)
      •    See DAG Chapter 3 Affordability and Life-Cycle Resource Estimates
 Informed advice to developmental test and evaluation (DT&E) planning, including Early Operational
 Assessments (EOAs)
 Informed advice to the Request for Proposal (RFP)
      •    Informed advice including the system specification, Statement of Work (SOW), Contract Data
           Requirements Lists (CDRLs), and source selection criteria
The primary objective of the Technology Development (TD) phase is to reduce technical
risk and develop a sufficient understanding of the materiel solution to support sound
investment decisions at the pre- Engineering and Manufacturing Development (EMD)
Review and at Milestone B regarding whether to initiate a formal acquisition program.
The Systems Engineer supports the production of a preliminary system design that
achieves a suitable level of system maturity for low-risk entry into EMD (see Figure
4.2.4.F1.). Usually the Systems Engineer implements a strategy of competitive
prototyping on a system element or subsystem level, balancing capability needs and
design considerations to synthesize system requirements for a preliminary end-item
design for the system.
 Figure 4.2.4.F1. Systems Engineering Activities in the Technology Development
                                     Phase
During the TD phase, the program develops and demonstrates prototype designs to
reduce technical risk, validate design approaches, validate cost estimates, and refine
requirements. In addition, the TD phase efforts ensure the level of expertise required to
operate and maintain the product is consistent with the force structure. Technology
development is an iterative process of maturing technologies and refining user
performance parameters to accommodate those technologies that do not sufficiently
mature (requirements trades). The Initial Capabilities Document, the Technology
Development Strategy (TDS), Systems Engineering Plan (SEP), and draft Capability
Development Document (CDD) guide the efforts of this phase.
There are two key technical objectives in the TD phase: technical risk reduction and
initial system development activity, culminating in preliminary design. The Systems
Engineer in the TD phase manages activities to evaluate prototyped solutions
(preferably competitive prototypes) against performance, cost, and schedule constraints
to balance the total system solution space. This information can then be used to inform
the finalization of the system performance specification as a basis for functional analysis
and preliminary design.
Effective systems engineering (SE), applied in accordance with the SEP and gated by
technical reviews, reduces program risk, identifies potential management issues in a
timely manner, and supports key program decisions. The TD phase provides the
Program Manager with a preliminary design and allocated baseline that are realistic and
credible.
The program office team provides technical management and may employ industry,
Government laboratories, the Service science and technology (S&T) community, or
Federally Funded Research and Development Centers (FFRDCs)/universities to
accomplish specific risk-reduction or prototype tasks as described in the SEP.
In addition to the general roles and responsibilities described in DAG section 4.1.4.
Engineering Resources, during this phase it is the Systems Engineer’s responsibility to:
    •    Lead and manage the execution of the technical activities as documented in the
         SEP
    •    Plan and execute technical reviews, including the System Requirements Review
         (SRR), System Functional Review (SFR), and Preliminary Design Review (PDR)
    •    Measure and track program maturity using technical performance measures,
         requirements stability, and integrated schedules
    •    Support award of TD phase contract(s), as necessary
    •    Balance and integrate key design considerations
    •    Maintain the Systems Engineering Plan (SEP), including generating the update in
         support of Milestone B
    •    Lead initial development of the system to include functional analysis, definition of
         the functional and allocated baselines, and preliminary design (see DAG sections
         4.3.11. Requirements Analysis Process and 4.3.12. Architecture Design Process)
    •    Support configuration management of the baselines, since they are required in
         later technical reviews, audits, and test activities (e.g., functional baseline at the
          Functional Configuration Audits (FCAs))
    •     Conduct technical activities in support of the pre-EMD review
    •     Conduct a rigorous and persistent assessment of technical risk, determine risk
          mitigation plans, and work with the Program Manager to resource the mitigation
          plans
    •     Support the Technology Readiness Assessment (TRA) including creation of the
          plan, the pre-EMD preliminary TRA, and the TRA final report
    •     Support requirements management and monitor for unnecessary requirements
          growth (e.g., derived versus implied requirements)
    •     Manage interfaces and dependencies
Inputs
Table 4.2.4.T1 summarizes the primary inputs associated with this pre-systems
acquisition part of the life cycle (see DoDI 5000.02).
                                                 Inputs for TD Phase
 Trade study results
      •    Includes identification of key system elements to be prototyped prior to Milestone B (see DTM
           09-027)
Affordability Assessment
      •    Affordability targets are established and treated as Key Performance Parameters (KPPs) at
           Milestone A
      •    Affordability targets drive engineering trade-offs and sensitivity analyses about capability
           priorities in the TD phase
      •    See DAG Chapter 3 Affordability and Life-Cycle Resource Estimates
TDS
 Test and Evaluation Strategy (TES)
      •    Other prior analytic, prototyping, and/or technology demonstration efforts done by the S&T
           community. Technology insertion/transition can occur at any point in the life cycle
Activities
The TD phase activities begin when a favorable Milestone A decision has been made
(see DAG section 4.2.3. Materiel Solution Analysis Phase) and end with a successful
Milestone B decision. Figure 4.2.4.F2 provides the end-to-end perspective and the
integration of SE technical reviews and audits across the acquisition life cycle.
The TD phase addresses a set of critical activities leading to the decision to establish a
program of record. The SE activities provide the technical foundation for this decision.
Depending on the nature of the technology development strategy, the order and
characteristics of these activities may change. During the TD phase, systems engineers
follow comprehensive, iterative processes to accomplish the following:
    •    Develop Preliminary Design. See DAG section 4.2.12. Preliminary Design
         Review for additional information.
    •    Develop Allocated Technical Performance Measures (TPMs). The allocated
         baseline establishes the first physical representation of the system as subsystem
         elements with system-level capabilities allocated to subsystem-level technical
         performance measures.
    •    Complete PDR Report. After the PDR, the Program Manager develops the PDR
         Report with support from the Systems Engineer. The Program Manager provides
         the report to the MDA to support a Milestone B decision. The report includes
         recommended requirements trades based upon an assessment of cost,
         schedule, performance, and risk.
    •    Support pre-EMD review. The purpose of the MDA-level review is to assess the
         AS, RFP, and key related planning documents and determine whether program
         plans are affordable and executable and reflect sound business arrangements.
         Specific SE attention is given to engineering trades and their relationship to
         program requirements and risk management.
    •    Finalize Documents. The Systems Engineer updates the SEP and PPP and
         provides inputs for updating the LCSP, TEMP, and other program documents.
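The TPM allocation described in the bullets above amounts to a budget roll-up: each subsystem receives a share of a system-level measure, and current estimates are tracked against those shares. The sketch below is purely illustrative; the subsystem names, mass budget, and figures are invented and not drawn from any program.

```python
# Illustrative sketch: allocating a system-level technical performance
# measure (a total mass budget, in kg) to subsystem-level TPMs and
# checking that current estimates stay within each allocation.
# All names and figures are notional.

SYSTEM_MASS_BUDGET_KG = 1200.0  # notional system-level TPM threshold

# (allocation, current estimate) per subsystem -- all figures notional
allocations = {
    "airframe":   (520.0, 498.5),
    "propulsion": (310.0, 322.0),   # exceeds its allocation
    "avionics":   (210.0, 201.3),
    "payload":    (160.0, 149.9),
}

def assess_tpm(allocations, budget):
    """Return per-subsystem margins and whether the rollup meets the budget."""
    margins = {name: alloc - est for name, (alloc, est) in allocations.items()}
    total_alloc = sum(a for a, _ in allocations.values())
    total_est = sum(e for _, e in allocations.values())
    return {
        "margins": margins,
        "allocation_consistent": total_alloc <= budget,
        "estimate_within_budget": total_est <= budget,
        "breaches": [n for n, m in margins.items() if m < 0],
    }

report = assess_tpm(allocations, SYSTEM_MASS_BUDGET_KG)
print(report["breaches"])  # subsystems currently exceeding their allocation
```

The same pattern applies to any allocated TPM (power, weight, latency, reliability), with the roll-up rule adjusted to how the measure composes across subsystems.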
Test activities during the TD phase that depend on SE support and involvement include
developmental test and evaluation of system and/or system element prototypes and
Early Operational Assessments (EOAs). Developmental Test and Evaluation (DT&E)
activities, for example, should be closely coordinated between the engineering and test
communities since DT&E activities support:
Outputs and Products
The technical outputs identified in Table 4.2.4.T2 are some of the inputs necessary to
support SE activities in the EMD phase. The outputs should support the technical
recommendation at Milestone B that an affordable solution has been found for the
identified need with acceptable risk, scope, and complexity. Technical outputs
associated with technical reviews in this phase are addressed later in this chapter.
SEP (updated)
      •    If programs enter the acquisition life cycle at Milestone B, this is their initial SEP
      •    PDUSD(AT&L) memorandum, "Document Streamlining - Program Strategies and Systems
           Engineering Plan," April 20, 2011
      •    See DAG section 4.1.2 Systems Engineering Plan
 Updated Integrated Master Plan (IMP), Integrated Master Schedule (IMS), and memoranda of
 agreement (MOAs)/ memoranda of understanding (MOUs)
 RAM-C Report (updated)
      •    Attachment to SEP as directed by DTM 11-003; if programs enter the acquisition life cycle at
           Milestone B, this is their initial RAM-C Report
RGC (updated)
PPP (updated)
      •    If programs enter the acquisition life cycle at Milestone B, this is their initial PPP
      •    PDUSD(AT&L) memorandum, "Document Streamlining - Program Protection Plan (PPP)," July
           18, 2011
      •    See DAG Chapter 13 Program Protection
                                      Technical Outputs from TD Phase
 Assumptions and constraints
      •    TRA Plan
      •    Confirmation at the end of TD phase that critical technologies have been demonstrated in a
           relevant environment
      •    Preliminary TRA required at pre-EMD review
      •    TRA final report
      •    Including identification of key system elements to be prototyped in EMD Phase and documented
           in the Acquisition Strategy (AS)
 Preliminary Design Review (PDR) Report and Post PDR Assessment (produced by DASD(SE) for
 MDAPs)
 Informed advice to Acquisition Program Baseline (APB)
• APB inputs include the SE affordability assessments, schedule inputs, and performance inputs
 Establishes technical information that is the basis of the cost analysis requirements description (CARD)
 and manpower estimates
 Informed advice to Affordability Assessment
      •    System support and maintenance objectives and requirements established; updated will-cost
           values and affordability targets as documented in the Life-Cycle Sustainment Plan (LCSP),
           including informed advice to manpower estimates
      •    PDUSD(AT&L) memorandum, "Document Streamlining - Life-Cycle Sustainment Plan (LCSP),"
           September 14, 2011
      •    See DAG Chapter 5 Life-Cycle Logistics
• See DAG Chapter 7 Acquiring Information Technology, Including National Security Systems
 Early developmental test and evaluation (DT&E) assessments, including Early Operational Assessments
 (EOAs)
 Informed advice to draft and final Request for Proposal (RFP)
• Informed advice including system specification, SOW, CDRLs, and source selection criteria
The primary objective of the Engineering and Manufacturing Development (EMD) phase
is to develop the product baseline, verify it meets the system functional and allocated
baselines, and transform the preliminary design into a producible design, all within the
schedule and cost constraints of the program. Systems engineering (SE) activities
support development of the detailed design, verification that requirements are met,
reduction in system-level risk, and assessment of readiness to begin production and/or
deployment. The core SE activities support the two efforts associated with the EMD
phase as defined in DoDI 5000.02 for weapon systems acquisition and identified in
figure 4.2.5.F1: integrated system design and system capability and manufacturing
process demonstration.
    •    Demonstrate system maturity and readiness to begin production for operational
         test and/or deployment and sustainment activities
The EMD phase includes technical assessment and control efforts, including value
engineering techniques described in DAG section 4.3.19.3. Value Engineering, to
effectively manage risks and increase confidence in meeting system performance,
schedule, and cost goals. SE activities should be integrated with EMD phase-specific
test and evaluation and logistics and sustainment activities identified in DAG Chapter 9
Test and Evaluation and Chapter 5 Life-Cycle Logistics, respectively. The planning,
scheduling, and conduct of event-driven technical reviews (Critical Design Review
(CDR), Functional Configuration Audit (FCA), System Verification Review (SVR), and
Production Readiness Review (PRR)) are vital to provide key points for assessing
program maturity and the effectiveness of risk-reduction strategies.
A well-planned EMD phase Systems Engineering Plan (SEP) builds on the results of
previous activities and significantly increases the likelihood of a successful program
compliant with the approved Acquisition Program Baseline (APB).
Implementing the technical planning as defined in the approved SEP guides the
execution of the complex and myriad tasks associated with completing the detailed
design and integration, and supports developmental test and evaluation activities. The
SEP also highlights the linkage between Technical Performance Measures (TPM), risk
management, and earned-value management activities to support tracking of cost
growth trends. Achieving predefined EMD technical review criteria provides confidence
that the system meets stated performance requirements (including interoperability and
supportability requirements) and that design and development have matured to support
the initiation of the Production and Deployment (P&D) phase.
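The TPM, risk management, and earned-value linkage mentioned above rests on a handful of standard earned-value quantities. As a purely notional illustration (the dollar figures are invented, not program data):

```python
# Minimal sketch of the earned-value indices used to watch cost-growth
# trends during EMD; the dollar figures are notional.

def ev_indices(bcws, bcwp, acwp):
    """Compute standard earned-value metrics.

    bcws: budgeted cost of work scheduled (planned value)
    bcwp: budgeted cost of work performed (earned value)
    acwp: actual cost of work performed
    """
    return {
        "cost_variance": bcwp - acwp,        # negative => over cost
        "schedule_variance": bcwp - bcws,    # negative => behind schedule
        "cpi": bcwp / acwp,                  # < 1.0 => cost inefficiency
        "spi": bcwp / bcws,                  # < 1.0 => schedule slip
    }

m = ev_indices(bcws=10.0, bcwp=9.0, acwp=11.0)  # $M, notional
print(round(m["cpi"], 3), round(m["spi"], 3))
```

A CPI or SPI trending below 1.0, read alongside TPM margins and open risks, is the kind of early cost-growth signal the SEP linkage is intended to surface.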
In addition to the general roles and responsibilities described in DAG section 4.1.4
Engineering Resources, during this phase it is the Systems Engineer’s responsibility to:
    •    Manage the system design to satisfy the operational requirements within the
         constraints of cost and schedule and to evaluate the system design, identify
         deficiencies, and make recommendations for corrective action
    •    Conduct or support the technical evaluation in support of source selection for the
         EMD contract award
    •    Maintain requirements traceability and linkage to the initial product baseline
    •    Conduct event-driven technical reviews, advising the Program Manager on
         review criteria readiness
    •    Lead preparation and conduct of technical reviews
    •    Track and report major (Class I) baseline changes and recommend the path
         forward in accordance with the Configuration Management (CM) process (see
         DAG section 4.3.7. Configuration Management Process for definition of Class I)
    •    Support determination of production rates and delivery schedules
    •    Support test and evaluation activities: identify system evaluation targets driving
         system development and support operational assessments as documented in the
         Test and Evaluation Master Plan (TEMP) (see DAG Chapter 9 Test and
         Evaluation)
    •    Align the SEP with the TEMP on SE processes, methods, and tools identified for
         use during test and evaluation
    •    Analyze deficiencies discovered from operational assessments and verification
         methods (developmental test and evaluation); develop and implement solutions,
         including, but not limited to, rebalancing of system requirements
    •    Support logistics and sustainment activities as documented in the Life-Cycle
         Sustainment Plan (LCSP) (see DAG Chapter 5 Life-Cycle Logistics)
    •    Maintain the SEP including generating the update in support of Milestone C
    •    Ensure manufacturing process development and maturation efforts are planned
         and executed
    •    Develop approaches and plans to verify mature fabrication and manufacturing
         processes and determine manufacturing readiness (see the Manufacturing
         Readiness Level (MRL) Deskbook as one source for assessing manufacturing
         readiness)
    •    Conduct a rigorous production risk assessment and determine risk mitigation
         plans
    •    Identify system design features that enhance producibility (efforts usually focus
         on design simplification, fabrication tolerances, and avoidance of hazardous
         materials)
    •    Conduct producibility trade studies to determine the most cost-effective
         fabrication and manufacturing process
    •    Assess Low-Rate Initial Production (LRIP) feasibility within program constraints
         (may include assessing contractor and principal subcontractor production
         experience and capability, new fabrication technology, special tooling, and
         production personnel training requirements)
    •    Identify long-lead items and critical materials
    •    Support update to production costs as a part of life-cycle cost management
    •    Continue to support the configuration management process to control changes to
         the product baseline during test and fielding
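One of the responsibilities above, maintaining requirements traceability to the initial product baseline, is in essence a bidirectional bookkeeping check: every requirement should trace to at least one baseline element, and every baseline element should trace back to a requirement. A minimal sketch, with invented requirement and element identifiers:

```python
# Hypothetical sketch: auditing bidirectional traceability between
# system requirements and product-baseline design elements.
# All identifiers are invented for illustration.

requirements = {"SYS-001", "SYS-002", "SYS-003"}
trace = {                      # requirement -> design elements satisfying it
    "SYS-001": {"HW-ASSY-10"},
    "SYS-002": {"SW-CSCI-04", "HW-ASSY-11"},
}
baseline_elements = {"HW-ASSY-10", "HW-ASSY-11", "SW-CSCI-04", "SW-CSCI-99"}

def audit_traceability(requirements, trace, baseline_elements):
    """Flag requirements with no design coverage and elements with no requirement."""
    traced_elems = set().union(*trace.values()) if trace else set()
    return {
        "untraced_requirements": sorted(requirements - trace.keys()),
        "orphan_elements": sorted(baseline_elements - traced_elems),
    }

gaps = audit_traceability(requirements, trace, baseline_elements)
print(gaps)
```

In practice this audit is performed in a requirements-management tool rather than by hand, but the underlying check is the same.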
Inputs
Table 4.2.5.T1 summarizes the primary inputs associated with this systems acquisition
part of the life cycle (see DoDI 5000.02).
      •    If programs enter the acquisition life cycle at Milestone B, this is their initial SEP
      •    PDUSD(AT&L) memorandum, "Document Streamlining - Program Strategies and Systems
           Engineering Plan," April 20, 2011
      •    See DAG section 4.1.2 Systems Engineering Plan
      •    Attachment to SEP as directed by DTM 11-003; if programs enter the acquisition life cycle at
           Milestone B, this is their initial RAM-C Report
      •    If programs enter the acquisition life cycle at Milestone B, this is the initial PPP
      •    Includes Security Classification Guide (SCG), Counterintelligence Support Plan, Criticality
           Analysis, Anti-Tamper Plan, and Acquisition Information Assurance (IA) Strategy
      •    PDUSD(AT&L) memorandum, "Document Streamlining - Program Protection Plan (PPP)," July
           18, 2011
      •    See DAG Chapter 13 Program Protection
                                               Inputs for EMD Phase
 Environment, safety, and occupational health (ESOH) analyses
• See DAG Chapter 7 Acquiring Information Technology, Including National Security Systems
 Test and Evaluation Master Plan (TEMP)
 Informed advice to the developmental test and evaluation (DT&E) planning including Operational
 Assessments (OAs)
      •    Other prior analytic, prototyping, and/or technology demonstration efforts performed by the S&T
           community. Technology insertion/transition can occur at any point in the life cycle
Activities
The EMD phase activities begin when a favorable Milestone B decision has been made
(see DAG section 4.2.4. Technology Development Phase) and end with a successful
Milestone C decision. Figure 4.2.5.F2 provides the end-to-end perspective and the
integration of SE technical reviews and audits across the acquisition life cycle.
SE activities to support the integrated system design effort include:
    •     Critical Design Review (CDR) (mandated, establishes initial product baseline,
          see DAG section 4.2.13. Critical Design Review)
    •     System Verification Review/Functional Configuration Audit (SVR/FCA) (see DAG
          section 4.2.14. System Verification Review/Functional Configuration Audit)
    •     Production Readiness Review (PRR) (DAG section 4.2.15. Production
          Readiness Review)
Test activities during the EMD phase that depend on SE support and involvement
include Test Readiness Reviews (TRRs), Developmental Test and Evaluation (DT&E),
and Operational Assessments (OAs). The Systems Engineer, in collaboration with the
Chief Developmental Tester, should identify system evaluation targets driving system
development and support operational assessments as documented in the Test and
Evaluation Master Plan (TEMP). Associated SE activities and plans should be in the
SEP (see DAG section 4.1.2. Systems Engineering Plan, 4.2.8. Technical Reviews and
Audits Overview, and DAG Chapter 9 Test and Evaluation).
The technical outputs and products identified in Table 4.2.5.T2 are some of the
inputs necessary to support SE processes in the P&D phase. They should support the
technical recommendation at Milestone C that manufacturing processes are mature
enough to support Low-Rate Initial Production (LRIP) and generate production-
representative articles for operational test and evaluation (OT&E). Technical outputs
associated with technical reviews in this phase are addressed later in this chapter.
      •    Updated functional, allocated, and product baselines; verified production processes, and
           verification results/ decisions
      •    Associated technical products including associated design and management decisions
SEP (updated)
                                    Technical Outputs from EMD Phase
 RGC (updated)
PPP (updated)
ESOH analyses
      •    Updated Programmatic Environment, Safety, and Occupational Health Evaluation (PESHE) and
           NEPA/E.O. 12114 Compliance Schedule
      •    Risk assessment identifying mitigation plans for acceptable risks to allow the program to initiate
           production, deployment, and operational test and evaluation activities
      •    Update system of systems (SoS) risks associated with governance, interdependencies, and
           complexity
Manufacturing readiness
 System performance specification (updated if necessary) including verification matrix
 Product baseline
 Other technical information such as models and simulations generated during the EMD phase
 Results of EMD prototyping activities
 Manufacturing prototyping activities support P&D phase
 Post-Critical Design Review (CDR) Assessment (produced by DASD(SE) for Major Defense Acquisition
 Programs)
 Informed advice to APB
      •    Updated will-cost values and affordability targets as documented in the Acquisition Program
           Baseline and Acquisition Strategy
 Establishes technical information that is the basis of the updates to the Cost Analysis Requirements
 Description (CARD) and manpower estimates
 Informed advice to Affordability Assessment
      •    Should-Cost goals updated to achieve efficiencies and control unproductive expenses without
           sacrificing sound investment in product affordability.
      •    Value engineering results, as appropriate. See DAG section 4.3.19.3. Value Engineering.
      •    See DAG Chapter 3 Affordability and Life-Cycle Resource Estimates
Manufacturing, performance, and quality metrics critical to program success are identified and tracked
Production budget/cost model validated and resources considered sufficient to support LRIP and FRP
 Informed advice to LCSP (updated)
ISP of Record
• See DAG Chapter 7 Acquiring Information Technology, Including National Security Systems
• Informed advice including system specification, SOW, CDRLs, and source selection criteria
The objective of the Production and Deployment (P&D) phase is to validate the product
design and to deliver the quantity of systems required for full operating capability,
including all enabling system elements and supporting material and services. Systems
engineering (SE) in P&D delivers the final product baseline as validated during
operational testing, and supports deployment and transition of capability to all end
users, the warfighters, and supporting organizations. SE activities (for example, those
addressing the maintenance approach, training, and technical manuals) should be integrated with P&D
phase-specific test and evaluation and logistics and sustainment activities identified in
DAG Chapter 9 Test and Evaluation and Chapter 5 Life-Cycle Logistics, respectively.
This phase typically has several major efforts as shown in Figure 4.2.6.F1: Low-Rate
Initial Production (LRIP), Operational Test and Evaluation (OT&E), Full-Rate Production
(FRP) and Full Deployment (FD), and deployment of capability in support of the Initial
and Full Operational Capabilities. The Full-Rate Production Decision Review (FRP DR)
and/or Full Deployment Decision Review (FD DR) serves as a key decision point
between LRIP (and OT&E) and FRP/FD.
During deployment, units attain Initial Operational Capability (IOC), then Full
Operational Capability (FOC).
Roles and Responsibilities
    •     Update and maintain system certifications and interfaces with external systems,
          as necessary
Inputs
Table 4.2.6.T1 summarizes the primary inputs associated with this systems acquisition
part of the life cycle.
SEP
      •    Updated functional, allocated, and product baselines; verified and validated production
           processes, and validation results / decisions
      •    Updated technical products including associated design and management decisions
      •    PDUSD(AT&L) memorandum, "Document Streamlining - Program Strategies and Systems
           Engineering Plan," April 20, 2011
      •    See DAG section 4.1.2 Systems Engineering Plan
PPP
                                               Inputs for P&D Phase
 Assumptions and constraints
Risk assessment
Manufacturing readiness
Manufacturing, performance, and quality metrics critical to program success are identified and tracked
 Affordability Assessment
LCSP
TEMP
• See DAG Chapter 7 Acquiring Information Technology, Including National Security Systems
 Other analyses
      •    Other prior analytic, prototyping, and/or technology demonstration efforts completed by the
           science and technology (S&T) community; technology insertion/transition can occur at any point
           in the life cycle
Activities
The P&D phase SE activities begin when a favorable Milestone C decision has been
made (see DAG section 4.2.5. Engineering and Manufacturing Development Phase)
and end when FOC is achieved. Figure 4.2.6.F2 provides the end-to-end perspective
and the integration of SE technical reviews and audits across the acquisition life cycle.
    •     Provide technical support to prepare for the Operations and Sustainment (O&S)
          phase, reviewing and providing inputs on the maintenance approach, acquisition
          strategy, training, and technical manuals
    •     Determine root cause of problems, identify corrective actions, and manage to
          completion
    •     Analyze system deficiencies generated during OT&E, acceptance testing,
          production, and deployment
    •     Address problem/failure reports through the use of a comprehensive data
          collection approach like a Failure Reporting, Analysis and Corrective Action
          System (FRACAS)
    •     Manage and control configuration updates (hardware, software, and
          specifications) to the product baseline
    •    Re-verify and validate production configuration
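A FRACAS of the kind referenced above is a closed-loop record: a failure is reported, analyzed to root cause, assigned a corrective action, and closed only after verification. The following sketch illustrates that loop; the field names and records are invented and do not represent a mandated data format.

```python
# Minimal sketch of a closed-loop FRACAS record store: report a failure,
# attach an analysis and corrective action, and close the loop only once
# the fix is verified. Field names and records are illustrative only.

from dataclasses import dataclass
from typing import Optional

@dataclass
class FailureReport:
    report_id: str
    description: str
    root_cause: Optional[str] = None
    corrective_action: Optional[str] = None
    verified_closed: bool = False

class Fracas:
    def __init__(self):
        self._reports = {}

    def report(self, report_id, description):
        self._reports[report_id] = FailureReport(report_id, description)

    def analyze(self, report_id, root_cause, corrective_action):
        r = self._reports[report_id]
        r.root_cause, r.corrective_action = root_cause, corrective_action

    def close(self, report_id):
        # The loop may not be closed without a root cause and corrective action.
        r = self._reports[report_id]
        if r.root_cause is None or r.corrective_action is None:
            raise ValueError("cannot close without analysis and corrective action")
        r.verified_closed = True

    def open_reports(self):
        return [r.report_id for r in self._reports.values() if not r.verified_closed]

f = Fracas()
f.report("FR-001", "Actuator stalled during acceptance test")
f.analyze("FR-001", "Seal swelling from incorrect lubricant", "Change lubricant spec")
f.close("FR-001")
print(f.open_reports())  # []
```

The essential design point is that closure is gated on completed analysis, which is what makes the reporting loop "closed" rather than a simple defect log.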
In both the Production and Deployment and O&S phases the Systems Engineer should
identify and plan for potential obsolescence impacts (i.e., Diminishing Manufacturing
Sources and Material Shortages (DMSMS)). DMSMS problems are an increasing
concern as the service lives of DoD weapon systems are extended and the product life
cycle for high-technology system elements decreases.
The PCA is a SE audit typically conducted in the P&D phase (see DAG section 4.2.16.
Physical Configuration Audit for additional information regarding the PCA).
Test activities during the P&D phase that depend on SE support and involvement
include the Assessment of Operational Test Readiness (AOTR) for MDAPs, Operational
Test Readiness Reviews (OTRRs), initial and follow-on OT&E (IOT&E and FOT&E),
and live-fire test and evaluations (LFT&E), as appropriate (see DAG Chapter 9 Test and
Evaluation). In addition, any corrective actions or design changes implemented in
response to test-identified deficiencies require additional regression testing.
The Systems Engineer, in collaboration with the Chief Developmental Tester, should
identify the technical support needed for operational assessments and document it in the Test
and Evaluation Master Plan (TEMP). Associated SE activities and plans should be in
the SEP (see DAG section 4.1.2. Systems Engineering Plan, 4.2.8. Technical Reviews
and Audits Overview, and DAG Chapter 9 Test and Evaluation).
The technical outputs and products from the P&D phase identified in Table 4.2.6.T2 are
some of the inputs necessary to support SE processes in the O&S phase. They should
support the program’s transition into full operations and sustainment. Technical outputs
associated with technical reviews in this phase are addressed later in this chapter.
                Table 4.2.6.T2. Technical Outputs Associated with P&D Phase
 Validated system
      •    Updated functional, allocated, and product baselines; verified and validated production
           processes, and validation results / decisions
      •    Associated technical products including associated design and management decisions
 PPP (updated)
      •    Updated at FRP DR and/or FD DR
      •    PDUSD(AT&L) memorandum, "Document Streamlining - Program Protection Plan (PPP)," July
           18, 2011
      •    See DAG Chapter 13 Program Protection
 ESOH analyses
      •    Updated Programmatic Environment, Safety, and Occupational Health Evaluation (PESHE) and
           NEPA/EO 12114 Compliance Schedule
Acceptance test data to assess product conformance and to support DD250 of end items
Other technical information such as models and simulations generated during the P&D phase
 Technical information that is the basis of the updates to the Cost Analysis Requirements Description
 (CARD) and manpower estimates
Industrial base capabilities; updated manufacturing processes and supply chain sources
Draft and final RFP(s) for production and SE support to O&S activities
The objective of the Operations and Support (O&S) phase is to execute a support
program that meets operational support performance requirements and sustains the
system in the most cost-effective manner over its total life cycle. Planning for this phase
begins in the Materiel Solution Analysis (MSA) phase, matures through the Technology
Development (TD) and Engineering and Manufacturing Development (EMD) phases,
and is documented in the Life-Cycle Sustainment Plan (LCSP). Systems engineering
(SE) in the O&S phase assesses whether the fielded system and enabling system
elements continue to provide the needed capability in a safe, sustainable, and cost-
effective manner. SE efforts consist of data collection, assessment, and corrective
action cycles to maintain a system’s operational suitability and operational
effectiveness.
Sustainment activities supporting system operations begin in this phase and should
address two major efforts: life-cycle sustainment and disposal. SE efforts during life-
cycle sustainment include Environment, Safety, and Occupational Health (ESOH)
assessments, technology refresh, functionality modification, and life-extension
modifications.
When the system no longer provides an effective or efficient capability to the warfighter,
the Department should make an informed decision to either modify or dispose of it.
However, a related proactive aspect in the Production and Deployment and O&S
phases is engineering analysis to identify potential obsolescence impacts (i.e.,
Diminishing Manufacturing Sources and Material Shortages (DMSMS)). DMSMS
problems are an increasing concern as the service lives of DoD weapon systems are
extended and the product life cycle for high-technology system elements decreases
(see DAG section 4.3.18.8 Diminishing Manufacturing Sources and Material Shortages).
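The obsolescence screening described above can be sketched as a comparison of each system element's projected end-of-production date against the system's planned service life. The part names and dates below are hypothetical; real DMSMS analysis draws on vendor discontinuance notices and obsolescence-forecasting databases rather than a static table:

```python
from datetime import date

# Hypothetical parts list; a real DMSMS screen pulls end-of-production
# dates from vendor notices and obsolescence-forecasting services.
parts = {
    "radar_processor": date(2021, 6, 30),
    "display_panel": date(2035, 1, 1),
    "power_supply": date(2028, 3, 15),
}

def dmsms_risks(parts, planned_retirement):
    """Return parts whose projected end of production precedes the
    system's planned retirement, i.e., potential DMSMS impacts."""
    return sorted(
        name for name, end_of_production in parts.items()
        if end_of_production < planned_retirement
    )

# With a 2030 retirement, the processor and power supply go out of
# production first and warrant mitigation (lifetime buy, redesign, etc.).
at_risk = dmsms_risks(parts, planned_retirement=date(2030, 1, 1))
```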
During the O&S phase, systems engineering activities include:
    •    Refine the maintenance program to minimize total life-cycle cost while achieving
         readiness and sustainability objectives
    •    Assess end-user feedback and conduct engineering investigations as required
    •    Lead teams to translate end-user feedback into corrective action plans and
         recommend technical changes
    •    Develop and implement approved system changes to ensure end-user needs
         continue to be met
    •    Conduct ESOH risk assessments and maintain oversight of critical safety item
         supply chain management
    •    Conduct analysis to identify and mitigate potential obsolescence impacts (i.e.,
         Diminishing Manufacturing Sources and Material Shortages (DMSMS))
    •     Support implementation of follow-on development efforts in response to formal
          decisions to extend the weapon system’s service life (SLEP) or to initiate a major
          modification (may be treated as a stand-alone acquisition program)
    •     Update and maintain system certifications and external SoS interfaces
Inputs
Table 4.2.7.T1 summarizes the primary inputs associated with this sustainment part of
the life cycle.
Table 4.2.7.T1. Inputs for O&S Phase
Risk assessment
Field failures
Other technical information, such as models and simulations generated during the P&D phase
 LCSP
      •    PDUSD(AT&L) memorandum, "Document Streamlining - Life-Cycle Sustainment Plan (LCSP),"
            September 14, 2011
      •    See DAG Chapter 5 Life-Cycle Logistics
 Program Protection Plan (PPP)
      •    PDUSD(AT&L) memorandum, "Document Streamlining - Program Protection Plan (PPP)," July
           18, 2011
      •    See DAG Chapter 13 Program Protection
 Other analyses
      •    End-user feedback and trouble reports
      •    Other prior analytic, prototyping, and/or technology demonstration efforts conducted by the
           science and technology (S&T) community
      •    Technology insertion/transition can occur at any point in the life cycle
Activities
The O&S phase overlaps with the Production and Deployment phase, since O&S
activities begin when the first system is fielded. O&S ends when a system is
demilitarized and disposed of. Figure 4.2.7.F1 provides the end-to-end perspective and
the integration of SE technical reviews and audits across the acquisition life cycle.
SE activities should be integrated with O&S phase-specific test and evaluation and
logistics and sustainment activities identified in DAG Chapter 9 Test and Evaluation and
Chapter 5 Life-Cycle Logistics, respectively. The O&S activities in which the Systems
Engineer should participate include:
    •    Address problem/failure reports through the use of a comprehensive data
         collection approach such as a Failure Reporting, Analysis and Corrective Action
         System (FRACAS)
    •    Process and analyze mission data
    •    Manage preplanned product improvements (P3I)
    •    Develop and implement technology refresh schedules
    •    Conduct technology insertion efforts as needed to maintain or improve system
         performance
    •    Update system safety assessments
    •    Perform engineering analysis to investigate the impact of DMSMS issues
    •    Work with vendors and the general technical community to determine
         opportunities for technology insertion to increase reliability and affordability
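The FRACAS approach named above is, at its core, a closed-loop record-keeping cycle: report a failure, analyze it, and close it only once a corrective action is recorded. The following is a minimal sketch of that cycle; the field names are illustrative, not a standard FRACAS schema:

```python
from dataclasses import dataclass

# Minimal FRACAS-style log: report failures, analyze, and close each
# report with a corrective action. Fields are illustrative only.
@dataclass
class FailureReport:
    item: str
    symptom: str
    root_cause: str = ""
    corrective_action: str = ""
    closed: bool = False

class Fracas:
    def __init__(self):
        self.reports = []

    def report(self, item, symptom):
        r = FailureReport(item, symptom)
        self.reports.append(r)
        return r

    def close(self, r, root_cause, corrective_action):
        # A report closes only after both analysis results are recorded,
        # preserving the data-collection / assessment / corrective-action loop.
        r.root_cause, r.corrective_action = root_cause, corrective_action
        r.closed = True

    def open_reports(self):
        return [r for r in self.reports if not r.closed]

log = Fracas()
r = log.report("actuator", "intermittent stall under load")
log.close(r, "worn bearing", "replace bearing; revise lubrication interval")
```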
The disposal activities in which the Systems Engineer should participate include:
    •    Support demilitarizing and disposing of the system in accordance with all legal
         and regulatory requirements and policy relating to safety (including explosives
         safety), security, and the environment
    •    Document lessons learned
    •    Archive data
The technical review conducted in O&S is the In-Service Review (ISR) (see DAG
section 4.2.17. In-Service Review). ISRs are typically used to track, monitor, and assess
system performance from the time an Initial Operational Capability (IOC) is reached
until retirement or disposal of the system. They are often used to prioritize system
modifications due to deficiencies or integration of additional capability, or to respond to
external needs associated with SoS implementations.
The technical outputs and products identified in Table 4.2.7.T2 are necessary to support
SE processes to sustain the system, including modifications. Technical outputs
associated with technical reviews in this phase are addressed later in this chapter.
                                    Table 4.2.7.T2. Technical Outputs from O&S Phase
 Engineering Change Proposal (ECP) packages
For DoD weapon systems development, a properly tailored series of technical reviews
and audits provides key points throughout the life cycle to evaluate significant
achievements and assess technical maturity and risk. DoDI 5000.02, Enclosure 4
presents the statutory, regulatory, and milestone requirements for acquisition programs.
Properly aligning the technical reviews to support knowledge-based milestone decisions
streamlines the acquisition life cycle and saves taxpayer dollars. As a companion
to DoDI 5000.02, see the OUSD(AT&L) memorandum, "Expected Business Practice:
Document Streamlining - Program Strategies and Systems Engineering Plan" dated
April 20, 2011.
Technical reviews and audits allow the Program Manager and Systems Engineer to
jointly define and control the program’s technical effort by establishing the success
criteria for each review and audit. A well-defined program facilitates effective monitoring
and control through increasingly mature points (see Technical Maturity Point table in
DAG section 4.2.1. Life-Cycle Expectations).
Technical reviews of program progress should be event driven and conducted when the
system under development meets the review entrance criteria as documented in the
SEP. Systems engineering (SE) is an event-driven process based on successful
completion of key events as opposed to arbitrary calendar dates. As such, the SEP
should discuss the timing of events in relation to other SE and program events. While
the initial SEP and Integrated Master Schedule reflect the expected timing of various
milestones (such as the overall system CDR), the plan should accommodate and be
updated to reflect changes to the actual timing of SE activities, reviews, and
decisions.
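The event-driven principle above can be expressed as a simple gate: a review begins only when every entrance criterion is satisfied, regardless of the planned calendar date. The criteria names below are hypothetical placeholders for the SEP-documented entrance criteria:

```python
# Event-driven gating: a review is entered on evidence, not on a date.
# The criteria here are illustrative stand-ins for SEP entrance criteria.
entrance_criteria = {
    "preliminary design documented": True,
    "risk assessments current": True,
    "prior review action items closed": False,
}

def ready_for_review(criteria):
    """True only when every entrance criterion is satisfied."""
    return all(criteria.values())

def unmet(criteria):
    """List the criteria that still block the review, for status reporting."""
    return sorted(name for name, met in criteria.items() if not met)

# Here the review slips until the blocking event (action-item closure)
# occurs, rather than being held on an arbitrary planned date.
```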
Figure 4.2.8.F1 provides the end-to-end perspective and the integration of SE technical
reviews and audits across the acquisition life cycle.
Properly structured, technical reviews and audits support the Defense Acquisition
System throughout the acquisition life cycle.
Figure 4.2.8.F1 illustrates the notional sequence of technical reviews and audits. It also
provides typical timing associated with the acquisition phases. Technical reviews should
occur when the requisite knowledge is expected and required. The guidance provided in
DAG sections 4.2.9. through 4.2.17. defines the entrance and exit criteria for the level of
maturity expected at each technical review and audit. OSD established the expected
reviews and audits for each acquisition phase in the outline for the Systems Engineering
Plan (SEP). These policy and guidance documents provide a starting point for the
Program Manager and Systems Engineer to develop the program’s unique set of
technical reviews and audits. Tailoring is expected to best suit the program objectives
(see DAG section 4.1. Introduction). The SEP captures the output of this tailoring and is
reviewed and approved to solidify the program plan.
Programs that tailor the timing and scope of these technical reviews and audits to
satisfy program objectives increase the probability of successfully delivering required
capability to the warfighter. Technical reviews provide the forum to frame important
issues and define options necessary to balance risk in support of continued
development.
The technical baseline (including the functional, allocated, and product baselines)
established at the conclusion of certain technical reviews informs all other program
activity. Accurate baselines and disciplined reviews serve to integrate and synchronize
the system as it matures, which facilitates more effective milestone decisions and
ultimately provides better warfighting capability for less money. The technical baseline
provides an accurate and controlled basis for:
    •    Managing change
    •    Cost estimates, which inform the PPBE process and ultimately the Acquisition
         Program Baseline (APB)
    •    Program technical plans and schedules, which also inform the APB
    •    Contracting activity
    •    Test and Evaluation efforts
    •    Risk analysis and risk balancing
    •    Reports to acquisition executives and Congress
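The controlled-baseline idea underlying the list above can be sketched as a versioned set of configuration items that changes only through an approved change record, preserving an audit trail. The class, item identifiers, and change numbers below are all hypothetical:

```python
# Minimal sketch of a controlled technical baseline: items change only
# through an approved change record, leaving an auditable history.
# Names and identifiers are illustrative, not a DoD-mandated scheme.
class Baseline:
    def __init__(self, name, items):
        self.name = name
        self.items = dict(items)   # e.g., document id -> revision letter
        self.history = []          # applied change identifiers, in order

    def apply_change(self, change_id, updates, approved):
        """Apply an update set only if the change was approved by the
        configuration control authority; otherwise reject it."""
        if not approved:
            raise ValueError(f"{change_id} not approved by the control board")
        self.items.update(updates)
        self.history.append(change_id)

allocated = Baseline("allocated", {"ICD-001": "A", "SSS-001": "A"})
allocated.apply_change("ECP-042", {"SSS-001": "B"}, approved=True)
```

An unapproved change raises an error rather than silently altering the baseline, which mirrors the role of the configuration steering board in the text.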
The Program Manager and the Systems Engineer need to keep in mind that technical
reviews and audits provide visibility into the quality and completeness of the developer’s
work products. These requirements should be captured in the contract specifications or
Statement of Work. The program office should consider delivering the SEP with the
Request for Proposal (RFP) and having it captured in the contractor’s SE Management
Plan (SEMP); this best practice also should include delineating entrance criteria and
associated design data requirements needed to support the reviews. The configuration
and technical data management plans should clearly define the audit requirements.
For complex systems, reviews and audits may be conducted for one or more system
elements depending on the interdependencies involved. These incremental system
element-level reviews lead to an overall system-level review or audit (e.g., PDR, CDR,
or PRR). After all incremental reviews are complete, an overall summary review is
conducted to provide an integrated system analysis and capability assessment that
could not be conducted by a single incremental review. Each incremental review should
complete a functional or physical area of design. This completed area of design may
need to be reopened if other system elements drive additional changes in this area. If
the schedule is being preserved through parallel design and build decisions, any system
deficiency that leads to reopening design may result in rework and possible material
scrap.
Test readiness reviews (TRR) are used to assess a contractor’s readiness for testing
configuration items, including hardware and software. They typically involve a review of
earlier or lower-level test products and test results from completed tests and a look
forward to verify the test resources, test cases, test scenarios, test scripts, environment,
and test data have been prepared for the next test activity. TRRs typically occur in the
EMD and P&D phase of a program.
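The look-forward portion of a TRR can be sketched as verifying that every planned test case has its supporting artifacts in place before testing begins. The artifact categories and test-case names below are assumptions for illustration, not a mandated list:

```python
# TRR-style readiness check: confirm each planned test case has its
# supporting artifacts. Categories and cases are illustrative only.
REQUIRED = {"procedure", "scripts", "environment", "test_data"}

test_cases = {
    "TC-001 navigation accuracy": {"procedure", "scripts", "environment", "test_data"},
    "TC-002 comms link margin": {"procedure", "environment"},
}

def trr_gaps(cases):
    """Map each not-yet-ready test case to its missing artifact categories."""
    return {name: sorted(REQUIRED - have)
            for name, have in cases.items() if REQUIRED - have}

# TC-001 is fully prepared; TC-002 still lacks scripts and test data,
# so the TRR would not declare readiness for that test activity.
gaps = trr_gaps(test_cases)
```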
For each technical review, a technical review chair is identified and is responsible for
evaluating products, determining that the criteria are met, and determining that action
items are closed. The Service chooses the technical review chair, who could be the Program
Manager, Systems Engineer, or independent subject matter expert selected according
to the Service’s guidance. This guidance may identify roles and responsibilities
associated with technical reviews and audits. It also may specify the types of design
artifacts required for various technical reviews. In the absence of additional guidance,
each program should develop and document its tailored design review plan in the SEP.
The following notional duties and responsibilities associated with the Program Manager
and Systems Engineer should be considered in the absence of specific Service or lower
level (e.g., System Command or Program Executive Officer) guidance:
Program Manager:
    •    Co-developing with the Systems Engineer the technical objectives of the program
         that guide the technical reviews and audits
    •    Co-developing with the Systems Engineer the earned value credit derived from
         the review
    •    Approving, funding, and staffing the planned technical reviews and audits;
         documenting this plan in the SEP and applicable contract documents
    •    Ensuring the plan includes independent subject matter experts to participate in
         each review (maintaining objectivity during these reviews with respect to
         satisfying the pre-established review criteria)
    •    Ensuring the plan provides timely and sufficient data to satisfy the statutory and
         regulatory requirements of DoDI 5000.02
    •    Controlling the configuration of each baseline and convening configuration
         steering boards when user requirement changes are warranted. This can lead to
         an unscheduled gateway into the Functional Capabilities Board (FCB) and JCIDS
         process not identified in Figure 4.2.8.F1 above.
Systems Engineer:
    •    Co-developing with the Program Manager the technical objectives of the program
         that guide the technical reviews and audits
    •    Developing and documenting the technical review and audit plan in the SEP,
         carefully tailoring each event to satisfy program objectives and SEP outline
         guidance associated with the minimum technical reviews and audits. Technical
         review checklists are available on the DASD(SE) website.
    •    Ensuring the plan is event based with pre-established review criteria for each
         event, informed by the knowledge point objectives in Table 4.2.1.T1
    •    Identifying the resources required to support the plan, paying particular attention
         to the importance of the integrating activity leading up to the official review and
         audit. See Figure 4.2.8.F2.
    •    Ensuring technical reviews and audits are incorporated into the IMP and IMS
    •    Coordinating with Chief Development Tester to provide at each technical review:
         reliability growth progress to plan/assessments, DT&E activities to-date, planned
         activities, assessments to-date, and risk areas
    •    Ensuring a status of applicable design considerations is provided at each
         technical review
    •    Establishing technical reviews and audits and their review criteria in the
         applicable contract documents (e.g., Statement of Work, IMP)
    •    Monitoring and controlling execution of the established plans
    •    Coordinating with the appointed technical review chairperson on the technical
         review plans and supporting execution of the technical reviews
    •    Assigning responsibilities for closure actions and recommending to the chairperson
         and Program Manager when a system technical review should be considered
         complete (see Figure 4.2.8.F2)
The Program Manager and Systems Engineer should identify key stakeholders who
have an interest or role in the review, which may include:
    •    Contracting Officer
    •    Defense Contract Management Agency (DCMA) and Defense Contract Audit
         Agency (DCAA)
    •    Product Support Manager
    •    Product Improvement Manager/Requirements Officer
    •    End-user Community
    •    Chief Developmental Tester
    •    Interdependent Acquisition Programs
    •    Business Financial Manager
    •    Deputy Assistant Secretary of Defense for Systems Engineering (DASD(SE))
    •    Service Technical Leadership such as chief engineers
    •    Independent Subject Matter Experts
Review Criteria
Specific review criteria are provided in each technical review and audit section below.
These criteria should be achieved and all action items closed before a technical review
is considered complete. The Systems Engineer may want to consider the technical
review-specific checklists available at DAU’s website as a resource.
Contract incentives are frequently tied to completion of technical reviews. The developer
may have a strong incentive to call the review complete as soon as possible. The
review chairperson and Systems Engineer should exercise best judgment in an
objective, informed manner to ensure the reviews are not prematurely declared
complete.
The Alternative Systems Review (ASR) is held to support a dialogue between the end
user and acquisition community and leads to a draft performance specification for the
preferred materiel solution. The ASR typically occurs during the Materiel Solution
Analysis (MSA) phase, after completion of the Analysis of Alternatives (AoA) and before
Milestone A. It focuses technical efforts on requirements analysis.
The ASR should evaluate whether there is sufficient understanding of the technical
maturity, feasibility, and risk of the preferred materiel solution, in terms of addressing
the operational capability needs in the Initial Capabilities Document (ICD) and meeting
affordability, technology, and operational effectiveness and suitability goals.
The ASR helps the Program Manager and Systems Engineer ensure that further
engineering and technical analysis needed to draft the system performance
specification is consistent with customer needs.
CJCSI 3170.01 calls for a Functional Capabilities Board (FCB) review prior to Milestone
A. This FCB review should ensure compatibility between the operational capability
needs in the ICD and the maturity, feasibility, and risks of the preferred materiel
solution.
The ASR typically occurs after the AoA is complete and after a preferred materiel
solution is selected by the lead Service or Component but before the FCB review.
Figure 4.2.9.F1 provides the end-to-end perspective and the integration of SE technical
reviews and audits across the acquisition life cycle.
                    Figure 4.2.9.F1. Weapon System Development Life Cycle
This timing allows the focus of the ASR to be on the preferred materiel solution rather
than on all the alternatives and allows for some post-AoA technical analysis to be
completed and inform the FCB deliberations.
    •    The AoA results are an input to the ASR. The AoA should have evaluated a
         number of candidate materiel solutions and identified those alternatives that can
         meet the user requirements within the remaining trade space (including cost and
         affordability constraints).
    •    After the AoA is complete, the operational requirements community and the
         acquisition community collaboratively identify one or more preferred materiel
         solution(s) with the potential to be affordable, operationally effective and suitable,
         sustainable, and technically and technologically achievable (i.e., able to provide a
         timely solution to the stated operational capability need at an acceptable level of
         risk). This preferred materiel solution is also an input to the ASR.
    •    The draft concept of operations (CONOPS) should be available as an input to the
         ASR. It should have been available for use in the AoA and can then be used to
         support development of missions and operational scenarios used to evaluate the
         preferred materiel solution.
Table 4.2.9.T1 defines the suggested ASR artifacts and associated review criteria. The
review should not begin until these criteria are considered met. This is a best practice
review.
                                Table 4.2.9.T1. ASR Products and Criteria
The Technical Review Chair determines when the review is complete. ASR technical
outputs should include, but not be limited to, the following products, including supporting
rationale and trade study results:
    •    Informed advice to the user-developed draft Capability Development Document
         (CDD) required at Milestone A
    •    Draft system performance specification requirements that:
         •    Are consistent with the preferred materiel solution (including its support
              concept) from the Materiel Solution Analysis (MSA) phase
         •    Are consistent with available technologies resulting from the prototyping efforts
         •    Adequately consider the maturity of interdependent systems
All system requirements and performance requirements derived from the Initial
Capabilities Document (ICD) or draft Capability Development Document (CDD) should
be defined and consistent with cost, schedule, risk, and other system constraints; and
with end user expectations. Also important to this review is a mutual understanding
(between the program office and the developer) of the technical risk inherent in the
system performance specification.
In preparation for the System Requirements Review (SRR), Program Manager and
Systems Engineer responsibilities include:
    •    Approve, fund, and staff the SRR as planned in the SEP developed by the
         Systems Engineer
    •    Manage and approve changes to the system performance specification
    •    Establish the plan to SFR in applicable contract documents including the SE
         Master Plan, Integrated Master Schedule (IMS), and Integrated Master Plan
         (IMP)
    •    Ensure the plan includes independent subject matter experts to participate in
         each review
    •    Ensure all performance requirements, both explicit and derived, are defined and
         traceable (both directions) between requirements in the draft CDD including Key
         Performance Parameters (KPPs), Key System Attributes (KSAs), other system
         attributes, and the system performance specification (see CJCSI 3170.01 JCIDS)
    •    Ensure verification methods are identified for all system requirements
    •    Ensure risk items associated with system requirements are identified and
         analyzed, and mitigation plans are in place
    •    Ensure adequate plans are in place to complete the technical activities to
         proceed from SRR to the System Functional Review (SFR)
    •    Ensure plans to proceed to SFR allow for contingencies
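The bidirectional traceability called for above can be sketched as a pair of set comparisons: every draft CDD requirement should trace down to at least one specification requirement, and every specification requirement should trace up to a parent. The requirement identifiers below are hypothetical:

```python
# Hypothetical trace links from draft CDD requirements (KPPs/KSAs) to
# system performance specification requirements. Identifiers are made up.
trace = {
    "KPP-1": ["SPS-10", "SPS-11"],
    "KPP-2": ["SPS-12"],
    "KSA-1": [],               # not yet allocated downward
}
spec_reqs = {"SPS-10", "SPS-11", "SPS-12", "SPS-99"}

def untraced_down(trace):
    """CDD requirements with no child specification requirement."""
    return sorted(req for req, children in trace.items() if not children)

def untraced_up(trace, spec_reqs):
    """Specification requirements with no parent CDD requirement (orphans)."""
    parents = {child for children in trace.values() for child in children}
    return sorted(spec_reqs - parents)

# KSA-1 lacks a downward trace; SPS-99 is an orphan derived requirement
# with no documented parent. Both would be flagged before the SRR.
```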
                    Figure 4.2.10.F1. Weapon System Development Life Cycle
 Table 4.2.10.T1 defines the suggested SRR products and associated review criteria.
 The review should not begin until these criteria are established, evidence has been
 received to support a case for success, and any prior technical review is completed and
 its action items closed. This is also an opportunity to assess whether technical
 requirements from all acquisition documentation (e.g., Program Protection Plan (PPP),
 Test and Evaluation Master Plan (TEMP), Reliability, Availability, Maintainability, and
 Cost Rationale (RAM-C) Report) are flowed to specifications. If the program’s TDS
 includes competing contractual efforts, an SRR should be held with each developer. A
 risk assessment tool for SRR preparation is the DoD SRR Checklist. This is a best
 practice review.
Product: System Performance Specification

SRR Criteria:
    •    Contractor clearly demonstrates an understanding of the system requirements consistent with the ICD and draft CDD
    •    System requirements are sufficiently detailed and understood to enable system functional definition and functional decomposition
    •    System requirements are assessed to be verifiable (see Chief Developmental Tester in DAG Chapter 9 Test and Evaluation)
    •    Requirements can be met given the technology maturation achieved and evidence from competitive prototyping
    •    External interfaces to the system have been documented in interface control documents
    •    SoS technical interfaces are adequately defined, including interdependencies associated with schedule, test, and configuration changes
    •    Preliminary identification of all software components (tactical, support, deliverable, non-deliverable, etc.) is complete
    •    Human Systems Integration (HSI) and sustainment requirements have been reviewed and included in the overall system design (see DAG section 4.3.18.10 and DAG Chapter 6 Human Systems Integration)
    •    Contractor has adequately expanded the system specification to reflect tailored, derived, and correlated design requirements
    •    Bidirectional requirements traceability between the draft CDD, the Statement of Work (SOW), and the System Specification has been documented
    •    System performance specification is approved, including stakeholder concurrence, with sufficiently conservative requirements to allow for design trade space
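The bidirectional traceability criterion above can be checked mechanically once requirements are held in structured form. The sketch below is illustrative only (the requirement identifiers and link structure are invented, not DAG-prescribed): it flags any specification requirement with no upstream CDD/SOW source and any CDD requirement with no downstream specification coverage.

```python
# Hypothetical traceability check: every specification requirement should
# trace back to at least one CDD or SOW source, and every CDD requirement
# should trace forward to at least one specification requirement.

# Illustrative data: spec requirement -> set of upstream CDD/SOW sources
trace = {
    "SPEC-001": {"CDD-KPP-1"},
    "SPEC-002": {"CDD-KPP-1", "SOW-3.2"},
    "SPEC-003": set(),            # orphan: no upstream source
}
cdd_reqs = {"CDD-KPP-1", "CDD-KPP-2"}  # CDD-KPP-2 has no downstream coverage

def trace_gaps(trace, cdd_reqs):
    covered = set().union(*trace.values()) if trace else set()
    orphans = sorted(r for r, ups in trace.items() if not ups)
    uncovered = sorted(cdd_reqs - covered)
    return orphans, uncovered

orphans, uncovered = trace_gaps(trace, cdd_reqs)
print("Spec requirements with no upstream source:", orphans)
print("CDD requirements with no downstream coverage:", uncovered)
```

In practice this kind of check runs against a requirements management database rather than hand-built dictionaries, but the gap conditions are the same in both directions.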
Product: Technical Plans

SRR Criteria:
    •    Contractor's Systems Engineering Management Plan (SEMP) is complete and adequate
    •    Cost and critical path drivers have been identified
    •    The program schedule is executable with an acceptable level of technical and cost risk
    •    Adequate processes and metrics are in place for the program to succeed
    •    SE is properly staffed
    •    Program is executable within the existing budget
    •    Software functionality in the system specification is consistent with the software sizing estimates and the resource-loaded schedule
    •    Programming languages and architectures, security requirements, and operational and support concepts have been identified
    •    Hazards have been reviewed and mitigating courses of action have been allocated within the overall system design
    •    Key technology elements have been identified, readiness assessed, and maturation plans developed
    •    Software development strategy is complete and adequate
    •    Program technical risks are adequately identified and documented such that there is a clear understanding regarding the contractor's ability to meet the specification requirements
    •    Draft verification methodologies have been adequately defined for each specification requirement
    •    Certifying agencies have been identified and certification requirements are understood
    •    Draft test plans have been developed in support of the TD phase (see Chief Developmental Tester in DAG Chapter 9 Test and Evaluation)
    •    Government and contractor configuration management (CM) strategies are complete and adequate
    •    The Modeling and Simulation (M&S) Plan for life-cycle support (including life-cycle costs / total ownership costs (LCC/TOC), training devices, tactics, air vehicle, mission system, etc.) is complete and adequate to support system design and operation
    •    The manufacturing and production strategy is complete and adequate
    •    Integrated Master Schedule (IMS) adequately identifies the critical path and is resourced at reasonable levels, based on realistic performance/efficiency expectations
    •    Unique work requirements for competitive prototyping have been identified
    •    Product support plan and sustainment concepts have been defined with the corresponding metrics
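The IMS critical-path criterion above can be made concrete: given task durations and dependencies, the critical path is the longest-duration path through the schedule network. The task names, durations, and dependencies below are invented purely for illustration; a real IMS contains thousands of resourced activities.

```python
# Illustrative critical-path computation over a small, acyclic schedule network.
durations = {"A": 4, "B": 2, "C": 5, "D": 3}            # task durations (weeks)
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}  # predecessors

def critical_path(durations, preds):
    memo = {}
    def finish(t):
        # Earliest finish of t = latest predecessor finish + own duration.
        if t not in memo:
            start = max((finish(p) for p in preds[t]), default=0)
            memo[t] = start + durations[t]
        return memo[t]
    end = max(durations, key=finish)
    # Walk backward along the predecessors that set each task's start time.
    path = [end]
    while preds[path[-1]]:
        path.append(max(preds[path[-1]], key=finish))
    return list(reversed(path)), memo[end]

path, length = critical_path(durations, preds)
print(path, length)   # ['A', 'C', 'D'] 12
```

Any task on the returned path is a critical path driver: slipping it slips the program end date, which is why the criterion asks that these drivers be explicitly identified and realistically resourced.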
Output and Products
The Technical Review Chair determines when the review is complete. Once the
products have been reviewed and approved in SRR, they provide a sound technical
basis for proceeding with system functional definition and preliminary design.
The System Functional Review (SFR) is held to evaluate whether the system functional
baseline satisfies the end-user requirements and capability needs and whether
functional requirements and verification methods support achievement of performance
requirements. At completion of the SFR, the system’s functional baseline is normally
taken under configuration control.
A successful SFR, which typically occurs during the Technology Development (TD)
phase, reduces the risk of continuing the technical effort toward the Preliminary Design
Review (PDR). The SFR is used to:
    •    Assess whether a balanced definition of the system’s major elements has been
         developed, including their functionality and performance requirements
    •    Assess whether the system functional baseline is technically achievable with
         regard to cost, schedule, and performance
    •    Confirm that the system performance specification (typically put on contract) is
         realistic and provides a sound technical foundation for preliminary design
    •    Establish the functional baseline and verification criteria to be used during the Functional Configuration Audit (FCA)
In addition, the following notional Program Manager and Systems Engineer duties should be considered in support of the SFR:
    •    Approve, fund, and staff the SFR as planned in the Systems Engineering Plan (SEP) developed by the Systems Engineer
    •    Manage and approve changes to the system performance specification
    •    Establish the plan to PDR in applicable contract documents including the SE
         Management Plan (SEMP), Integrated Master Schedule (IMS), and Integrated
         Master Plan (IMP)
    •    Ensure the plan includes independent subject matter experts to participate in
         each review
    •    Control the configuration of the Government-controlled subset of the functional
         baseline
    •    Chair the configuration control board (CCB) for the system performance
         specification and other documentation used to control the system functional
         baseline
The SFR criteria are developed to best support the program’s technical scope and risk
and are documented in the program’s SEP at Milestone A. Figure 4.2.11.F1 provides
the end-to-end perspective and the integration of SE technical reviews and audits
across the acquisition life cycle.
                   Figure 4.2.11.F1. Weapon System Development Life Cycle
Table 4.2.11.T1 defines the suggested SFR products and associated review criteria.
The review should not begin until these criteria are considered met and any prior
technical review is completed and its action items closed. If the program’s Technology
Development Strategy (TDS) includes competing contractual efforts, an SFR should be
held with each participating developer. A readiness assessment tool for SFR
preparation is the DoD SFR Checklist. This is a best practice review.
Product: Risk Assessment

SFR Criteria:
    •    Identified and documented risks, including ESOH mitigation measure requirements, at levels that warrant continued engineering development
The Technical Review Chair determines when the review is complete. Once the
products have been reviewed and approved in SFR, they provide a sound technical
basis for proceeding into preliminary design.
The Preliminary Design Review (PDR) ensures the preliminary design and basic system
architecture are complete, and that there is technical confidence the capability need can
be satisfied within cost and schedule goals. The PDR provides the acquisition
community, end user, and other stakeholders with an opportunity to understand the
trade studies conducted during the preliminary design, and thus confirm that design
decisions are consistent with the user’s performance and schedule needs prior to formal
validation of the Capability Development Document (CDD). The PDR also establishes
the system’s allocated baseline.
The allocated baseline describes the functional and interface characteristics for all
system elements (allocated and derived from the higher-level product structure
hierarchy) and the verification required to demonstrate achievement of those specified
characteristics. The allocated baseline for each lower-level system element (hardware
and software) is usually established and put under configuration control at the system
element Preliminary Design Review (PDR). This process is repeated for each system
element and culminates in the Program Manager establishing the complete allocated
baseline at the system-level PDR. The Program Manager then verifies the allocated
baseline at the Functional Configuration Audit (FCA) and/or System Verification Review
(SVR) (see DAG section 4.3.7 Configuration Management Process).
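The baseline lifecycle described above, in which each baseline is established at its review, placed under configuration control, and thereafter changed only through formal change approval, can be pictured as a simple state model. This is an illustrative sketch only, not a prescribed CM implementation; the class names, fields, and exception choices are invented.

```python
# Illustrative model of baseline establishment and change control.
# Once a baseline is established at its review, direct edits are rejected;
# changes must come through an approved change request (a CCB action).
class Baseline:
    def __init__(self, name, review):
        self.name, self.review = name, review   # e.g., "allocated", "PDR"
        self.items, self.established = {}, False

    def add(self, item_id, content):
        if self.established:
            raise RuntimeError(f"{self.name} baseline is under configuration control")
        self.items[item_id] = content

    def establish(self):
        self.established = True                  # frozen at review completion

    def apply_change(self, item_id, content, ccb_approved):
        if not ccb_approved:
            raise PermissionError("change requires CCB approval")
        self.items[item_id] = content

allocated = Baseline("allocated", "PDR")
allocated.add("SPEC-001", "rev A")
allocated.establish()                            # system-level PDR complete
allocated.apply_change("SPEC-001", "rev B", ccb_approved=True)
print(allocated.items["SPEC-001"])               # rev B
```

The point of the model is the asymmetry: before establishment the baseline is a working set, while after establishment every change leaves an auditable approval trail, which is what the later FCA/SVR verification depends on.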
The PDR is mandatory. According to DoD policy and guidance, PDR requirements
include the following:
         and PDUSD(AT&L) memorandum, "Improving Milestone Process Effectiveness")
    •    PDR Report is provided to the Milestone Decision Authority (MDA) prior to the
         Post-PDR Assessment and should reflect any requirements trades based upon
         the Program Manager’s assessment of cost, schedule, and performance risk (
         DoDI 5000.02)
    •    PDR Report should follow the PDR Report template which prescribes the content
         and responsibilities associated with all PDR completion memos
Any tailoring with respect to establishing an allocated baseline at PDR prior to Milestone
B should be consistent with the approved Technology Development Strategy (TDS) and
documented in the Systems Engineering Plan (SEP). In a competitive environment,
each developer should establish an allocated baseline to meet the definition prescribed
in the Request for Proposal (RFP) and associated system performance specification,
consistent with their individual design approach. Since the functional and allocated
baselines are critical to providing the Engineering and Manufacturing Development
(EMD) bidders with a complete technical package, best practices would dictate that the
PDR be completed prior to the pre-EMD Review, although this timing is optional under
policy. The tailoring strategy may also include conduct of a delta-PDR after Milestone B
if the allocated baseline has changed significantly.
In addition, the PDR represents agreement that the proposed plan to proceed to the
Critical Design Review (CDR) is executable and properly resourced. The PDR
establishes the allocated baseline, which is placed under formal configuration control at
this point. The maximum benefit of the PDR process is realized when the allocated
baseline is complete with the following attributes:
         Review, have been documented in interface control documents
    •    All internal interfaces of the system (system element to system element) have
         been documented in interface control documents
    •    Verification requirements to demonstrate achievement of all specified allocated
         performance characteristics have been documented
    •    Design constraints have been captured and incorporated into the requirements
         and design
Some of the benefits realized from a PDR with the attributes identified above would be
to:
    •    Establish the technical basis for the Cost Analysis Requirements Description
         (CARD), documenting all assumptions and rationale needed to support an
         accurate cost estimate for the APB; technically informed cost estimates enable
         better should cost / will cost management
    •    Establish the technical requirements for the detailed design, EMD contract
         specifications, and Statement of Work; inform the CDD
    •    Establish an accurate basis to quantify risk and identify opportunities
    •    Provide core evidence for the PDR Report
    •    Provide the technical foundation for section 2366b of title 10, United States Code
         certification required for all MDAPs
Some design decisions made at PDR may precipitate discussions with the operational
requirements community because they could have an impact on the CDD. Depending
upon the nature/urgency of the capability required and the current state of the
technology, incremental development might be required. In this case the Sponsor
should document these increments in the CDD and the Program Manager and Systems
Engineer should update relevant program plans.
The Program Manager and Systems Engineer may hold incremental PDRs for lower-
level system elements, culminating with a system-level PDR. The system PDR
assesses the preliminary design as captured in system performance specifications for
the lower-level system elements; it further ensures that documentation for the
preliminary design correctly and completely captures each such specification. The
Program Manager and Systems Engineer evaluate the designs and associated logistics
elements to determine whether they correctly and completely implement all allocated
system requirements, and whether they have maintained traceability to the CDD.
Though many Service systems commands or PEOs define the roles and responsibilities
of the Program Manager and Systems Engineer, the following notional duties and
responsibilities should be considered:
    •    Approve, fund, and staff the system PDR as planned in the SEP developed by
         the Systems Engineer
    •    Establish the plan to CDR in applicable contract documents including the SE
         Management Plan (SEMP), Integrated Master Schedule (IMS), and Integrated
         Master Plan (IMP)
    •    Ensure the SEP includes independent subject matter experts to participate in
         each review
    •    Control the configuration of the Government-controlled subset of the functional
         and allocated baselines; convene Configuration Steering Boards when changes
         are warranted
    •    Submit the PDR Report for approval consistent with the template guidance
    •    Develop and execute the system PDR plans with established quantifiable review
         criteria, carefully tailored to satisfy program objectives
    •    Ensure that the pre-established PDR criteria have been met
    •    Provide industry with an opportunity to participate in this PDR planning (pre-
         contract award is a best practice, where applicable)
    •    Support development of the PDR Report
    •    Ensure assessments and risks associated with all design constraints and
         considerations are conducted, documented, and provided (e.g., reliability and
         maintainability, corrosion, and Environment, Safety, and Occupational Health
         (ESOH) considerations)
    •    Determine the root cause of problems, identify corrective actions, and manage to
         completion
    •    Monitor and control the execution of the PDR closure plans
    •    Document the plan to CDR in the SEP and elsewhere as appropriate
                   Figure 4.2.12.F1. Weapon System Development Life Cycle
The PDR review criteria are developed to best support the program’s technical scope
and risk and are documented in the program’s SEP no later than Milestone A. Table
4.2.12.T1 defines the products and associated review criteria. The system-level PDR
review should not begin until these criteria are considered met and any prior technical
review is complete and its action items closed. A readiness assessment tool for PDR
preparation is the DoD PDR Checklist. The PDR is a mandatory technical review.
Product: System Baseline Documentation (Allocated)

PDR Criteria:
    •    Analysis of system performance is complete and is assessed to meet requirements
    •    Preliminary design satisfies design considerations (see DAG section 4.3.11 Requirements Analysis Process)
    •    Producibility assessments of key technologies are complete
    •    Preliminary system-level design is producible and assessed to be within the production budget
    •    Assessment of the technical effort and design indicates potential for operational test and evaluation success (operationally effective and suitable)
    •    All Critical Safety Items (CSIs) and Critical Application Items (CAIs) are identified
    •    Functional failure mode, effects, and criticality analysis (FMECA) is completed
    •    Estimate of system reliability and maintainability is updated, based on engineering analyses, initial test results, or other sources of demonstrated reliability and maintainability
    •    Computer system and software architecture designs have been established; all Computer Software Configuration Items (CSCIs), Computer Software Components (CSCs), and Computer Software Units (CSUs) have been defined
    •    Software Requirements Specifications (SRSs) and Interface Requirement Specifications (IRSs), including verification plans, are complete and baselined for all CSCs and satisfy the system functional requirements
    •    Interface control documents trace all software interface requirements to the CSCIs and CSUs
    •    Preliminary software design has been defined and captured
    •    All required software-related documents are baselined and delivered
    •    System-allocated baseline documentation is sufficiently complete and correct to enable detailed design to proceed with proper configuration management
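The FMECA criterion above involves ranking failure modes by severity and likelihood so that design attention goes to the worst offenders first. One common (assumed here, not DAG-mandated) scoring convention is a risk priority number (RPN) formed from severity, occurrence, and detection ratings; the failure modes and ratings below are invented examples.

```python
# Illustrative FMECA-style ranking: RPN = severity x occurrence x detection,
# each rated 1-10 (10 = worst). Modes and ratings are invented examples.
modes = [
    # (failure mode, severity, occurrence, detection)
    ("Actuator jam",        9, 3, 4),
    ("Sensor drift",        5, 6, 3),
    ("Connector corrosion", 7, 4, 6),
]

def rank_by_rpn(modes):
    scored = [(name, s * o * d) for name, s, o, d in modes]
    return sorted(scored, key=lambda x: x[1], reverse=True)

for name, rpn in rank_by_rpn(modes):
    print(f"{name}: RPN={rpn}")
```

Note that RPN is only one convention; criticality analyses tied to safety-critical items (the CSIs above) typically weight severity independently rather than letting a high detection score mask a catastrophic failure mode.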
Product: Technical Plans

PDR Criteria:
    •    All entry criteria stated in the contract (e.g., Statement of Work (SOW), SEP, approved SEMP, and system specification) have been satisfied
    •    Integrating activities of any lower-level PDRs have occurred; identified issues are documented in action plans
    •    Plan to CDR is accurately documented in the SEP as well as the IMP and IMS
    •    Program is properly staffed
    •    Adequate processes and metrics are in place for the program to succeed
    •    Program schedule, as depicted in the updated IMS (see DAG section 4.3.2.2 Integrated Master Plan/Integrated Master Schedule), is executable within acceptable technical and cost risks
    •    Program is executable with the existing budget and the approved product baseline
    •    Trade studies and system producibility assessments are under way
    •    Majority of manufacturing processes have been defined, characterized, and documented
    •    Logistics (sustainment) and training systems planning and documentation are sufficiently complete to support the review
    •    Life Cycle Sustainment Plan (LCSP) is approved, including updates on program sustainment development efforts and schedules based on current budgets and firm supportability design features
    •    LCSP includes software support requirements
    •    Long-lead and key supply chain elements are identified
    •    Computer system and software design/development approach has been confirmed through analyses, demonstrations, and prototyping in a relevant environment
    •    Software increments have been defined and capabilities allocated to specific increments
    •    Software trade studies addressing commercial-off-the-shelf, reuse, and other software-related issues are completed
    •    Software development process is defined in a baselined Software Development Plan and reflected in the IMP and IMS
    •    Software development schedules reflect contractor software processes and IMP/IMS software events for current and future development phases
    •    Software development environment and test/integration labs have been established with sufficient fidelity and capacity
    •    Software metrics have been defined and a reporting process has been implemented; metrics are being actively tracked and assessed
    •    TEMP addresses all CSCI plans, test facilities, and test plans, including testing required to support incremental approaches and regression tests
    •    Software development estimates (i.e., size, effort (cost), and schedule) are updated
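Updating software size, effort, and schedule estimates, as the last criterion above requires, is commonly done with parametric models. The Basic COCOMO relationships below are a widely published example used here only for illustration; the coefficients are the classic "organic-mode" values, and the size input is an invented example, not program data.

```python
# Basic COCOMO (organic mode), an illustrative parametric estimate:
#   effort (person-months) = 2.4 * KLOC^1.05
#   schedule (months)      = 2.5 * effort^0.38
kloc = 32.0   # estimated thousands of source lines (example value)

effort = 2.4 * kloc ** 1.05          # person-months
schedule = 2.5 * effort ** 0.38      # calendar months
staff = effort / schedule            # implied average staffing

print(f"Effort:    {effort:.1f} person-months")
print(f"Schedule:  {schedule:.1f} months")
print(f"Avg staff: {staff:.1f} people")
```

The useful check at PDR is consistency: the size estimate, the resulting effort, and the resource-loaded IMS should tell the same story, and a schedule much shorter than the parametric schedule estimate is a risk flag rather than a plan.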
The Technical Review Chair determines when the review is complete. Completion of the
PDR establishes that:
    •    Technical data for the allocated baseline are complete, satisfy the system
         specification, and provide a sufficient foundation for detailed design to proceed
    •    Risks have been balanced and are acceptable with any risk mitigation plans
         approved and documented in the IMS
    •    Feasibility, cost and schedule are determined to be within acceptable risk
         margins
    •    IMS is updated (including systems and software critical path drivers) and
         includes all activities required to complete CDR (assuming same developer
         responsible for PDR and CDR)
    •    Corrective action plans for issues identified in the PDR have been completed
    •    CARD is updated and reflects the design in the allocated baseline
    •    LCSP is updated to reflect development efforts and schedules
The Critical Design Review (CDR) confirms the system design is stable and is expected
to meet system performance requirements, confirms the system is on track to achieve
affordability and should cost goals as evidenced by the detailed design documentation,
and establishes the system’s initial product baseline. The system CDR occurs during
the EMD phase and typically marks the end of the integrated system design efforts and
readiness to continue with system capability and manufacturing process demonstration
activities.
The CDR provides the acquisition community with evidence that the system, down to
the lowest system element level, has a reasonable expectation of satisfying the
requirements of the system performance specification as derived from the Capability
Development Document (CDD) within current cost and schedule constraints.
The CDR establishes the initial product baseline for the system and its constituent
system elements. It also establishes requirements and system interfaces for enabling
system elements such as support equipment, training system, maintenance, and data
systems. At this point the system has reached the necessary level of maturity to start
fabricating, integrating, and testing pre-production articles with acceptable risk.
The product baseline describes the detailed design for production, fielding/deployment,
and operations and support. The product baseline prescribes all necessary physical
(form, fit, and function) characteristics and selected functional characteristics
designated for production acceptance testing and production test requirements. It is
traceable to the system performance requirements contained in the Capability
Development Document (CDD). The initial system element product baseline is
established and placed under configuration control at the system element CDR and
verified later at the Physical Configuration Audit (PCA). In accordance with DoDI
5000.02, the Program Manager assumes control of the initial product baseline for all
Class I configuration changes at the completion of the system level CDR to the extent
that the competitive environment permits. This does not necessarily mean that the
Program Manager takes delivery and acceptance of the Technical Data Package (TDP)
(for more information, see DAG section 4.3.7. Configuration Management Process).
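The traceability described above can be illustrated with a minimal sketch, assuming a simple representation in which each product-baseline item records the CDD-derived requirements it implements (all identifiers below are hypothetical examples, not DAG-prescribed artifacts):

```python
# Illustrative sketch only: flag CDD-derived requirements that no
# product-baseline item traces to. All identifiers are hypothetical.

# CDD-derived system performance requirements (hypothetical IDs).
requirements = {"SYS-001", "SYS-002", "SYS-003"}

# Product-baseline items mapped to the requirements they implement.
baseline_items = {
    "drawing-A100": {"SYS-001"},
    "spec-B200": {"SYS-002", "SYS-003"},
}

def untraced_requirements(requirements, baseline_items):
    """Return requirements with no implementing baseline item."""
    covered = set()
    for reqs in baseline_items.values():
        covered |= reqs
    return requirements - covered
```

An empty result would indicate full coverage; any returned identifiers would be candidates for CDR action items.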
The Systems Engineer documents the approach for the CDR in the Systems
Engineering Plan (SEP). This includes identification of criteria, and artifacts defining the
product baseline.
The Program Manager reviews and approves the approach, ensures the required
resources are available, and recommends independent review participants.
The Program Manager and Systems Engineer may hold incremental CDRs for lower-
level system elements, culminating with a system-level CDR. The system CDR
assesses the final design as captured in system performance specifications for the
lower-level system elements; it further ensures that documentation for the detailed
design correctly and completely captures each such specification. The Program
Manager and Systems Engineer evaluate the detailed designs and associated logistics
elements to determine whether they correctly and completely implement all allocated
system requirements, and whether they have maintained traceability to the CDD.
The unique Program Manager responsibilities associated with a system CDR include:
    •    Approve, fund, and staff the system CDR as planned in the SEP developed by
         the Systems Engineer
    •    Establish the plan to the System Verification Review (SVR) in applicable contract
         documents including the SE Management Plan (SEMP), Integrated Master
         Schedule (IMS), and Integrated Master Plan (IMP)
    •    Ensure the plan includes independent subject matter experts to participate in
         each review
    •    Control the configuration of the Government-controlled subset of the functional,
         allocated, and product baselines; convene Configuration Steering Boards (CSBs)
         when changes are warranted
The unique Systems Engineer responsibilities associated with a system CDR include:
    •    Develop and execute the system CDR plans with established quantifiable review
         criteria, carefully tailored to satisfy program objectives
    •    Ensure that the pre-established review criteria have been met, confirming that
         the design has been captured in the allocated baseline and initial product
         baseline
    •    Ensure assessments and risks associated with all design constraints and
         considerations are conducted, documented, and provided (e.g., reliability and
         maintainability, corrosion, and Environment, Safety, and Occupational Health
         (ESOH) considerations)
    •    Determine the root cause of problems, identify corrective actions, and manage to
         completion
    •    Monitor and control the execution of the CDR closure plans
    •    Document the plan to SVR in the SEP and elsewhere as appropriate
The CDR review criteria are developed to best support the program’s technical scope
and risk and are documented in the program’s SEP no later than Milestone B. Table
4.2.13.T1 defines the products and associated review criteria. The system-level CDR
should not begin until these criteria are considered met and any prior technical
review is complete and its action items closed. A readiness assessment tool for CDR
preparation is the DoD CDR Checklist. The CDR is a mandatory technical review.
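The gating rule above, that the review does not begin until every criterion is met, can be sketched as follows (the criterion names are hypothetical placeholders; a program’s actual criteria come from its SEP and Table 4.2.13.T1):

```python
# Illustrative sketch of a review-readiness gate; criterion names are
# hypothetical placeholders for the program-specific criteria in the SEP.

criteria = {
    "prior technical review complete, action items closed": True,
    "initial product baseline documentation complete": True,
    "FMECA complete": False,
}

def ready_for_review(criteria):
    """The review may begin only when every entry criterion is met."""
    return all(criteria.values())

def open_items(criteria):
    """Criteria still blocking the review, for the closure plan."""
    return sorted(name for name, met in criteria.items() if not met)
```

With the sample data above, the gate reports the review as not ready and lists the single open criterion.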
                       Table 4.2.13.T1. CDR Products and Criteria

Product: System Baseline Documentation (Product)
CDR Criteria:
    •    Key product characteristics having the most impact on system performance,
         assembly, cost, reliability, and sustainment or ESOH have been identified to
         support production decisions
    •    Initial product baseline documentation is sufficiently complete and correct
         to enable hardware fabrication and software coding to proceed with proper
         configuration management
    •    Assessment of the technical effort and design indicates potential for
         operational test and evaluation success (operationally effective and
         suitable) (see DAG Chapter 9 Test and Evaluation)
    •    100% of Critical Safety Items and Critical Application Items have completed
         drawings, specifications, and instructions
    •    Failure mode, effects, and criticality analysis (FMECA) is complete
    •    Estimate of system reliability and maintainability based on engineering
         analyses, initial test results, or other sources of demonstrated reliability
         and maintainability
    •    Detailed design satisfies sustainment and Human Systems Integration (HSI)
         requirements (see DAG Chapter 6 Human Systems Integration)
    •    Software functionality in the approved initial product baseline is consistent
         with the updated software metrics and resource-loaded schedule
    •    Software and interface documents are sufficiently complete to support the
         review
    •    Detailed design is producible and assessed to be within the production budget
    •    Process control plans have been developed for critical manufacturing
         processes
    •    Critical manufacturing processes that affect the key product characteristics
         have been identified, and the capability to meet design tolerances has been
         determined
    •    Verification (developmental test and evaluation (DT&E)) assessment to date is
         consistent with the product baseline and indicates the potential for test and
         evaluation success (see Test and Evaluation Master Plan (TEMP) and Chief
         Developmental Tester in DAG Chapter 9 Test and Evaluation)

Product: Risk Assessment
CDR Criteria:
    •    All risk assessments and risk mitigation plans have been updated,
         documented, formally addressed, and implemented
    •    Test and evaluation strategy defined in the TEMP accounts for risks with a
         mitigation plan; necessary integration and test resources are documented in
         the TEMP, and current availabilities align with the program’s IMS (the
         Systems Engineer coordinates with the Chief Developmental Tester in this
         area; see DAG Chapter 9 Test and Evaluation)
    •    ESOH risks are known and being mitigated

Product: Technical Plans
CDR Criteria:
    •    PDR is successfully completed; all PDR actions are closed
    •    Integrating activities of any lower-level CDRs have occurred; identified
         issues are documented in action plans
    •    All entry criteria stated in the contract (e.g., SOW, SEP, approved SEMP, and
         system specification) have been satisfied
    •    Adequate processes and metrics are in place for the program to succeed
    •    Program schedule as depicted in the updated IMS (see DAG section 4.3.2.2.
         Integrated Master Plan/Integrated Master Schedule) is executable (within
         acceptable technical/cost risks)
    •    Program is properly staffed
    •    Program is executable with the existing budget and the approved initial
         product baseline
    •    Detailed trade studies and system producibility assessments are under way
    •    Materials and tooling are available to meet the pilot line schedule
    •    Logistics (sustainment) and training systems planning and documentation are
         sufficiently complete to support the review
    •    Life-Cycle Sustainment Plan (LCSP), including updates on program sustainment
         development efforts and schedules based on current budgets, test and
         evaluation results, and firm supportability design features, is approved
    •    Long-lead procurement plans are in place; supply chain assessments are
         complete
The Technical Review Chair determines when the review is complete. Completion of the
CDR should provide the following:
Note that baselines for some supporting items might not be at the detailed level and
may lag the system-level CDR, since enabling systems may be on different life-cycle
timelines. The CDR agenda should include a review of the status of these enabling
systems, but any statement that all detailed design activity on them is complete may
lead to misunderstandings. As an example, development of simulators and other
training systems tends to lag weapon system development.
The System Verification Review (SVR) is the technical assessment point at which the
actual system performance is verified to meet the requirements in the system
performance specification and is documented in the system functional baseline. The
Functional Configuration Audit (FCA) is the technical audit during which the actual
performance of a system element is verified and documented to meet the requirements
in the system element performance specification in the allocated baseline. Further
information on FCA can be found in MIL-HDBK-61A. SVR and FCA are sometimes
used synonymously when the FCA is at the system level. The SVR/FCA typically occurs
during the Engineering and Manufacturing Development (EMD) phase.
When a full-up system prototype is not part of the program’s acquisition strategy, the
FCA is used to validate system element functionality. Other system-level analysis is
then used to ascertain whether program risk warrants proceeding to system initial
production for Operational Test and Evaluation (OT&E). Verification of system
performance is later accomplished on a production system.
A successful SVR/FCA reduces the risk when proceeding into initial production for the
system to be used in operational test and evaluation (OT&E).
The unique Program Manager responsibilities associated with an SVR/FCA include:
    •    Approve, fund, and staff the SVR/FCA as planned in the Systems Engineering
         Plan (SEP) developed by the Systems Engineer
    •    Establish the plan to the Production Readiness Review (PRR) in applicable
         contract documents including the SE Management Plan (SEMP), Integrated
         Master Schedule (IMS), and Integrated Master Plan (IMP)
    •    Ensure the SEP includes independent subject matter experts to participate in
         each technical review/audit
    •    Continue to control Class I changes to the system product baseline (see DAG
         section 4.3.7. Configuration Management Process)
The unique Systems Engineer responsibilities associated with an SVR/FCA include:
    •    Develop and execute the SVR/FCA plans with established quantifiable review
         criteria, carefully tailored to satisfy program objectives
    •    Ensure the pre-established technical review/audit criteria have been met
    •    Ensure all requirements in the system performance specification have been
         verified through the appropriate verification method and have been appropriately
         documented
    •    Ensure technical risk items associated with the verified system product baseline
         are identified and analyzed, and mitigation plans are in place
    •    Monitor and control the execution of the SVR/FCA closure plans
    •    Ensure adequate plans and resources are in place to accomplish the necessary
         technical activities between SVR, PRR and Physical Configuration Audit (PCA);
         these plans should allow for contingencies
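The verification-closure responsibility above can be sketched with a hypothetical requirements verification matrix, in which each system performance requirement carries an assigned verification method and a documented status (the IDs and method assignments below are illustrative only):

```python
# Illustrative sketch of an SVR/FCA verification-closure check.
# Requirement IDs and method assignments are hypothetical examples;
# inspection, analysis, demonstration, and test are the verification
# methods commonly used in systems engineering practice.

verification_matrix = [
    # (requirement ID, verification method, verified and documented?)
    ("SYS-001", "test", True),
    ("SYS-002", "analysis", True),
    ("SYS-003", "demonstration", False),
]

def unverified(matrix):
    """Requirements not yet verified through their assigned method."""
    return [req for req, _method, done in matrix if not done]
```

A non-empty result would indicate requirements that block SVR/FCA closure until verification is completed and documented.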
The SVR/FCA criteria are developed to best support the program’s technical scope and
risk and are documented in the program’s SEP no later than Milestone B. Table
4.2.14.T1 defines the suggested SVR/FCA products and associated review criteria. The
review should not begin until these criteria are considered met and any prior technical
review is complete and its action items closed. A readiness assessment tool for SVR
preparation is the DoD SVR Checklist. This is a best practice review.
                           Table 4.2.14.T1. SVR/FCA Products and Criteria
The Technical Review Chair determines when the review is complete. Once the
products have been reviewed and approved in SVR/FCA, they provide a sound
technical basis for proceeding into initial production for the system to be used in OT&E.
The Production Readiness Review (PRR) for the system determines whether the
system design is ready for production, and whether the developer has accomplished
adequate production planning for entering Low-Rate Initial Production (LRIP) and Full-
Rate Production (FRP). Production readiness increases over time with incremental
assessments accomplished at various points in the life cycle of a program.
In the early stages, production readiness assessments should focus on high-level
manufacturing concerns, such as identifying high-risk or low-yield manufacturing
processes and materials, or the need for manufacturing development efforts to satisfy
design requirements. As the system design matures, the
assessments should focus on adequate production planning, facilities allocation,
producibility changes, identification and fabrication of tools and test equipment, and
long-lead items. The system PRR, held prior to Milestone C, should provide evidence
that the system can be produced with low risk and no breaches in cost, schedule, and
performance thresholds. See also the System Capability and Manufacturing Process
Demonstration described in DoDI 5000.02 Enclosure 2 paragraph 6.c.(6)(d).
For complex systems, a PRR may be conducted for one or more system elements. In
addition, periodic production readiness assessments should be conducted during the
Engineering and Manufacturing Development phase to identify and mitigate risks as the
design progresses. The incremental reviews lead to an overall system PRR. See DAG
section 4.2.8. Technical Reviews and Audits Overview for more on this incremental
approach.
The unique Program Manager responsibilities associated with a system PRR include:
    •    Approve, fund, and staff the PRR as planned in the Systems Engineering Plan
         (SEP) developed by the Systems Engineer
    •    Establish the plan to Physical Configuration Audit (PCA) in applicable contract
         documents including the SE Management Plan (SEMP), Integrated Master
         Schedule (IMS), and Integrated Master Plan (IMP)
    •    Ensure the plan includes independent subject matter experts to participate in
         each review
    •    Determine whether the readiness of manufacturing processes, the quality
         management system, and production planning (e.g., facilities, tooling and
         test equipment capacity, personnel development and certification, process
         documentation, inventory management, and supplier management) provides
         low-risk assurance of supporting LRIP and FRP
    •    Continue to control Class I changes to the system product baseline (see DAG
         section 4.3.7. Configuration Management Process)
The unique Systems Engineer responsibilities associated with a system PRR include:
    •    Develop and execute the PRR plans with established quantifiable review criteria,
         carefully tailored to satisfy program objectives
    •    Ensure that the pre-established review criteria have been met, confirming
         that the production capability forms a satisfactory, affordable, and
         sustainable basis for proceeding into LRIP and FRP
    •    Advise the Program Manager on whether production capability forms a
         satisfactory, affordable, and sustainable basis for proceeding into LRIP and FRP
    •    Ensure adequate plans and resources are in place to proceed from PRR to PCA
         and FRP Decision Review (DR)
    •    Ensure plans to proceed to PCA and FRP DR allow for contingencies
    •    Ensure production implementation supports overall performance and
         maintainability requirements
    •    Monitor and control the execution of the PRR closure plans
The PRR criteria are developed to best support the program’s technical scope and risk
and are documented in the program’s SEP no later than Milestone B. Table 4.2.15.T1
defines the suggested PRR products and associated review criteria. The review should
not begin until these criteria are considered met and any prior technical review is
completed and its action items closed. A readiness assessment tool for PRR
preparation is the DoD PRR Checklist. This is a best practice review.
                       Table 4.2.15.T1. PRR Products and Criteria

Product: Risk Assessment
PRR Criteria:
    •    Producibility trade studies and risk assessments are completed
    •    Manufacturing, production, and quality risks are identified, and a plan
         exists to mitigate those risks
    •    Environment, Safety, and Occupational Health (ESOH) risks are known and
         mitigated
A follow-on PRR may be appropriate in the Production and Deployment (PD) phase for
the prime contractor and major subcontractors if:
The PRR is designed as a system-level preparation tool and should be used for
assessing risk as the system transitions from development to FRP. For more
information see the approaches described in DAG section 4.3.18.18. Producibility,
Quality, and Manufacturing Readiness.
Outputs and Products
The Technical Review Chair determines when the review is complete. Results of the
PRR and associated manufacturing readiness assessments are typically documented in
a written report or out-brief. The results should be reported based on the criteria
documented in the SEP, using the PRR checklist. The Manufacturing Readiness Level
Deskbook is another source of information and may be used as appropriate.
The Physical Configuration Audit (PCA) is a formal examination to verify the "to be
fielded" configuration of a validated system against its design and manufacturing
documentation. The objective of the PCA is to resolve any discrepancies between the
production-representative item that has successfully passed Operational Test and
Evaluation (OT&E) and the associated documentation currently under configuration
control. A successful PCA provides the Milestone Decision Authority (MDA) with
evidence that the product design is stable, the capability meets end-user needs, and
production risks are acceptably low. At the conclusion of the PCA, the final product
baseline is established and all subsequent changes are processed by formal
engineering change action. Further information can be found in MIL-HDBK-61A.
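The discrepancy resolution described above can be sketched as a comparison of the as-built configuration against the documented product baseline (part names and revision letters below are hypothetical examples):

```python
# Illustrative sketch of a PCA-style discrepancy check: compare as-built
# part revisions against the documented product baseline. All part names
# and revisions are hypothetical.

documented = {"pump": "rev-B", "valve": "rev-A", "sensor": "rev-C"}
as_built = {"pump": "rev-B", "valve": "rev-B", "sensor": "rev-C"}

def discrepancies(documented, as_built):
    """Parts whose as-built revision differs from the baseline (or is
    missing on either side), as {part: (documented, as_built)}."""
    parts = documented.keys() | as_built.keys()
    return {p: (documented.get(p), as_built.get(p))
            for p in parts
            if documented.get(p) != as_built.get(p)}
```

Each entry returned would need to be resolved, by correcting either the article or the documentation, before the final product baseline is established.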
The PCA is an event-driven technical assessment and typically occurs during the
Production and Deployment (P&D) phase, after successful system validation but prior to
the Full-Rate Production Decision Review (FRP DR). A PCA delayed until FRP may miss
the opportunity to catch costly defects before they are built into production
articles. While the system-level PCA typically occurs before the FRP DR, other system
element PCAs may be conducted at various points in advance of the system-level PCA.
Roles and Responsibilities
The unique Program Manager responsibilities associated with a system PCA include:
    •    Approve, fund, and staff the PCA as planned in the Systems Engineering Plan
         (SEP) developed by the Systems Engineer
    •    Establish the plan to FRP DR in applicable contract documents including the SE
         Management Plan (SEMP), Integrated Master Schedule (IMS), and Integrated
         Master Plan (IMP)
    •    Ensure the plan includes independent subject matter experts to participate in
         each review
    •    Determine whether the readiness of manufacturing processes, the quality
         management system, and production planning (e.g., facilities, tooling and
         test equipment capacity, personnel development and certification, process
         documentation, inventory management, and supplier management) provides
         low-risk assurance of supporting FRP
    •    Continue to control Class I changes to the system product baseline (see DAG
         section 4.3.7. Configuration Management Process)
The unique Systems Engineer responsibilities associated with a system PCA include:
    •    Develop and execute the PCA plans with established quantifiable review criteria,
         carefully tailored to satisfy program objectives
    •    Coordinate with configuration management and manufacturing SMEs and the
         production contractor/production facility to develop an efficient approach to the
         PCA
    •    Identify method(s) of examining the production-representative item (e.g.,
         disassembly, inspection, and reassembly) and verify the item against related
         design documentation
    •    Ensure that the pre-established review criteria have been met to ensure the
         production capability forms a satisfactory, affordable, and sustainable basis for
         proceeding with FRP
    •    Advise the Program Manager on whether production capability forms a
         satisfactory, affordable, and sustainable basis for proceeding into FRP
    •    Ensure adequate plans and resources are in place to proceed from PCA to Full
         Operational Capability (FOC)
    •    Ensure plans to proceed to FOC allow for contingencies
    •    Ensure production implementation supports overall performance and
         maintainability requirements
    •    Monitor and control the execution of the PCA closure plans
When the program does not plan to control the detailed design or purchase the item’s
technical data, the developer should conduct an internal PCA to define the starting point
for controlling the detailed design of the item and establishing a product baseline.
Inputs and Audit Criteria
Figure 4.2.16.F1 provides the end-to-end perspective and the integration of SE
technical reviews and audits across the acquisition life cycle.
                   Figure 4.2.16.F1. Weapon System Development Life Cycle
The PCA criteria are developed to best support the program’s technical scope and risk and
are documented in the program’s SEP no later than Milestone C. The PCA is conducted
when these criteria are considered to be met.
Table 4.2.16.T1 defines the suggested PCA products and associated review criteria.
The review should not begin until the Systems Engineer judges that all criteria have
been met. The DoD PCA Checklist can be used for assessing readiness for the
audit. This is a best practice audit.
Outputs and Products
The Technical Review Chair determines when the review is complete. The primary
output of the PCA is a verified product baseline that accurately reflects the validated
system and supports a favorable FRP DR.
The Program Manager establishes the relationships between the developer and the
end-user to facilitate system feedback and assessment. In addition, the Program
Manager determines priorities and approves changes and implementation plans.
The Systems Engineer supports efforts to translate end user feedback into corrective
action plans for possible modifications, technology refresh and/or insertion, Diminishing
Manufacturing Sources and Material Shortages (DMSMS) issues, and other types of
system or system element improvements.
Inputs and Review Criteria
The ISR should not begin until the following entry criteria are considered met:
The Technical Review Chair determines when the review is complete. The ISR should
result in a plan of corrective action for all issues recommended by the Systems
Engineer as warranting resolution. Areas of particular interest usually include:
    •    System problems are categorized and support the operating and support
         requirements determination process
    •    Required budgets (in terms of work years) are established to address all system
         problems in all priority categories
    •    Current levels of system operational risk and system readiness are quantified
         and related to current operations and systems and procurement budgets
    •    Future levels of system operational risk and system readiness are quantified and
         related to future operations and systems and procurement budgets
4.3. Systems Engineering Processes
The systems engineering (SE) processes are used by contractor and Government
organizations to provide a framework and methodology to plan, manage, and implement
technical activities throughout the acquisition life cycle. SE planning and execution
should focus on applying the processes and tools in a rigorous, integrated, and
disciplined manner to achieve a system solution that balances performance, cost,
schedule, and risk. The eight technical management processes provide a consistent
framework for managing technical activities and identifying the technical information and
events critical to the success of the program. The eight technical processes ensure the
system design and the delivered capability reflect the requirements that the
stakeholders have expressed. As a whole, the SE processes provide a systematic
approach focused on providing needed capability to the operational end user.
Successful implementation of the SE processes results in an integrated capability
solution that is:
All organizations performing SE should scale their application and use of these
processes to the type of product or system being developed. This scaling should reflect
the system’s maturity and complexity, size and scope, life-cycle phase, and other
relevant considerations. Disciplined application of the SE processes provides a
technical framework that enables sound decision making, increases product knowledge
and system maturity, and helps reduce risk. The following subsections, as indicated in
Table 4.3.1.T1, discuss the SE processes in more detail.
The Program Manager and Systems Engineer use the technical management
processes as insight and control functions for the overall technical development of the
system throughout the acquisition life cycle. They use the technical processes to
design, create, and analyze the system, system elements, and enabling system
elements required for production, integration, test, deployment, support, operation, and
disposal.
The SE processes, and their constituent activities and tasks, are not meant to be
performed in a particular time-dependent or serial sequence. The Program Manager
and Systems Engineer apply the processes iteratively, recursively, and in parallel (as
applicable) throughout the life cycle to translate identified capability needs into balanced
and integrated system solutions. The Systems Engineer is responsible for developing
the plan and applying the SE processes across the program, monitoring execution
throughout the life cycle, and taking necessary steps to improve process efficiency and
effectiveness.
Table 4.3.1.T2 is a representation of how much effort is typically focused on each of the
SE processes throughout the acquisition life cycle. The Program Manager and Systems
Engineer should apply appropriate resources with requisite skill sets to ensure
successful execution of each process.
Table 4.3.1.T2. Notional Emphasis of Systems Engineering Processes Throughout
                the Defense Weapon System Acquisition Life Cycle
4.3.2. Technical Planning Process
The Technical Planning process includes defining the scope of the technical effort
required to develop, field, and sustain the system, as well as providing critical
quantitative inputs to program planning and life-cycle cost estimates. Technical planning
provides the Program Manager and Systems Engineer with a framework to accomplish
the technical activities that collectively increase product maturity and knowledge and
reduce technical risks. Defining the scope of the technical effort provides:
    •    An accurate basis for program cost and schedule estimates, documented in the
         Independent Cost Estimate (ICE), Cost Analysis Requirements Description
         (CARD), and Acquisition Program Baseline (APB);
    •    A foundation for risk identification and management (see DAG section 4.3.6. Risk
         Management Process);
    •    Quantitative measures supporting the Technical Assessment process (see DAG
         section 4.3.4. Technical Assessment Process) identifying system maturity; and
    •    An accurately constructed and resourced IMS supporting the assignment of
         Earned Value.
The resulting program cost estimates and risk assessments are essential to support
milestone decisions, establish the plan for accomplishing work against which contract
performance is measured, and enable mandatory program certifications (e.g., section
2366a or section 2366b of title 10, United States Code).
Technical planning includes the program’s plan for technical reviews and audits (see
DAG sections 4.2.8. through 4.2.17.). It should also account for resources (skilled
workforce, support equipment/tools, facilities, etc.) necessary to develop, test, produce,
deploy, and sustain the system.
Technical planning should be performed in conjunction with, and address, key elements
and products of all the other SE processes to ensure the program’s technical plan is
comprehensive and coherent. For example, it should be used with the Technical
Assessment process to evaluate the progress and achievements against requirements,
plans, and overall program objectives. If significant variances are detected, this process
includes re-planning as appropriate.
The Program Manager and Systems Engineer should ensure that technical planning
remains current throughout the acquisition life cycle. They should initiate technical
planning activities early in the life cycle prior to the Materiel Development Decision (see
DAG section 4.2.2. Pre-Materiel Development Decision) and during the Materiel
Solution Analysis (MSA) phase (see DAG section 4.2.3. Materiel Solution Analysis
Phase). Beginning in MSA, programs begin to capture their technical planning in the
Systems Engineering Plan (SEP) (see DAG section 4.1.2. Systems Engineering Plan),
which is required at each milestone review from Milestone A to Milestone C. As the
system matures and issues arise throughout the life cycle, the Program Manager and
Systems Engineer should consistently look for root cause(s) and implement corrective
actions in order to enable programmatic and technical success. Modifications to the SE
processes and SEP may be required because of root cause and corrective action
analysis and implementation.
The Program Manager is ultimately responsible for all program plans. The Systems
Engineer is responsible for:
         (SEMP)
    •    Providing key technical inputs and ensuring SEP alignment with other program
         plans (Technology Development Strategy/Acquisition Strategy (TDS/AS), Test
         and Evaluation Strategy/Test and Evaluation Master Plan (TES/TEMP), Life-
         Cycle Sustainment Plan (LCSP), and Programmatic Environment, Safety, and
         Occupational Health Evaluation (PESHE))
Technical Planning should reflect the context of the organization and comply with all
applicable policies. The Program Manager and Systems Engineer should consider all
relevant constraints when identifying technical tasks, sequencing these tasks, and
estimating resources and budgets. Inputs to the technical planning process vary over
time as the program evolves and the system matures. Technical Planning includes the
following activities:
Key factors that the Systems Engineer should consider when accomplishing technical
planning include:
In addition to the SEP, the technical planning effort supports the development of the
following documents:
Other useful resources available to assist the Program Manager and Systems Engineer
in the Technical Planning process can be found in the "Guidance & Tools" section of the
ODASD(SE) Policy and Guidance website.
4.3.2.1. Work Breakdown Structure
The Work Breakdown Structure (WBS) provides a consistent and visible framework for
materiel items and contracts within a program throughout its life cycle. It provides a
product-oriented division of tasks by breaking down work scope for authorization,
tracking, and reporting purposes. The WBS is defined, developed, and maintained
throughout the acquisition life cycle based on a disciplined application of the systems
engineering (SE) process. The goal is to develop a WBS that defines the logical
relationship among all program elements to a specified level. The WBS integrates
technical, cost, and schedule parameters, giving the Program Manager a tool to:
There are two types of WBS: (1) the Program WBS and (2) the Contract WBS (including
flow-down reporting requirements). The Program WBS provides a framework for
specifying program objectives. Each WBS element provides logical summary levels for
assessing technical accomplishments, for supporting the required event-based technical
reviews, and for measuring cost and schedule performance. It represents the entire
program, which is the Government Program Manager’s responsibility. The Contract WBS is
the Government-approved WBS for program reporting purposes and includes all
program elements (for example, hardware, software, services, data, or facilities), which
are the contractor’s responsibility. It includes the contractor’s discretionary extension to
lower levels, in accordance with Government direction and the contract Statement of
Work (SOW). The WBS depicts the system as a product-oriented tree, which may be
found in a system model. Requirements for developing a WBS are found in
MIL-STD-881C. The Program Manager, in conjunction with the Systems Engineer, should
develop a comprehensive WBS early in the program to support planning, cost and
schedule estimates, and risk mitigation activities.
The WBS provides a common thread for the Earned Value Management System
(EVMS), the Integrated Master Plan (IMP) and the Integrated Master Schedule (IMS),
allowing consistency in understanding and communicating program cost and schedule
performance. Additional information about EVMS can be found in DAG Chapter 11
Program Management Activities.
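Because the WBS keys cost and schedule data to the same elements, earned-value indices can be computed consistently at any WBS level. The following is a minimal sketch only; the function name and all numbers are illustrative, not drawn from EVMS guidance:

```python
# Hypothetical earned-value calculation for a single WBS element.
# BCWS = budgeted cost of work scheduled (planned value),
# BCWP = budgeted cost of work performed (earned value),
# ACWP = actual cost of work performed. All values are illustrative.

def ev_indices(bcws, bcwp, acwp):
    """Return (cost performance index, schedule performance index)."""
    cpi = bcwp / acwp   # > 1.0 means work is being done under cost
    spi = bcwp / bcws   # > 1.0 means work is ahead of schedule
    return cpi, spi

cpi, spi = ev_indices(bcws=100.0, bcwp=90.0, acwp=120.0)
print(round(cpi, 2), round(spi, 2))  # 0.75 0.9 -> over cost and behind schedule
```

Rolling these indices up the WBS tree is what lets the same data answer questions at the contract level and at the individual element level.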
Planning tasks by WBS element serves as the basis for mapping the development of
the technical baseline and for estimating and scheduling resource requirements (people,
facilities, and equipment). By breaking the system into successively smaller pieces, the
Program Manager can ensure all system elements and enabling system elements are
identified in terms of cost, schedule, and performance goals in order to reduce risk.
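The product-oriented breakdown described above can be sketched as a simple tree in which each WBS element rolls up the cost estimates of its children. The element numbering, titles, and costs below are hypothetical:

```python
# Hypothetical sketch of a product-oriented WBS with cost roll-up.
# Element numbers, titles, and costs ($M) are illustrative only.

class WBSElement:
    def __init__(self, number, title, cost=0.0):
        self.number = number      # e.g., "1.1.2"
        self.title = title
        self.cost = cost          # direct cost estimate for leaf elements
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def rollup_cost(self):
        """A parent's cost is the sum of its children's roll-ups."""
        if not self.children:
            return self.cost
        return sum(c.rollup_cost() for c in self.children)

system = WBSElement("1", "Air Vehicle")
airframe = system.add(WBSElement("1.1", "Airframe"))
airframe.add(WBSElement("1.1.1", "Fuselage", cost=4.0))
airframe.add(WBSElement("1.1.2", "Wing", cost=3.5))
system.add(WBSElement("1.2", "Propulsion", cost=6.0))

print(system.rollup_cost())  # 13.5
```

The same tree can carry schedule and performance attributes per element, which is what makes the WBS the common thread among cost, schedule, and technical data.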
4.3.2.2. Integrated Master Plan/Integrated Master Schedule
DoDI 5000.02 requires use of the IMS, and the Integrated Master Plan and Integrated
Master Schedule Preparation and Use Guide provides additional guidance on
developing and implementing these technical planning tools.
A program should have an adequate IMP and IMS and should require the same from its
contractor(s). The IMP and IMS communicate the expectations of the program team and
provide traceability to the management and execution of the program by IPTs. They
also provide traceability to the WBS, the contract WBS (CWBS), the Statement of Work
(SOW), systems engineering (SE), and risk management, which together define the
products and key processes associated with program success.
The IMP and IMS represent the basis for contractor cost reporting and the associated
assessments of contract performance, as defined at the Integrated Baseline Review
(IBR) (see DAG Chapter 11 Program Management Activities). The IMP and IMS help
the Program Manager and Systems Engineer:
The IMP documents the significant criteria necessary to complete the accomplishments,
and ties each to a key program event. The IMS expands on the IMP with an integrated
network of tasks, subtasks, activities, schedule for deliverables, and milestones with
sufficient logic and durations. The IMS also serves as a tool for time-phasing work and
assessing technical performance. IMS activities are thus traceable to the IMP and the
WBS, and allow integrated assessments of cost, schedule, technical performance, and
associated risks. This traceability serves to:
The IMP and IMS support effective management of program scope, risk, and day-to-day
efforts. During the initial stages of a program, the IMP provides an early understanding
of the required scope of work, key events, accomplishment criteria, and the likely
program structure by depicting the progression of work through the remaining phases.
Regular examination of the plan and schedule increases the documented level of detail
and provides confidence that these documents have properly identified and captured all
essential activities.
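The IMP-to-IMS-to-WBS traceability described above can be checked mechanically: every scheduled task should reference a valid WBS element and an IMP event. All identifiers in this sketch are hypothetical:

```python
# Hypothetical traceability check: each IMS task should reference a WBS
# element and an IMP event. All identifiers are illustrative only.

wbs_elements = {"1.1", "1.2", "1.3"}
imp_events = {"SRR", "PDR", "CDR"}

ims_tasks = [
    {"id": "T-001", "wbs": "1.1", "event": "PDR"},
    {"id": "T-002", "wbs": "1.4", "event": "CDR"},   # orphan WBS reference
    {"id": "T-003", "wbs": "1.2", "event": None},    # no IMP event linkage
]

def trace_gaps(tasks):
    """Return ids of tasks whose WBS or IMP linkage is missing or unknown."""
    return [t["id"] for t in tasks
            if t["wbs"] not in wbs_elements or t["event"] not in imp_events]

print(trace_gaps(ims_tasks))  # ['T-002', 'T-003']
```

Running such a check at each schedule update is one way to gain the confidence, noted above, that all essential activities have been identified and captured.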
Early identification of and adherence to critical path tasks is essential to ensure that the
program remains on track toward achieving schedule and cost goals. The IMS provides
linkages between tasks to capture the relationship of predecessor and successor tasks
required to initiate or complete major tasks. The IMP and IMS collectively assist
stakeholder communication by establishing expectations and dependencies, particularly
for tasks performed by different organizations.
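The predecessor/successor linkages described above are what make critical-path identification possible. A minimal sketch, with hypothetical tasks, durations, and dependencies, computes the critical path of a small network via a forward pass:

```python
# Hypothetical critical-path computation over a small task network.
# Task names, durations (days), and dependencies are illustrative only.

durations = {"A": 3, "B": 2, "C": 4, "D": 1}
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

# Forward pass: earliest finish of each task (order below is topological).
earliest_finish = {}
for task in ["A", "B", "C", "D"]:
    start = max((earliest_finish[p] for p in predecessors[task]), default=0)
    earliest_finish[task] = start + durations[task]

# Walk back from the final task along the predecessor that finishes latest.
path = ["D"]
while predecessors[path[-1]]:
    path.append(max(predecessors[path[-1]], key=earliest_finish.get))
path.reverse()

print(path, earliest_finish["D"])  # ['A', 'C', 'D'] 8
```

A slip in any task on the resulting path delays the whole network, which is why the text stresses early identification of and adherence to critical path tasks.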
The Program Manager and Systems Engineer should determine an appropriate level of
detail for the IMS. For a low-risk program, an IMS developed at too aggregated a level
may fail to reveal critical path tasks. The IMS for a high-risk program would most likely
extend to lower levels of the WBS (greater detail) to aid risk management and mitigation
efforts but would typically carry a greater maintenance cost (tracking progress and
updating status).
The initial IMP and IMS should address significant activities to provide a basis for
conducting further risk assessments including identification of tasks associated with
moderate to high risks that may emerge later in the life cycle. The IMS should be seen
as a tool used by stakeholders during each phase of the program. The IMS should
identify all risk mitigation activities for easy identification and tracking.
The Program Manager and Systems Engineer should monitor development of the IMS
by the developer to ensure that activity durations and resources are reasonable. This
oversight aids risk analysis and development of mitigation plans in the event that any of
those activities become delayed or over budget. The initial IMP should be part of the
preparation for the Milestone A decision.
The Systems Engineer also defines functional and life cycle inputs to integrate SE
processes and products and to provide an auditable sequence of tasks and schedules
that can be used to measure cost and schedule status. The development and analysis
of program IMP/IMS data:
                          Figure 4.3.2.2.F1. IMP/IMS Hierarchy and Content
The Program Manager should review the IMP and IMS for completeness, consistency,
and compatibility. In this review, the Program Manager should evaluate duration and
logic relationships to ensure they accomplish program goals, identify risks, and achieve
desired mitigation.
The Systems Engineer should ensure that the SEP and other technical planning
documents capture technical review criteria, event-driven outcomes, and mechanisms
for assessing technical maturity and risk in a manner consistent with tasks and
schedules identified in the IMP/IMS.
4.3.3. Decision Analysis Process
The Decision Analysis process transforms a broadly stated decision opportunity into a
traceable, defendable, and actionable plan. It encompasses one or more discrete
analyses at one or more lower (e.g., system element) levels and aggregates them into a
higher-level view (e.g., system "scorecard" presentation) relevant to the decision maker
and other stakeholders. Decision Analysis can be the central process for formulating,
managing, and executing an effective and efficient program at any point in the life cycle.
Decision Analysis and associated trade studies should be integrated with, and mutually
supportive of, aspects of several SE processes in the early stages of the program, in
particular:
A well-executed decision analysis or trade study helps the Program Manager and the
Systems Engineer understand the impact of various uncertainties, identify one or more
course(s) of action that balance competing objectives, and objectively communicate the
results to decision makers. As such, it provides the basis for selecting a viable and
effective alternative from among many under consideration.
Decision Analysis applies to technical decisions at all levels, from evaluating top-level
architectural concepts to sizing major system elements to selecting small design details.
The breadth and depth of the analysis should be scaled to both the scope of the
decision and the needs and expectations of the decision maker(s).
Decision Analysis teams generally include a lead analyst with a suite of reasoning tools;
subject matter experts with access to appropriate models and analytical tools; and a
representative set of end users and other stakeholders. A robust Decision Analysis
process acknowledges that the decision maker has full responsibility, authority, and
accountability for the decision at hand.
    •    Synthesize results
    •    Analyze sensitivities
    •    Develop decision briefing with action/implementation plan(s)
    •    Make appropriate recommendation(s) to decision maker as expected/requested
Sound recommendations and action plans are the principal output of a well-framed and
well-executed Decision Analysis process. The ability to drill down quickly from overall
trade space visualizations to detailed analyses that support the synthesized views is
particularly useful to decision makers in understanding the basis of observations and
conclusions.
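The trade-space drill-down described above is often implemented as a weighted scoring matrix whose weights can be perturbed to test sensitivity. The alternatives, criteria, weights, and raw scores below are hypothetical; this is a sketch of the technique, not a prescribed method:

```python
# Hypothetical weighted-sum trade study with a simple sensitivity check.
# Alternatives, criteria, weights, and raw scores are illustrative only.

criteria = ["performance", "cost", "risk"]
weights = {"performance": 0.5, "cost": 0.3, "risk": 0.2}
scores = {                      # raw scores on a common 0-10 scale
    "Alt 1": {"performance": 8, "cost": 5, "risk": 6},
    "Alt 2": {"performance": 6, "cost": 7, "risk": 7},
}

def weighted_score(alt, w):
    return sum(w[c] * scores[alt][c] for c in criteria)

ranked = sorted(scores, key=lambda a: weighted_score(a, weights), reverse=True)
print(ranked[0])  # Alt 1 wins under the baseline weights (6.7 vs 6.5)

# Sensitivity: shift weight from performance to cost and re-rank.
shifted = {"performance": 0.3, "cost": 0.5, "risk": 0.2}
reranked = sorted(scores, key=lambda a: weighted_score(a, shifted), reverse=True)
print(reranked[0])  # Alt 2 wins (6.7 vs 6.1): the decision is weight-sensitive
```

When a modest weight change flips the ranking, as here, the analyst should flag the decision as sensitive and present both views to the decision maker rather than a single answer.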
4.3.4. Technical Assessment Process
The Technical Assessment process allows the Systems Engineer to compare achieved
results against defined criteria to provide a fact-based understanding of the current level
of product knowledge, technical maturity, program status, and technical risk. This
assessment results in a better understanding of the health and maturity of the program,
giving the Program Manager a sound technical basis upon which to make program
decisions.
Disciplined technical assessment activities should begin early in the life cycle. They
should initially examine the status of development planning activities and efforts in the
Materiel Solution Analysis (MSA) phase. During the Technology Development (TD) and
Engineering and Manufacturing Development (EMD) phases, technical assessment can
provide a basis for tracking development of the system and lower-level system element
designs. Disciplined technical assessment supports the establishment of the various
baselines and the achievement of system verification. Technical assessment activities
are also used in manufacturing and production activities during the Production and
Deployment (P&D) phase, and these activities continue through the Operations and
Support (O&S) phase in support of reliability growth and sustainment engineering
efforts.
The Program Manager and Systems Engineer evaluate technical maturity in support of
program decisions at the key event-driven technical reviews and audits (see DAG
sections 4.2.8. through 4.2.17.) that occur throughout the acquisition life cycle. The
Program Manager and Systems Engineer use various measures and metrics, including
Technical Performance Measures (TPM) and leading indicators, to gauge technical
progress against planned goals, objectives, and requirements. See DAG sections
4.3.4.1. Technical Measurement and Metrics and 4.3.4.2. Technical Performance
Measures for more information on measures/metrics and TPMs, respectively. The
Program Support Review (PSR) (see DAG section 4.3.4.3. Program Support Review) is
an assessment to identify and resolve planning and execution issues well before an
upcoming acquisition milestone review.
Technical assessments against agreed-upon measures enable data-driven decisions.
Evidence-based evaluations that communicate progress and technical risk are essential
for the Program Manager to determine the need for revised program plans or technical
risk mitigation actions throughout the acquisition life cycle.
The Program Manager should ensure that technical assessments occur throughout the
life cycle, and that appropriate resources are available to allow for program office
personnel and independent subject matter experts to participate. The Program Manager
and Systems Engineer should jointly plan for event-driven technical reviews and audits.
Review criteria (e.g., completion of baseline documents and artifacts appropriate for the
review) should support objective assessments of technical progress, maturity, and risk.
When required, the Program Manager should approve the performance measurement
baseline (PMB) (see DAG Chapter 11 Program Management Activities) to capture time-
phased measures against the Work Breakdown Structure (WBS) (see DAG section
4.3.2.1. Work Breakdown Structure) and a resource-allocated Integrated
Master Schedule (IMS) (see DAG section 4.3.2.2. Integrated Master Plan/Integrated
Master Schedule).
The Systems Engineer assists the Program Manager in planning and conducting the
Technical Assessment process. This includes advising on technical reviews and audits,
defining the technical documentation and artifacts that serve as review criteria for each
review/audit, and identifying TPMs. Specific activities include:
         audits
Technical assessments have close linkages to the Technical Planning and Decision
Analysis processes (see DAG section 4.3.2. Technical Planning Process and 4.3.3.
Decision Analysis Process, respectively); however, all SE processes (see DAG sections
4.3.2. through 4.3.17.) support activities that contribute to the assessment of program
status, technical maturity, and risk in various areas (e.g., schedule, technology,
manufacturing, threat).
Inputs to the Technical Assessment process should include approved program plans
(e.g., the Acquisition Program Baseline, Systems Engineering Plan, and TPMs),
engineering products (i.e., drawings, specifications and reports, prototypes, system
elements, and engineering development modules), and current performance metrics.
Outputs may include various reports and findings (e.g., technical review reports,
corrective actions, Program Support Review findings, or test reports).
4.3.4.1. Technical Measurement and Metrics
Measures and metrics assist the Program Manager and the Systems Engineer in efforts
to obtain insight into issues that have real or projected impacts on cost, schedule,
performance, and risk. These issues can be at any level: the entire system, any of the
various system elements or enabling system elements, and any or all of the SE
processes in use across the program. This insight enables the Program Manager and
others in leadership positions to make informed decisions.
Programs document their strategy for identifying, prioritizing, and selecting the set of
metrics for monitoring and tracking SE activities and performance in the Systems
Engineering Plan (SEP). The measures/metrics strategy should include:
    •    Identification of roles, responsibilities, and authorities
In addition to TPMs and product measures, the Program Manager and the Systems
Engineer should ensure that technical planning identifies measures, metrics, and
leading indicators to assess the effectiveness of SE process execution within both the
Government program office and the developer’s SE organization. TPMs should be
managed by the cognizant Integrated Product Team (IPT).
Areas in which measures and metrics should be monitored include but are not limited
to:
4.3.4.2. Technical Performance Measures
Technical Performance Measures (TPMs) are a subset of metrics and measures that
evaluate technical progress (i.e., product maturity). TPM data support evidence-based
decisions at key knowledge points such as technical reviews and audits or milestone
decisions. TPMs compare the actual versus planned technical development and design.
They report progress toward meeting system performance requirements.
Systems engineering (SE) uses TPMs to balance cost, schedule, and performance
throughout the life cycle when integrated with other management methods such as the
Work Breakdown Structure (WBS) and Earned Value Management System (EVMS).
The program’s Systems Engineering Plan (SEP) includes a minimum set of TPMs and
the plan to achieve them. The planning should show TPM values as a function of time,
aligned with key points in the program schedule (e.g., technical reviews). Decision
makers can see progress toward achieving the Key Performance Parameters (KPPs)
and Key System Attributes (KSAs) by reviewing actual
values (achieved through analysis, test, demonstration, or other measurement) against
planned values.
Selected TPMs should:
    •    Have a time-phased profile with tolerance bands that can be predicted and
         substantiated during design, development, and test
    •    Be directly measurable during testing or readily derivable from analysis
    •    Be derived from the functional baseline and/or allocated baseline
    •    Provide an indication of risk associated with the system’s ability to meet specified
         performance requirements
    •    Be written using statistical criteria whenever possible
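A time-phased TPM profile with tolerance bands, as the first criterion above describes, can be sketched as planned values at key events plus an allowed band; achieved values are then flagged when they fall outside it. The parameter (weight), events, and all values below are hypothetical:

```python
# Hypothetical TPM tracking: planned weight (kg) at key reviews with a
# tolerance band, compared against achieved values. Values are illustrative.

planned = {"PDR": 1200.0, "CDR": 1150.0, "TRR": 1100.0}  # planned profile
tolerance = 50.0                                          # +/- band, in kg
achieved = {"PDR": 1230.0, "CDR": 1215.0}                 # measured so far

def status(event):
    deviation = achieved[event] - planned[event]
    return "within band" if abs(deviation) <= tolerance else "out of band"

for event in achieved:
    print(event, status(event))
# PDR: deviation +30 kg -> within band
# CDR: deviation +65 kg -> out of band (candidate for risk mitigation)
```

An out-of-band value at a review is exactly the kind of evidence-based trigger the Technical Assessment process uses to prompt corrective action or risk mitigation.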
Systems Engineers from both the Government and the developer, in consultation with
the end user, identify a limited number of parameters for consideration as TPMs. This
generally occurs as part of the Architecture Design process (see DAG section 4.3.12.
Architecture Design Process), in conjunction with development of the physical
architecture and allocation of requirements to system elements. As the program
matures, the Technical Assessment and Risk Management processes (see DAG
sections 4.3.4. Technical Assessment Process and 4.3.6. Risk Management Process,
respectively) should inform the Program Manager and the Systems Engineer of
progress on risk mitigation actions, as well as emerging risks that could warrant adding
to the list of TPMs attributes that map to a medium or high risk.
The Program Manager, in coordination with the Systems Engineer and developer,
approves selected TPMs. The Program Manager should appropriately delegate
responsibility for managing and reporting TPMs. The Systems Engineer defines,
collects, and analyzes performance measurement data for all TPMs to assess
performance over time against threshold and objective values. The Systems Engineer
should assess all TPMs at each technical review and audit.
The technical effort documented in the SEP should reflect the events and measurement
activities needed for TPM reporting. TPM tracking should be an integral part of the
developer’s technical planning, and contractors should capture TPM tracking in their
Systems Engineering Management Plan (SEMP).
Figure 4.3.4.2.F1 depicts how leading indicators can influence risk mitigation activities.
The Office of the Deputy Assistant Secretary of Defense for Systems Engineering
(ODASD(SE)) conducts Program Support Reviews (PSRs) on ACAT ID and IAM programs to help shape the
program’s technical planning and management approaches. Like any independent
review, the PSR is a technical assessment tool intended to prevent problems by early
recognition of risks and identification of proposed mitigation activities. PSR
requirements appear in DoDI 5000.02.
Early conduct of PSRs should help the Program Manager identify and resolve any
program planning or execution issues well before major program decisions. Table
4.3.4.3.T1 lists important PSR attributes.
Cross-functional:
    •    No "stovepipes"
    •    All reviewers look at multiple areas
    •    All observations and comments are adjudicated with the entire team and
         program office
When practical, the initial PSR occurs nine to twelve months before a milestone
decision review; a follow-up review (two to three months prior to the milestone)
assesses the implementation of key recommendations and mitigation of risks in order to
improve program planning and execution. The PSR typically consists of two- to three-
day visits to the program office (and developer(s) as applicable).
PSRs focus on all SE processes appropriate to the life-cycle phase but are broader in
scope to consider all aspects of acquisition management, including resource planning,
management methods and tools, earned value management, logistics, and other areas.
The Defense Acquisition Program Support (DAPS) Methodology is a source for
tailorable criteria and review questions and helps ensure consistency in reviews. The
DAPS Methodology includes:
    •    Mission capabilities / requirements generation
    •    Resources
    •    Management
    •    Technical planning and process
    •    Program performance
Insights from PSRs aid the development of the Systems Engineering Plan (SEP) (see
DAG section 4.1.2. Systems Engineering Plan) and Requests for Proposals (RFPs),
and ensure that the program has adequately addressed SE equities in these
documents. After its engagement with the program in preparation for the pre-Milestone
A PSR, the ODASD(SE) staff maintains continuous engagement with the program to
monitor its execution of the planning reflected in the SEP. PSRs prior to Milestones B,
C, and the Full-Rate Production decision can make use of information already vetted
during SE WIPT meetings, various technical reviews (see DAG sections 4.2.8. through
4.2.14.), and program management reviews in order to help reduce the PSR burden on
the program office and developer staff. PSR action items are documented in the
milestone review's Acquisition Decision Memorandum.
Programs should maintain a current and approved set of requirements over the entire
acquisition life cycle. The Requirements Management process helps ensure delivery of
capability that meets intended mission performance to the operational end user.
The end-user needs are usually identified in operational terms at the system level
during implementation of the Stakeholder Requirements Definition and Requirements
Analysis processes; see DAG section 4.3.10. Stakeholder Requirements Definition
Process and 4.3.11. Requirements Analysis, respectively. Through the Requirements
Management process, the Systems Engineer tracks requirements changes and
maintains traceability of end-user needs to the system performance specification and
ultimately the delivered capability. As the system design evolves to lower levels of
detail, the Systems Engineer traces the high-level requirements down to the system
elements through the lowest level of the design. Requirements Management provides
bottom-up traceability from any derived lower-level requirement up to the applicable
source (system-level requirement) from which it originates. This bidirectional traceability
is the key to effective management of system requirements. It enables the development
of an analytical understanding of any system-wide effects of changes to requirements
for a given system element, updating requirements documentation with rationale and
impacts for approved changes. At the same time, bidirectional traceability ensures that
approved changes do not create any "orphaned" lower-level requirements (i.e., that all
bottom-up relationships to applicable system-level requirements remain valid after the
change). Bidirectional traceability also ensures that higher-level requirements are
properly flowed to lower-level requirements and system element designs so that there
are no "childless parent" higher-level requirements (i.e., each high-level requirement is
ultimately being addressed by lower-level requirements and system element designs).
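The "orphan" and "childless parent" checks described above can be automated against a program's traceability data. The sketch below is illustrative only; the requirement identifiers are hypothetical, and a real program would export these links from its requirements-management tool:

```python
# Illustrative bidirectional traceability check. Requirement IDs are
# hypothetical examples, not from any actual program.

# Map of higher-level requirement -> derived lower-level requirements
trace = {
    "SYS-001": ["SUB-010", "SUB-011"],
    "SYS-002": [],            # not flowed down: a "childless parent"
}
all_lower = {"SUB-010", "SUB-011", "SUB-099"}  # SUB-099 has no parent

def childless_parents(trace):
    """Higher-level requirements not flowed to any lower-level requirement."""
    return sorted(req for req, children in trace.items() if not children)

def orphans(trace, lower):
    """Lower-level requirements not traced to any higher-level source."""
    linked = {child for children in trace.values() for child in children}
    return sorted(lower - linked)

print(childless_parents(trace))   # ['SYS-002']
print(orphans(trace, all_lower))  # ['SUB-099']
```

Running both checks after each approved change is one way to confirm that all bottom-up relationships remain valid, as the paragraph above requires.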
The Program Manager should keep leadership and all stakeholders informed of cost,
schedule, and performance impacts associated with requirement changes and
requirements growth.
All affected stakeholders and decision makers should fully understand the effects of
proposed changes to requirements at the system or system element level before they
accept any changes for incorporation into the design. The Requirements Traceability Matrix (RTM) provides significant
benefits during trade-off analysis activities since it captures the system-wide effects of
proposed changes to established requirements.
DAG section 4.3.19. Tools and Techniques contains information about SE tools
generally employed in the Requirements Management process. There are many
commercial software packages specifically designed for the traceability aspect of
Requirements Management, from top-level operational requirements down to the
lowest-level system elements in the Work Breakdown Structure.
Risk management addresses program uncertainties and is therefore critical to achieving cost, schedule, and performance
goals at every stage of the life cycle. Effectively managing risks helps the Program
Manager and Systems Engineer develop and maintain a system’s technical
performance, and ensure realistic life-cycle cost and schedule estimates.
DoDI 5000.02 requires that technical and programmatic risks be managed in all life
cycle phases. A program’s Technology Development Strategy (TDS) or Acquisition
Strategy (AS), and Systems Engineering Plan (SEP) should address risks and should
describe the program’s risk management process. DAG section 4.3.18.9. Environment,
Safety, and Occupational Health contains information regarding ESOH related risk
management.
Risk Management is most effective when fully integrated with the program’s SE and
management processes. Identification of risk drivers, dependencies, root causes, and
corrective action, as well as consequence management are key elements of this
integration.
By definition, a risk is an unwanted future event that may or may not occur, meaning it
has a probability of occurrence of less than one. A risk has three components: a future
root cause, a likelihood (probability) of occurrence, and a consequence should it occur.
An issue, by contrast, is an unwanted event that has occurred or is certain to occur in
the future (in other words, a probability equal to one).
Thus, an issue differs from a risk only in that it is not a probabilistic event. While
Program Managers and Systems Engineers can use Risk Management approaches to
deal with issues, they should remember that issue management applies resources to
current issues or problems. In contrast, risk management proactively applies resources
to identify and mitigate future potential root causes and their consequences. Risk
management also covers the case in which mitigation attempts fail and the risk is
realized. The challenge for the Program Manager and Systems Engineer is to balance
how they choose to deal with issues and risks, since they encounter both over the life of
the program. The Program Manager and Systems Engineer should clearly define,
assess, and consider technical and programmatic off-ramps if the program cannot be
adequately advanced given schedule and budget.
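The probability-and-consequence framing above is commonly visualized as a 5x5 risk matrix. The sketch below is an illustrative rating function; the boundaries between low, moderate, and high are program-defined assumptions here, not authoritative thresholds (see the Risk Management Guide for DoD Acquisition for actual practice):

```python
# Illustrative 5x5 likelihood x consequence risk rating. The low /
# moderate / high boundaries below are assumptions for demonstration
# only; each program defines its own matrix.

def risk_level(likelihood, consequence):
    """Rate a risk given likelihood and consequence on 1 (lowest) to
    5 (highest) scales; return 'low', 'moderate', or 'high'."""
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("likelihood and consequence run 1..5")
    score = likelihood * consequence
    if score >= 15:
        return "high"
    if score >= 6:
        return "moderate"
    return "low"

# An issue differs from a risk only in that its probability equals one,
# i.e., it sits in the top likelihood row of the matrix.
print(risk_level(2, 2))  # low
print(risk_level(3, 3))  # moderate
print(risk_level(4, 4))  # high
```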
Because risks can occur in any aspect of a program, it is important to recognize that all
program team members and stakeholders have a responsibility to identify risks and
report them to the Program Manager and Systems Engineer. Stakeholders also should
be invited to participate in risk analysis and mitigation activities as requested or
directed.
The Systems Engineer is responsible for prioritizing identified technical risks and
developing mitigation actions. The Program Manager reviews and approves the risk
priorities and mitigation plans and ensures required resources are available to
implement the mitigation plans.
Useful references include:
    •    Risk Management Guide for DoD Acquisition (also see DAG Chapter 11,
         Program Management Activities, for more information on the Program Manager's
         role in Risk Management)
    •    MIL-STD-882E, "DoD Standard Practice for System Safety", May 11, 2012
    •    Joint Capabilities Integration and Development System (JCIDS) Manual (requires
         Common Access Card (CAC) to access website), January 19, 2012
Table 4.3.6.T2 provides insights into the emphasis of Risk Management throughout the
acquisition life cycle. Regardless of phase, several best practices may apply to a
program's Risk Management process.
Table 4.3.6.T2. Focus of Risk Management Process by Phase

Pre-MDD
    Focus:
    •    Risk assessment of the effort/approach; early assessments of complexity,
         technical maturity, and the ability to close or reduce gaps
    •    Mitigation measures include resourcing teams for further detailed
         evaluation
    Measures / Metrics:
    •    Identify operational risks associated with capability gaps, measured in
         terms of probability and consequence
    •    Estimate resources to implement recommendations to close or mitigate
         capability gaps and reduce operational risk
    •    Identify dependencies and constraints (e.g., capability integration and
         interoperability with other systems or materiel solutions) associated with
         closing or mitigating capability gaps

TD
    Focus:
    •    Risk Management as a driver for technology readiness, preliminary design,
         and Milestone B entrance criteria
    Products / Outputs (Risk Considerations):
    •    Technology maturity and risk reduction
    •    Validation of CT maturity for a materiel solution from prototypes,
         experimentation, or other form of demonstration
    •    Validation of CT supplier/vendor trustworthiness from a supply chain
         integrity risk perspective
    •    Risk reduction through competitive prototyping, which broadens the
         opportunity for technology maturation by engaging multiple parties to
         compete for technology prototypes, and can help the program identify the
         nature of risk at the subsystem/system level (functionality, performance,
         or affordability)
    •    Risks associated with preliminary design
    Measures / Metrics:
    •    Measures that demonstrate reduced technology maturity risks with respect
         to CT developers and producers:
              o Vendor viability in terms of business health, market position, and
                industry outlook stability
              o Assessments of the CT competitive environment to assess the risk of
                reliance on a single vendor/supplier
    •    Technology Readiness Levels (TRL) as the metric to assess CT maturity
    •    Affordability monitoring
    •    Continuous should-cost estimation
    •    Assessment that the preliminary design has a high likelihood of satisfying
         the need within cost and schedule constraints

EMD
    Focus:
    •    Risk Management as an element of development, full system integration, and
         Milestone C entrance criteria
    Products / Outputs (Risk Considerations):
    •    EMD Risk Management processes, procedures, and plan
    •    Risk mitigation for establishment of qualification requirements throughout
         the supply chain
    Measures / Metrics:
    •    PM's Risk Management Dashboard focused on EMD:
              o KPP risk management
              o TPM analyses and monitoring

O&S
    Focus:
    •    Risk Management as an element of operational readiness and FOC
    Products / Outputs (Risk Considerations):
    •    O&S Risk Management processes, procedures, and plan
    •    Special emphasis: O&S SCRM
    •    O&S risk management plan that includes addressing the above focus areas
    Measures / Metrics:
    •    PM's Risk Management Dashboard focused on O&S:
              o O&S funding streams
              o Management and burn-down of technology obsolescence risks
              o Technology insertion upgrade schedules and refresh rate
              o Qualification and product verification of spares suppliers, field
                failure rates, and depot failure rates
    •    O&S contract monitoring
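The table cites Technology Readiness Levels (TRL) as the metric for assessing critical technology (CT) maturity. The sketch below is an illustrative lookup with paraphrased level descriptions; the authoritative definitions and milestone expectations are in DoD Technology Readiness Assessment guidance:

```python
# Illustrative TRL lookup for assessing critical technology (CT)
# maturity. Descriptions are paraphrased summaries, not the
# authoritative definitions.

TRL = {
    1: "basic principles observed and reported",
    2: "technology concept and/or application formulated",
    3: "analytical and experimental proof of concept",
    4: "component validation in a laboratory environment",
    5: "component validation in a relevant environment",
    6: "system/subsystem prototype demonstration in a relevant environment",
    7: "system prototype demonstration in an operational environment",
    8: "actual system completed and qualified",
    9: "actual system proven through successful mission operations",
}

def meets_milestone_b(trl):
    """Milestone B entrance generally expects critical technologies
    demonstrated in a relevant environment (TRL 6 or higher)."""
    return trl >= 6

print(TRL[6])
print(meets_milestone_b(5))  # False: not yet mature enough
print(meets_milestone_b(6))  # True
```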
The Configuration Management process allows technical insight into all levels of the
system design and is the principal methodology for establishing and maintaining
consistency of a system’s functional, performance, and physical attributes with its
requirements, design, and operational information throughout the system’s life cycle.
Effective configuration management supports the establishment and maintenance of the
product baseline, which enables the successful production, delivery, and sustainment of
the needed capability to the end user.
         with approved acquisition and life-cycle sustainment strategies
                  interface control documents
              o Verification requirements to demonstrate achievement of all specified
                  functional performance characteristics (element CI to element CI level and
                  at the system level) documented
              o Design constraints documented and incorporated into the design
    •    Product Baseline: Describes the detailed design for production,
         fielding/deployment, and operations and support. The product baseline
         prescribes all necessary physical (form, fit, and function) characteristics and
         selected functional characteristics designated for production acceptance testing
         and production test requirements. It is traceable to the system performance
         requirements contained in the Capability Development Document (CDD). The
         initial product baseline includes "build-to" specifications for hardware (product,
         process, material specifications, engineering drawings, and other related data)
         and software (software module design - "code-to" specifications). The initial
         system element product baseline is established and placed under configuration
         control at the system element Critical Design Review (CDR) and verified later at
         the Physical Configuration Audit. In accordance with DoDI 5000.02, the Program
         Manager assumes control of the initial product baseline for all Class I
         configuration changes at the completion of the system-level CDR to the extent
         that the competitive environment permits. This does not necessarily mean that
         the Program Manager takes delivery and acceptance of the Technical Data
         Package. Attributes of the product baseline include:
              o Requirements Traceability Matrix (RTM) is complete
              o The detailed design (hardware and software), including interface
                  descriptions, satisfies the CDD or any available draft Capability Production
                  Document (CPD), and pertinent design considerations
              o Hardware, software and interface documentation are complete
              o Key product characteristics having the most impact on system
                  performance, assembly, cost, reliability, ESOH, and sustainment have
                  been identified
              o Traceability from design documentation to system and system element
                  verification requirements and methods is complete
              o Manufacturing processes that affect the key characteristics have been
                  identified, and capability to meet design tolerances has been determined
The program office and developer share responsibility for planning, implementing, and
overseeing the Configuration Management process and its supporting activities. The
distribution of responsibilities between the program office and the developer varies
based on the acquisition strategy and the life-cycle phase.
The Program Manager approves the Configuration Management Plan and should
ensure adequate resources are allocated for implementing Configuration Management
throughout the life cycle. The Program Manager approves the system baselines, and
approves Class I changes to the product baseline after CDR, usually through a
Configuration Control Board (CCB). MIL-HDBK-61A, "Configuration Management
Guidance" defines Class I and II changes:
    •    Class I changes impact the form, fit, function, or interface characteristics of the
         configuration item
    •    Class II changes are changes to a Government approved technical baseline that
         do not meet the definition of a Class I change
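The Class I / Class II distinction quoted above can be expressed as a simple decision rule. The sketch below is illustrative only; the parameter names are assumptions, and MIL-HDBK-61A governs the actual definitions:

```python
# Illustrative classification of an engineering change per the
# MIL-HDBK-61A distinction quoted above. Parameter names are
# hypothetical; real change classification involves engineering judgment.

def classify_change(affects_form, affects_fit, affects_function,
                    affects_interface):
    """Class I if the change impacts form, fit, function, or interface
    characteristics of the configuration item; otherwise Class II."""
    if affects_form or affects_fit or affects_function or affects_interface:
        # After CDR, Class I changes to the product baseline require
        # Program Manager approval, usually through a CCB.
        return "Class I"
    return "Class II"

print(classify_change(False, False, True, False))   # Class I
print(classify_change(False, False, False, False))  # Class II
```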
Through the Technical Data Management process, the program identifies, acquires,
manages, maintains, and ensures access to the technical data and computer software
required to manage and support a system throughout the acquisition life cycle. Key
Technical Data Management considerations include understanding and protecting
Government intellectual property and data rights, achieving competition goals,
maximizing options for product support, and enabling performance of downstream life-
cycle functions. DoDI 5000.02 contains Technical Data Management requirements for
Acquisition Category (ACAT) I and II programs.
         interest of achieving cost savings; the lack of product data and/or data rights
         often makes it difficult or impossible to award contracts to anyone other than the
         original manufacturer, thereby taking away much or all of the Government’s
         ability to reduce total ownership costs (TOC)
The Program Manager and Systems Engineer, in conjunction with the Product Support
Manager, should ensure that life-cycle requirements for weapon system-related data
products and data rights are identified early and that appropriate contract provisions are
put in place to enable deliveries of these products. Figure 4.3.8.F1 shows the activities
associated with Technical Data Management, including:
    •    Formulate the program’s Technical Data Rights Strategy (TDRS) and technical
         data management approach, with emphasis on technical and product data
         needed to support the product throughout its life cycle. (see DAG Chapter 2
         Program Strategies for more information about Data Rights).
    •    Ensure that data requirements are documented in the TDRS; summarized in the
Technology Development Strategy (TDS), Acquisition Strategy (AS), and Life-Cycle
Sustainment Plan (LCSP); and submitted at each milestone prior to award
         of the contract for the next life-cycle phase.
    •    Consider not only the immediate, short-term costs of acquiring the needed
         technical data and data rights but also the long-term cost savings resulting from
         the ability to compete production and logistics support activities and reduce TOC.
         Understand that the Government can possess either Government Purpose or
         Unlimited Rights to use many types of technical data, at no additional cost,
         depending on the type of technical data and the source of funding used to
         generate the data (see DoD Open Systems Architecture Contract Guidebook
         for Program Managers for more information about data rights).
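The long-term trade described above can be illustrated with a simple break-even sketch. All figures and the function itself are illustrative, not DoD-prescribed:

```python
def data_rights_break_even(upfront_cost, annual_savings_competed):
    """Return the number of years before buying data rights pays for itself.

    upfront_cost: one-time cost of acquiring technical data and data rights.
    annual_savings_competed: estimated yearly savings from being able to
    compete production/logistics support rather than sole-sourcing.
    Both inputs are notional program estimates.
    """
    if annual_savings_competed <= 0:
        return None  # the rights never pay for themselves on savings alone
    return upfront_cost / annual_savings_competed

# Notional example: $4M upfront, $0.5M/yr savings from competition.
print(data_rights_break_even(4_000_000, 500_000))  # 8.0 years
```

A program office would feed such a sketch with its own cost estimates; the point is that the comparison is between a one-time data cost and a recurring sole-source premium over the life cycle.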
- Acquire Data
    •    Use explicit contract Statement of Work tasks to require the developer to perform
         the work that generates the required data. The content, format, and quality
         requirements should be specified in the contract.
    •    Use current, approved Data Item Descriptions (DID) and Contract Data
         Requirements Lists (CDRL) in each contract to order the delivery of the required
         technical data and computer software.
    •    Ensure that delivered data are marked in accordance with the relevant data
         rights agreements and DFARS clauses, and contain appropriate distribution
         statements and/or export control statements.
Caution: Acceptance of delivered data not marked consistent with the contract can
result in the Government "losing" legitimate rights to technical data and can expose
the Government and individual Government employees to significant legal liability.
Regaining those rights generally requires costly and time-consuming legal action.
    •    Budget for and fund the maintenance and upkeep of product data throughout the
         life cycle.
    •    An Integrated Data Environment (IDE) or Product Life-cycle Management (PLM)
         system allows every activity involved with the program to create, store, access,
         manipulate, and exchange digital data.
    •    To the greatest extent practical, programs should use existing IDE/PLM
         infrastructure such as repositories operated by Commodity Commands and other
         organizations. (Program-unique IDEs are discouraged because of their high
         infrastructure cost; further, multiple IDEs inhibit access, sharing, and reuse of
         data across programs.)
    •    Ensure all changes to the data are made in a timely manner and are documented
         in the program IDE or PLM system.
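As an illustration of the CDRL checks in the list above, a simple completeness screen over contract data requirements might look like the following. The entry fields and DID identifiers are hypothetical placeholders, not actual approved DIDs:

```python
# Hypothetical CDRL entries; real CDRLs are DD Form 1423 line items, and the
# DID identifiers below are placeholders rather than actual approved DIDs.
cdrl_items = [
    {"item": "A001", "title": "Technical Data Package", "did": "DI-EXAMPLE-0001"},
    {"item": "A002", "title": "Software Product Spec", "did": "DI-EXAMPLE-0002"},
    {"item": "A003", "title": "Engineering Drawings", "did": None},  # no DID cited
]

approved_dids = {"DI-EXAMPLE-0001", "DI-EXAMPLE-0002"}  # notional approved list

def missing_or_unapproved(items, approved):
    """Return CDRL item numbers whose DID is absent or not on the approved list."""
    return [i["item"] for i in items if i["did"] not in approved]

print(missing_or_unapproved(cdrl_items, approved_dids))  # ['A003']
```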
Plan for and establish methods for access and reuse of product data by all personnel
and organizations that perform life-cycle support activities.
                              Figure 4.3.8.F1. Data Management Activities
In support of the Government’s requirement for a Technical Data Package (TDP), the
Program Manager should also consider all product-related data (e.g., technical
manuals, repair instructions, and design/analysis data).
Contractually deliverable data should be identified and ordered at the specific "data
product" level, e.g., two-dimensional drawings, three-dimensional Computer-Aided
Design (CAD) models, technical manuals, etc. Figure 4.3.8.F2 provides a notional
representation of different types of product-related data.
Caution: Program Managers and Systems Engineers should be aware that terms such
as "technical data," "product data," and "TDP" are imprecise, not equivalent, and often
incorrectly used interchangeably.
    •    MIL-STD-963, Data Item Descriptions
    •    MIL-STD-31000, Technical Data Packages
- Data Protection
The Program Manager is responsible for protecting system data, whether the data is
stored and managed by the Government or by contractors. The DoD policy with regard
to data protection, marking, and release can be found in:
    •    DoDD 5230.25
    •    DoDI 5230.24
    •    DoD 5400.7-R
    •    DoD 5200.1-M
Data containing information subject to restrictions are protected in accordance with the
appropriate guidance, contract, or agreement. Guidance on distribution statements,
restrictive markings, and restrictions on the use, release, or disclosure of data can be
found in DFARS clauses 252.227-7013 and 252.227-7014, and in DoDI 5230.24.
When digital data is used, the applicable restriction markings, legends, and distribution
statements should be clearly visible when the data is first opened or
accessed. These safeguards not only ensure Government compliance regarding the
use of data but also safeguard contractor data delivered to the Government, and
extend data handling and use responsibilities to parties who subsequently use the
data.
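A minimal sketch of making markings visible on first access, assuming plain-text data and using a placeholder statement (the real statement text and category are governed by DoDI 5230.24, not shown here):

```python
def apply_distribution_banner(text, statement):
    """Prepend a distribution statement so it is the first thing a reader sees."""
    banner = f"*** {statement} ***\n\n"
    if text.startswith("***"):
        return text  # already marked; do not duplicate the banner
    return banner + text

doc = "System element interface description ..."
marked = apply_distribution_banner(
    doc, "DISTRIBUTION STATEMENT (placeholder text)")
print(marked.splitlines()[0])
```

A real IDE/PLM system would enforce this at the repository level rather than per file, but the principle is the same: the marking travels with the data and is rendered before the content.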
Section 208 of Public Law 107-347 and DoD Privacy Impact Assessment (PIA)
guidance require that a PIA be conducted before developing or purchasing any DoD
information system that collects, maintains, uses, or disseminates personally identifiable
information about members of the public, federal personnel, DoD contractors and, in
some cases, foreign nationals. Available PIA guidance provides procedures for
completing and approving PIAs. For further information, see DAG Chapter 7 Acquiring
Information Technology, Including National Security Systems.
The Interface Management process helps the Program Manager ensure interface
definition and compliance among the system elements, as well as with other systems.
The Interface Management process helps ensure that developers document all internal
and external interface requirements and requirements changes in accordance with the
program’s Configuration Management Plan. Developers also should communicate
interface information to their counterparts responsible for affected systems and system
elements, and should plan for coherent testing to verify expected performance and
ultimately operational performance.
Systems are composed of system elements, and may operate as part of larger systems
of systems (SoS). The design, definition and management of the physical and logical
interfaces, both internal (communications between system elements) and external
(communications between the system and other systems), are critical to program
success. Both types of interfaces have become increasingly important as system
complexity has increased, along with demands for systems to operate in highly
interdependent SoS environments (see DAG section 4.2.1.2. Systems of Systems).
Interfaces play a critical role in all systems and systems of systems that interact to
deliver a collective capability. Complex systems consist of numerous interfaces of
various types. In the absence of effective governance, interface sprawl can result in
degraded system performance, sustainability, and maintainability.
Effective interface management helps ensure that systems operate as designed and
meet stakeholder expectations throughout the life cycle. Interface management should
consider programmatic issues (e.g., roles and responsibilities, funding, scheduling) in
addition to the technical aspects of systems engineering (SE) and integration.
The Program Manager and Systems Engineer should ensure that the program’s
interface management plan:
    •    Documents the system’s internal and external interfaces and their requirement
         specifications
    •    Identifies preferred and discretionary interface standards and their profiles
    •    Provides justification for selection and procedure for upgrading interface
         standards
    •    Describes the certifications and tests applicable to each interface or standard
    •    Is consistent with the program’s configuration management plan
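The plan elements above amount to a completeness check over each interface record. A hypothetical sketch (the field names are illustrative; MIL-STD-1553B is cited only as an example of an interface standard):

```python
REQUIRED_FIELDS = {"name", "kind", "spec", "standard", "justification", "tests"}

def plan_gaps(interfaces):
    """Return {interface name: missing fields} for incomplete plan records."""
    gaps = {}
    for rec in interfaces:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            gaps[rec.get("name", "<unnamed>")] = sorted(missing)
    return gaps

interfaces = [
    {"name": "avionics bus", "kind": "internal", "spec": "ICD-001",
     "standard": "MIL-STD-1553B", "justification": "legacy avionics",
     "tests": ["bus loading"]},
    {"name": "GPS feed", "kind": "external", "spec": "ICD-002"},  # incomplete
]
print(plan_gaps(interfaces))  # flags the fields the GPS feed record lacks
```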
The Program Manager and Systems Engineer should ensure that the developer
documents all system interface requirements (see DAG section 4.3.5. Requirements
Management Process), places them under appropriate levels of configuration
management, and makes them available to the appropriate stakeholders. These
documented interface requirements serve critical functions at all levels of the system
throughout the life cycle.
The Systems Engineer responsible for interface management has numerous key tasks
throughout the life cycle.
The Program Manager should establish an Interface Control Working Group (ICWG)
composed of appropriate technical representatives from the interfacing activities and
other interested participating organizations. The ICWG serves as a forum to develop
and provide interface requirements, as well as to focus on detailed interface definition and
timely resolution of issues. In the SoS environment, external program offices and
developers collaborate as members of the ICWG.
During the Stakeholder Requirements Definition process, the lead Service, Component,
or designated program office receives requirements from relevant stakeholders and
translates them into a set of technical requirements. The process helps ensure each
individual stakeholder’s requirements, expectations, and perceived constraints are
understood from the acquisition perspective. Failing to perform an exhaustive
Stakeholder Requirements Definition process could result in significant requirements
creep, rework due to misunderstanding of end-user needs, unexpected contract
modifications, cost growth, and schedule slip. The objective of this process is to help
ensure that stakeholder requirements are feasible, balanced, and fully integrated as
more information is learned through requirements analysis.
The DoD Architecture Framework (DoDAF) provides an approach for DoD architecture
development, presentation, and integration for both warfighting operations and business
operations and processes. For the Net Ready Key Performance Parameter (NR-KPP),
JCIDS and CJCSI 6212.01 specify the data needed to elaborate, communicate, verify,
and validate a system’s interoperability requirements and design. System architectural
descriptions contain three basic viewpoints: operational, system, and standards (or
technical) viewpoints. In the case of the NR-KPP, these viewpoints contain essential
architecture data that describe a system’s interoperability requirements and design from
multiple perspectives. DoDAF provides a standardized approach for capturing and
presenting this architectural data. This standardization facilitates improved
communication and sharing of technical information among various stakeholders and
across organizational boundaries.
The Program Manager and Systems Engineer are responsible for supporting the
Stakeholder Requirements Definition process and should work with the end user to
establish and refine operational needs, attributes, performance parameters, and
constraints documented in JCIDS documents.
The authoritative sources for stakeholder requirements are the documents produced via
JCIDS, such as the Initial Capabilities Document (ICD), Capability Development
Document (CDD), and the Capability Production Document (CPD). JCIDS analyzes
gaps in existing and/or future warfighting operations and provides a process that allows
the Joint Requirements Oversight Council to balance joint equities and make informed
decisions on validation and prioritization of capability needs.
The Requirements Analysis process involves the decomposition of user needs (usually
identified in operational terms at the system level during implementation of the
Stakeholder Requirements Definition process; see DAG section 4.3.10 Stakeholder
Requirements Definition Process) into clear, achievable, and verifiable high-level
requirements. As the system design evolves, Requirements Analysis activities support
allocation and derivation of requirements down to the system elements representing the
lowest level of the design. The allocated requirements form the basis of contracting
language and the system performance specification. The resultant system requirements
are addressed at technical reviews and audits throughout the acquisition life cycle and
in applicable program and systems engineering (SE) technical documentation.
    •    Define a capability that links the needs of the users to the system, system
         elements, and enabling system elements to be designed and developed
    •    Define a system that meets users' operational mission requirements within
         specified cost and schedule constraints
    •    Provide insight into the interactions among various functions to achieve a set of
         balanced requirements based on user objectives
The Systems Engineer should seek to minimize the impact of potential cost drivers and establish an agreed-to set of requirements
coordinated with the appropriate stakeholders. Poorly written requirements can lead to
significant problems in the areas of schedule, cost, or performance, and can thus
increase program risk. A well-crafted set of functional/performance requirements can
then be translated into design requirements for the total system over its life cycle and
can allow stakeholders to assess system performance during execution of the
Verification and Validation processes (see DAG sections 4.3.15. Verification Process
and 4.3.16. Validation Process, respectively). Good requirements have the following
attributes:
    •    Necessary
    •    Unique
    •    Unambiguous - clear and concise
    •    Complete
    •    Consistent
    •    Technically feasible/achievable/obtainable
    •    Traceable
    •    Measurable/quantifiable
    •    Verifiable (e.g., Testable)
    •    Able to be validated
    •    Operationally effective
    •    Singular
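Some of these attributes (unambiguous, singular, measurable) lend themselves to a mechanical screen over requirement statements. The following is a non-authoritative sketch; the word list is illustrative and a real program would tailor it:

```python
import re

# Illustrative ambiguity flags; not an official or exhaustive list.
WEAK_WORDS = ["as appropriate", "etc", "and/or", "TBD", "user-friendly",
              "minimize", "maximize"]

def lint_requirement(text):
    """Return quality flags for one requirement statement."""
    flags = []
    for word in WEAK_WORDS:
        if word.lower() in text.lower():
            flags.append(f"ambiguous term: '{word}'")
    if text.count(" shall ") > 1:
        flags.append("not singular: multiple 'shall' clauses")
    if " shall " in text and not re.search(r"\d", text):
        flags.append("no quantified value: may not be verifiable")
    return flags

print(lint_requirement(
    "The system shall minimize latency and shall support users, etc."))
```

Such a screen catches only surface problems; attributes such as "necessary" and "operationally effective" still require engineering judgment.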
The Requirements Analysis process ensures that requirements derived from user-
specified capability needs are analyzed, decomposed, and functionally detailed across
the system design. Early development and definition of requirements using the
attributes listed above reduces development time, enables achievement of cost and
schedule objectives, and increases the quality of the final system. Requirements
Analysis encompasses the definition and refinement of the system, system elements,
enabling system elements, and associated functional and performance requirements.
The development of the functional baseline is largely a product of the Requirements
Analysis process. All requirements are placed under configuration control, tracked, and
managed as described in the Requirements Management process and Configuration
Management process (see DAG sections 4.3.5. Requirements Management Process
and 4.3.7. Configuration Management Process, respectively).
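Configuration-controlled tracking implies that every decomposed requirement traces to a parent requirement or capability need. A minimal sketch of that check, with hypothetical identifiers:

```python
# Parent of each requirement; top-level requirements trace to the capability need.
# All identifiers are hypothetical.
trace = {
    "SYS-1": "ICD-NEED-1",
    "SYS-1.1": "SYS-1",
    "SYS-1.2": "SYS-1",
    "ELEM-7": None,  # orphan: no parent recorded
}

def orphans(trace_map):
    """Requirements with no recorded parent are traceability gaps."""
    return sorted(r for r, parent in trace_map.items() if parent is None)

print(orphans(trace))  # ['ELEM-7']
```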
Architecture Design is a trade and synthesis process that allows the Program Manager
and Systems Engineer to translate the outputs of the Stakeholder Requirements
Definition and Requirements Analysis processes into alternative design solutions and
establishes the architectural design of candidate solutions that may be found in a
system model. The alternative design solutions may include hardware, software, and
human elements; their enabling system elements; and related internal and external
interfaces. The Architecture Design process, combined with Stakeholder Requirements
Definition and Requirements Analysis, provides key insights into technical risks early in
the acquisition life cycle, allowing for early development of mitigation strategies.
Architecture Design is integral to ensuring that multiple well-supported solutions are
considered. The Architecture Design process supports analysis of design
considerations and enables reasoning about key system aspects and attributes such as
reliability, maintainability, survivability, sustainability, performance, and total ownership
cost.
The functional architecture provides the foundation for defining the system architecture
through the allocation of functions and sub-functions to hardware/software, databases,
facilities, and human operations to achieve its mission. The development of the physical
architecture consists of one or more product structures or views of the physical solution.
The product structure may consist of conceptual design drawings, schematics, and/or
block diagrams that define the system’s form and the arrangement of the system
elements and associated interfaces. The DoD Architecture Framework (DoDAF)
operational and system viewpoints provide one method for developing and describing
the system functional architecture. The development of a physical architecture is an
iterative and recursive process and evolves together with the functional requirements
and functional architecture. Development of the physical architecture is complete when
the system has been decomposed to the lowest system element (usually the lowest
replaceable unit of the support strategy). It is critical that this process identify the design
drivers and driving requirements as early as possible.
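Decomposition to the lowest replaceable unit can be pictured as a walk over the physical architecture tree. The element names below are hypothetical:

```python
# Hypothetical physical architecture: each node maps to its child elements.
architecture = {
    "Air Vehicle": ["Propulsion", "Avionics"],
    "Propulsion": ["Engine LRU"],
    "Avionics": ["Mission Computer LRU", "Radio LRU"],
}

def lowest_replaceable_units(tree, node):
    """Leaves of the decomposition are the lowest replaceable units (LRUs)."""
    children = tree.get(node, [])
    if not children:
        return [node]
    units = []
    for child in children:
        units.extend(lowest_replaceable_units(tree, child))
    return units

print(lowest_replaceable_units(architecture, "Air Vehicle"))
# ['Engine LRU', 'Mission Computer LRU', 'Radio LRU']
```

Decomposition is complete when every branch of this tree terminates in an element at the level the support strategy will replace.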
The Program Manager may oversee Architecture Design efforts to gain and maintain
insights into program schedule and cost drivers for use in evaluation of alternative
architectures, excursions, mitigation approaches, etc.
During this process, derived requirements come from solution decisions. It is essential
to identify derived requirements and ensure that they are traceable and part of the
allocated requirements. The Decision Analysis
process trades off requirements against given solution alternatives. For each solution
alternative, based on programmatic decisions, certain performance requirements may
be emphasized over others. The essence of this activity is to achieve a balanced,
feasible design that carries acceptable risk and falls within the program design
constraints. An integral part of defining and refining the functional and physical
architecture is to provide technical support to market research, especially early in the
acquisition life cycle. Systems engineers should analyze whether existing products
(commercial or non-developmental items) can meet user performance requirements or
whether technologies can realistically be matured within the required time frame. When
possible, mature technologies should be used to satisfy user needs.
The output of this process is the system allocated baseline, which includes the
documentation that describes the physical architecture of the system and the
specifications that describe the functional and performance requirements for each
configuration item along with the interfaces that compose the system. In addition, Work
Breakdown Structures (WBS) and other technical planning documentation are updated.
The system architecture and the resulting design documentation should be sufficiently
detailed.
The result of the Architecture Design process is an architectural design that meets end-
user capability needs, has all stated and derived requirements allocated to lower-level
system elements (as tracked in the Requirements Management process), and is capable
of meeting cost, schedule, and performance objectives. The architectural
design should be readily communicable to the customers and to the design
engineers. The level of detail of the architectural design depends on the complexity of
the system and the support strategy. It should be detailed enough to bound the cost and
schedule of the delivered system, define the interfaces, assure the customers that the
requirements can be met, and control the design process down to the lowest removable
unit to support operations and sustainment. This architecture design may be
documented and found in a program’s system model. Once identified, the system
architecture is placed under configuration management.
The Implementation process involves two primary efforts: design and realization. The
outputs of the Implementation process include the detailed design down to the lowest
level system elements in the system architecture, and the fabrication/production
procedures of forming, joining, and finishing, or coding for software. Depending on
technology maturity, the Implementation process may develop, buy, or reuse system
elements to render the system. Implementation is integral to systematically increasing
maturity, reducing risk, and ensuring the system is ready for Integration, Verification,
and Validation. The Implementation process provides a system that satisfies specified
design and stakeholder performance requirements. As a best practice, the Systems
Engineer should develop an implementation plan including implementation procedures,
fabrication processes, tools and equipment, implementation tolerances, and verification
uncertainties.
Design
Implementation begins in the Materiel Solution Analysis phase, where the Analysis of
Alternatives informs whether the preferred materiel solution can be developed, bought,
or reused. This analysis takes many forms, such as modeling and simulation,
experiments, and prototypes through which competing systems can be assessed.
Careful decisions regarding the design of system elements can enable the use of open
(non-proprietary) standards and an open systems or modular approach that may allow
for resiliency as well as reduce costs and promote competition during development,
production, technology refresh, and life-cycle extension. Design activities may include:
    •    Identify and analyze the constraints that the technology and design and
         realization techniques impose on the design solution
    •    Develop design and implementation prototypes and solutions for the system
         elements
    •    Analyze candidate system element design and implementation solutions and
         conduct variability studies to identify conflicts and resolution alternatives to
         ensure system integrity
    •    Identify fabrication and quality procedures, and document design assumptions
         and decisions in the final system elements drawings or technical data package
Realization
Realization is the process of building the system elements using specified materials and
fabrication and production tools/procedures identified during design. Early fabrication
and production planning is critical for successful realization and delivery of the needed
capability. System elements are built to the product baseline and should meet quality
standards.
The output of the Implementation process is the physical system elements as identified
in the product baseline, including fabrication and production methods.
The Interface Management process is critical to the success of the Integration process.
Interface control specifications should be confirmed early on and placed under strict
configuration control. All of the program’s external interfaces and dependencies should
be documented in the program’s Systems Engineering Plan (SEP). The SEP Outline
requires that all programs with external dependencies and/or interfaces establish
Memoranda of Agreement (MOA) in order to formally establish commitments and
management procedures. A current table showing the status of all MOAs is mandated
as part of the program SEP, which is updated in each phase.
The Program Manager and Systems Engineer are responsible for planning, managing,
and executing the Integration process. Experience has shown that programs that
develop an integration plan are more successful. This plan defines the stages of
integration during which system elements are successively integrated to form higher
level elements and eventually the finished product. Alternative integration paths should
be considered. The integration plan should include a description of the required
Systems Integration Laboratories or other facilities, personnel, test stands, harnesses,
testing software, and integration schedule.
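Staged integration, in which elements combine into successively higher-level elements, can be sketched as a dependency ordering. The stage names are hypothetical; `graphlib` is in the Python standard library (3.9+):

```python
from graphlib import TopologicalSorter

# Hypothetical build-up: each assembly depends on its constituent elements.
stages = {
    "power module": set(),
    "sensor board": set(),
    "sensor assembly": {"sensor board", "power module"},
    "integrated system": {"sensor assembly"},
}

# A valid integration sequence: constituents always precede the
# assemblies that contain them.
order = list(TopologicalSorter(stages).static_order())
print(order)
```

An actual integration plan layers facilities, test stands, and schedule onto such an ordering; the point of the sketch is that alternative integration paths correspond to alternative valid orderings of the same dependency graph.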
Verification provides evidence that the system or system element performs its intended
functions and meets all performance requirements listed in the system performance
specification and functional and allocated baselines. Verification answers the question,
"Did you build the system correctly?" Verification is a key risk-reduction activity in the
implementation and integration of a system and enables the program to catch defects in
system elements before integration at the next level, thereby preventing costly
troubleshooting and rework.
The Program Manager and Systems Engineer manage verification activities and
methods as defined in the functional and allocated baselines, and review the results of
verification. Guidance for managing and coordinating integrated testing activities can be
found in DAG Chapter 9 Test and Evaluation and in DoDI 5000.02.
    •    Analysis. Analysis uses mathematical modeling and analytical techniques (e.g.,
         computer models) to interpret or explain the behavior/performance of the system
         element. Analysis of test data or review and analysis of design data should be
         used as appropriate to verify requirements.
    •    Test. Test is an activity designed to provide data on functional features and
         equipment operation under fully controlled and traceable conditions. The data are
         subsequently used to evaluate quantitative characteristics.
Designs are verified at all levels of the physical architecture through a cost-effective
combination of these methods, all of which can be aided by modeling and simulation.
Verification activities and results are documented among the artifacts for Functional
Configuration Audits (FCA) and the System Verification Review (SVR) (see DAG
section 4.2.14. Functional Configuration Audits/System Verification Review). When
possible, verification should stress the system, or system elements, under realistic
conditions representative of its intended use.
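The pairing of requirements with verification methods and results is often captured in a verification cross-reference matrix (VCRM). A hypothetical sketch:

```python
# Hypothetical VCRM rows; method is one of:
# inspection, analysis, demonstration, test.
vcrm = [
    {"req": "SYS-1.1", "method": "test", "result": "pass"},
    {"req": "SYS-1.2", "method": "analysis", "result": "pass"},
    {"req": "SYS-1.3", "method": "test", "result": None},  # not yet verified
]

def unverified(matrix):
    """Requirements still lacking a passing verification result."""
    return [row["req"] for row in matrix if row["result"] != "pass"]

print(unverified(vcrm))  # ['SYS-1.3']
```

At FCA/SVR, a matrix of this kind (in whatever form the program maintains it) is part of the evidence that the functional and allocated baselines have been verified.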
The individual system elements provided by the Implementation process are verified
through developmental test and evaluation (DT&E), acceptance testing, or qualification
testing. During the Integration process, the successively higher level system elements
may be verified before they move on to the next level of integration. Verification of the
system as a whole occurs when integration is complete. As design changes occur, each
change should be assessed for potential impact to the qualified baseline. This may
include a need to repeat portions of verification in order to mitigate risk of performance
degradation.
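Assessing a design change against the qualified baseline involves identifying every higher-level element that integrates the changed element. A sketch over a hypothetical structure:

```python
# Each element maps to the assembly it integrates into (None = top level).
# All names are hypothetical.
integrates_into = {
    "Radio LRU": "Avionics",
    "Mission Computer LRU": "Avionics",
    "Avionics": "Air Vehicle",
    "Air Vehicle": None,
}

def reverify_set(changed, parent_of):
    """The changed element and everything above it may need re-verification."""
    affected = []
    node = changed
    while node is not None:
        affected.append(node)
        node = parent_of.get(node)
    return affected

print(reverify_set("Radio LRU", integrates_into))
# ['Radio LRU', 'Avionics', 'Air Vehicle']
```

Which of the affected levels actually repeat verification is a risk-based decision; the sketch only bounds the set of candidates.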
Validation provides objective evidence that the capability provided by the system
complies with stakeholder performance requirements and achieves its intended use in its
operational environment. Validation answers the question, "Is it the right solution to the
problem?" Validation consists of evaluating the operational effectiveness, operational
suitability, sustainability, and survivability of the system or system elements under
operationally realistic conditions.
The Program Manager and Systems Engineer are responsible for supporting the
Validation process. The execution of the Validation process is typically conducted by
independent testers as documented in the Test and Evaluation Master Plan (TEMP).
System end users and other stakeholders are typically involved in validation activities.
Guidance for managing and coordinating integrated testing activities can be found in
DAG Chapter 9 Test and Evaluation and DoDI 5000.02. Integrated test teams,
composed of knowledgeable and experienced Government and industry
developmental and operational testers, bring different perspectives and allow for
efficient use of resources.
Transition is the process applied to move any system element to the next level in the
physical architecture. For the end-item system, it is the process to install and field the
system to the user in the operational environment. The end-item system may need to be
integrated with other systems in the operational environment honoring the defined
external interfaces. In this case, the transition process needs to be performed in
conjunction with the integration process and interface management process for a
smooth transition.
Early planning for system transition reduces risk and supports smooth delivery and
rapid acceptance by the system’s end user. Transition considerations should include, as
appropriate, user and maintainer requirements, training, deployability, support tasks,
support equipment, and packaging, handling, storage, and transportation (PHS&T). Part
of the Transition process is ensuring that each site is properly prepared for the receipt,
acceptance, and/or installation of the system.
The Transition process includes maintenance and supportability activities for the
deployed system and its enabling system elements, as well as a process for reporting
and resolving deficiencies. The OUSD(AT&L) memorandum, "Document Streamlining -
Life-Cycle Sustainment Plan (LCSP)" requires that sustainment and support planning be
documented in the LCSP, which is required for all Major Defense Acquisition Programs
and reviewed prior to Milestones A, B, and C, as well as the Full-Rate Production
Decision Review (FRP DR).
The Program Manager, Systems Engineer, and Product Support Manager oversee all
transition plans and activities required to install or deploy the end-item system and
enabling system elements to its operational environment. The Systems Engineer
conducts In-Service Reviews (see DAG section 4.2.17. In-Service Review) and leads all
engineering efforts to correct deficiencies found during transition. Program Managers
should ensure all deliverables, particularly documentation (e.g., drawings, technical
manuals), have been received from the contractor.
Transition activities vary based on life-cycle phase, program scale, and system
complexity.
Addressing the design considerations listed in Table 4.3.18.T1 enables the program to:
    •    Satisfy the unique needs of the program or system (user capabilities, and
         operational performance requirements) while balancing cost and schedule
         constraints, through trade-offs, by addressing the design considerations (as
         mandated in the Systems Engineering Plan (SEP) and DoDI 5000.02) and
         management tools listed in Table 4.3.18.T1 Design Considerations
    •    Enable trade-offs among the design considerations in support of achieving
         desired mission effectiveness within cost and schedule constraints
    •    Translate the end-user desired capabilities into a structured system of
         interrelated design specifications that support delivery of required operational
         capability
    •    Incorporate mandated design considerations into the requirements; some are
         mandated by laws, regulations, or treaties, while others are mandated by the
         domain or Service/Component. These mandates should be incorporated during
         the Requirements Analysis process to achieve balance across all of the system
         requirements
Some design considerations are concepts that assist trade-offs and ought to be
accommodated or applied to each system/program/project. Others are constraints,
boundaries, or limitations, with values that sometimes can be tailored or negotiated, but
which in general represent fairly immovable parts of the trade space. Program
Managers and Systems Engineers should show evidence of critical thinking in
addressing the design considerations, as documented in the program SEP. The
mandated SEP Outline Table 4.6-1 identifies design considerations critical to achieving
the program’s technical requirements and demonstrates that the mandated design
considerations are an integral part of the design decision process, including trade study
criteria.
With the understanding that each design consideration is a discrete item to investigate
during the design process, the Program Manager and Systems Engineer also need to
view design considerations as an integrated set of variables. These variables influence
one another, and stakeholders should consider them in conjunction with one another,
as early as the Analysis of Alternatives, to achieve better mission performance and to
preclude a stovepiped view during design.
Table 4.3.18.T1 Design Considerations

Accessibility (Section 508 Compliance) (DAG section 4.3.18.1.)
    • Statutory Requirement: Section 508 of the Rehabilitation Act of 1973 (as amended, 36 CFR Part 1194)
    • Policy & Guidance: DoDD 8000.01; DoD 8400.01-M; FAR 39.204

Affordability - SE Trade-Off Analysis (DAG section 4.3.18.2.)
    • Policy & Guidance: USD(AT&L) memorandum, "Better Buying Power 2.0: Continuing the Pursuit for Greater Efficiency and Productivity in Defense Spending," November 13, 2012; USD(AT&L) memorandum, "Implementation Directive for Better Buying Power - Restoring Affordability and Productivity in Defense Spending," November 3, 2010; USD(AT&L) memorandum, "Better Buying Power: Guidance for Obtaining Greater Efficiency and Productivity in Defense Spending," September 14, 2010
Anti-Counterfeiting (DAG section 4.3.18.3.)
    • Statutory Requirement: FY2012 National Defense Authorization Act (NDAA)
    • Policy & Guidance: USD(AT&L) memorandum, "Overarching DoD Counterfeit Prevention Guidance," March 16, 2012

Commercial-Off-the-Shelf (COTS) (DAG section 4.3.18.4.)
    • Statutory Requirement: Sections 403 and 431 of title 41, United States Code; Public Law 103-355; Public Law 104-106
    • Policy & Guidance: DoDI 5000.02, Enclosure 2

Corrosion Prevention and Control (CPC) (DAG section 4.3.18.5.)
    • Statutory Requirement: Section 2228 of title 10, United States Code
    • Policy & Guidance: DoDD 5000.01, Enclosure 1, paragraph E1.1.17; DoDI 5000.02, Enclosure 12, paragraph 7; DoDI 5000.67; PDUSD(AT&L) memorandum, "Document Streamlining - Program Strategies and Systems Engineering Plan," April 20, 2011; DoD Corrosion Prevention and Control Planning Guidebook; DFARS 223.73

Critical Safety Item (CSI) (DAG section 4.3.18.6.)
    • Statutory Requirement: Section 802 of Public Law 108-136; Section 130 of Public Law 109-364; Section 2319 of title 10, United States Code
    • Policy & Guidance: DoD 4140.1-R; JACG Aviation CSI Management Handbook; SECNAVINST 4140.2; AFI 20-106; DA Pam 95-9; DLAI 3200.4; DCMA INST CSI (AV) Management of Aviation Critical Safety Items; DFARS 209.270, 246.407, 246.504, 246.371 and 252.246-7003

Demilitarization and Disposal (DAG section 4.3.18.7.)
    • Policy & Guidance: DoDI 4160.28, Volume 1; DoDI 5000.02, Enclosure 2, paragraph 8.c.(2); DoD 4140.1-R; DoD 4160.21-M; MIL-STD-882E

Diminishing Manufacturing Sources and Material Shortages (DMSMS) (DAG section 4.3.18.8.)
    • Policy & Guidance: SD-22
Environment, Safety, and Occupational Health (ESOH) (DAG section 4.3.18.9.)
    • Statutory Requirement: National Environmental Policy Act (NEPA), Sections 4321-4347 of title 42, United States Code; Executive Order 12114, Environmental Effects Abroad of Major Federal Actions
    • Policy & Guidance: DoDI 4715.9; DoDI 5000.02, Enclosure 12; PDUSD(AT&L) memorandum, "Document Streamlining - Program Strategies and Systems Engineering Plan," April 20, 2011; MIL-STD-882E; DFARS 223.73; FAR 23.2, 23.4, 23.7 and 23.8

Human Systems Integration (HSI) (DAG section 4.3.18.10.)
    • Policy & Guidance: DoDD 5000.01, Enclosure 1, paragraph E1.1.29; DoDI 5000.02, Enclosure 8

Insensitive Munitions (DAG section 4.3.18.11.)
    • Statutory Requirement: Section 2389 of title 10, United States Code
    • Policy & Guidance: DoDD 6055.9; Secretary of Defense memorandum, "DoD Policy on Submunition Reliability," January 10, 2001; USD(AT&L) memorandum, "Joint Insensitive Munitions Test Standards and Compliance Assessment," February 10, 2010; USD(AT&L) memorandum, "Insensitive Munitions Strategic Plans," July 21, 2004; DoD Acquisition Manager's Handbook for Insensitive Munitions, Revision 02, November 2008

Intelligence (Life-cycle Mission Data Plan (LMDP)) (DAG section 4.3.18.12.)
    • Policy & Guidance: DoDD 5250.01

Interoperability and Dependency (I&D) (DAG section 4.3.18.13.)
    • Statutory Requirement: Public Law 104-106; Section 3506 of title 44, United States Code
    • Policy & Guidance: DoDD 4630.05; DoDD 5000.01; DoDI 2010.06; DoDI 4630.8; DoDI 5000.02; CJCSI 3170.01; CJCSI 6212.01; JCIDS Manual
Item Unique Identification (IUID) (DAG section 4.3.18.14.)
    • Policy & Guidance: DoDD 8320.03; DoDI 4151.19; DoDI 4140.01; DoDI 5000.02, Enclosure 12, paragraph 10; DoDI 5000.64; DoDI 8320.04; PDUSD(AT&L) memorandum, "Document Streamlining - Program Strategies and Systems Engineering Plan," April 20, 2011; DoD Guide to Uniquely Identifying Items, Version 2.5, September 15, 2012; DoD Guidelines for Engineering, Manufacturing and Maintenance Documentation Requirements, April 20, 2007; DFARS 211.274-2, 252.211-7003, 252.211-7007

Open Systems Architecture (OSA) (DAG section 4.3.18.15.)
    • Statutory Requirement: Section 2430 of title 10, United States Code
    • Policy & Guidance: DoDI 5000.02, Enclosure 12, paragraph 8; DoD 5010.12-M; USD(AT&L) memorandum, "Better Buying Power 2.0: Continuing the Pursuit for Greater Efficiency and Productivity in Defense Spending," November 13, 2012

Operational Energy (DAG section 4.3.18.16.)
    • Statutory Requirement: Section 138c of title 10, United States Code
    • Policy & Guidance: CJCSI 3170.01; JCIDS Manual

Packaging, Handling, Storage and Transportation (PHS&T) (DAG section 4.3.18.17.)
    • Statutory Requirement: Title 49 of the Code of Federal Regulations (49 CFR)
    • Policy & Guidance: DoDI 4540.07; DoD 4145.19-R; DoD 4140.27-M; DTR 4500.9-R

Producibility, Quality & Manufacturing (PQM) (DAG section 4.3.18.18.)
    • Statutory Requirement: Section 812 of the National Defense Authorization Act for FY2011
    • Policy & Guidance: DoDI 5000.02, Enclosure 2; DFARS 207.105, 215.304
Reliability & Maintainability (R&M) Engineering (DAG section 4.3.18.19.)
    • Statutory Requirement: Public Law 111-23, Weapon Systems Acquisition Reform Act of 2009
    • Policy & Guidance: DoDI 5000.02, Enclosure 12; DTM 11-003; PDUSD(AT&L) memorandum, "Document Streamlining - Program Strategies and Systems Engineering Plan," April 20, 2011; DoD Reliability, Availability, Maintainability, and Cost Rationale (RAM-C) Report Manual

Spectrum Management (DAG section 4.3.18.20.)
    • Statutory Requirement: Sections 305 and 901-904 of title 47, United States Code; Section 104 of Public Law 102-538
    • Policy & Guidance: DoDD 3222.3; DoDI 4650.01; DoDI 5000.02, Enclosure 12, paragraph 11; AR 5-12; AFI 33-118; OPNAVINST 2400.1 and 2400.2; OPNAVINST 2400.20F

Standardization (DAG section 4.3.18.21.)
    • Statutory Requirement: Sections 2451-2457 of title 10, United States Code; Public Law 82-436
    • Policy & Guidance: DoDI 4120.24; DoD 4120.24-M; SD-19

Supportability (DAG section 4.3.18.22.)
    • Policy & Guidance: DoDD 5000.01, Enclosure 1, paragraphs E1.1.17, E1.1.29; DoDI 4151.22; PDUSD(AT&L) memorandum, "Document Streamlining - Life-Cycle Sustainment Plan (LCSP)," September 14, 2011; DoD 4140.1-R; DoD 4151.22-M; SD-19; MIL-HDBK-502

Survivability (including CBRN) & Susceptibility (DAG section 4.3.18.23.)
    • Policy & Guidance: DoDI 3150.09; DoDI 5000.02, Enclosures 6 and 8

System Security Engineering (SSE) (DAG section 4.3.18.24.)
    • Statutory Requirement: Section 2358 of title 10, United States Code
    • Policy & Guidance: DoDI 5000.02, Enclosure 4; DoDI 5200.39; DoDI 5200.44; DoDI 8500 series; PDUSD(AT&L) memorandum, "Document Streamlining - Program Protection Plan (PPP)," July 18, 2011; Program Protection Plan Outline and Guidance, Version 1.0, July 2011
4.3.18.1. Accessibility (Section 508 Compliance)
All Electronic and Information Technology (E&IT) systems must comply with Section 508
of the Rehabilitation Act (as amended, 36 CFR Part 1194), unless exempt under FAR
39.204 as a military system or National Security System. Compliance with Section 508
provides access by Federal employees with disabilities and the public to information
and data that able-bodied persons can access through E&IT systems. Section 508
should be considered as a design requirement, addressed at each technical review, and
clearly stated in the Acquisition Strategy and Systems Engineering Plan.
Program Managers should ensure Section 508 compliance, unless exempt, while
Systems Engineers are responsible for implementation through use of standards and
compliant tools and products.
4.3.18.2. Affordability - Systems Engineering Trade-Off Analyses
Affordability is the degree to which the capability benefits are worth the system's total
life-cycle cost and support DoD strategic goals. Systems engineering (SE) trade-off
analyses for affordability, a special application of the Decision Analysis process (see
DAG section 4.3.3. Decision Analysis Process), support the establishment of a realistic
affordability target, serve as inputs to the will cost and should cost estimates, and
enable continuous monitoring of affordability estimates across the system life cycle. SE
trade-off analyses should incorporate continuous improvement, value engineering, and
Lean Six Sigma.
Although not a mandated Key Performance Parameter (KPP), the affordability target is
managed throughout the system life cycle as a system KPP and cannot be changed
without Milestone Decision Authority (MDA) approval. The USD(AT&L) memorandum
"Implementation Directive for Better Buying Power Restoring Affordability and
Productivity in Defense Spending" requires the program to establish an affordability
target at Milestone A. This affordability target forms the basis for the SE trade-off and
sensitivity analyses that are conducted in support of Milestone B and subsequent
reviews. The affordability target is nominally the average unit acquisition cost and
average annual operations and support cost per unit. For an indefinite quantity of
production units, the affordability target may be the total acquisition cost (see DAG
Chapter 3 Affordability and Life-Cycle Resource Estimates for more information
regarding the affordability target).
The independently generated will cost estimate is used to defend the system budget but
does not account for potential efficiencies. The should cost estimate is based on
efficient use of resources and effective implementation of processes, and is the focus of
SE activities and program management decisions across the life cycle.
The SE trade-offs are conducted among cost, schedule, and performance objectives to
ensure the program is affordable. The Program Manager should identify the design
performance points that are the focus of trade-off analyses to establish cost and
schedule trade space. The Program Manager presents the results of the trade-off
analyses at program milestone/technical reviews, showing how the affordability target
varies as design performance and schedules are varied (affordability drivers) and
demonstrating how the cost-effective design point is established for the program.
The Program Manager and Systems Engineer use the results of SE trade-off analyses
for affordability to inform system requirements and ensure that, when taken collectively,
the requirements are compelling, affordable, and achievable within the time frame
available to the program. These requirements are normally characterized by creative
alternatives, reliable information and models, well-reasoned aggregation techniques,
and a sound recommendation and action plan.
The trade-off analyses are executed by a resourced team that consists of a decision
maker with full responsibility, authority, and accountability for the trade at hand, a trade-
off analyst with a suite of reasoning tools, subject matter experts with performance
models, and a representative set of end users and other stakeholders.
Throughout the system life cycle, the Systems Engineer continuously monitors
affordability drivers, identifies opportunities to reduce life-cycle costs, and conducts
trade-off analyses as needed to meet program cost, schedule, and performance
requirements.
4.3.18.3. Anti-Counterfeiting
Counterfeit parts are becoming pervasive in various supply chains and therefore have
become a significant threat to the Defense supply chain. Counterfeiters' motives are
primarily greed (profit) and/or malicious intent. Counterfeits may appear at all phases of
the life cycle, making it necessary for the Program Manager, Systems Engineer, and
Product Support Manager to plan for prevention, detection, remediation, reporting, and
restitution activities from the beginning of the life cycle to disposal and demilitarization.
Design Consideration: Relationship to Anti-Counterfeiting

•    Critical Safety Items (CSI): From an anti-counterfeiting risk-based approach, CSIs
     should be more carefully scrutinized to ensure no counterfeits infiltrate the supply
     chain.
•    Demilitarization and Disposal: An excellent source for counterfeiters to obtain parts
     that can be turned into "used sold as new" parts (fraudulently certified as new).
•    Diminishing Manufacturing Sources and Material Shortages (DMSMS): As systems
     age and the trustworthy sources for piece parts dry up, counterfeiters increasingly
     take advantage of the situation by offering a source for hard-to-find parts.
•    Environment, Safety, and Occupational Health (ESOH): Examples of counterfeit
     materials that can increase ESOH risks include false R-134 refrigerant, which
     produces explosive by-products; fire extinguishers compressed with air; and faulty
     smoke detectors. Furthermore, the Restriction of Hazardous Substances (RoHS)
     directive (2002/95/EC) has led to increased numbers of counterfeits in which a
     lead-free (Pb-free) microcircuit is sold as having tin-lead (SnPb) leads.
•    Item Unique Identification (IUID): Successful implementation of IUID could reduce
     the ability of counterfeiters to introduce parts into supply. Conversely, IUID may
     provide a false sense of security if it can be duplicated by counterfeiters.
•    Open Systems Architecture (OSA): OSA could provide a means to quickly certify a
     newer, more available part for use in weapon systems, thus reducing the impact of
     DMSMS. Conversely, it could also result in more part numbers (equivalents) being
     introduced into supply, thus increasing the likelihood of counterfeit intrusion.
•    Producibility, Quality, and Manufacturing (PQM): PQM can be severely degraded if
     supply is contaminated with counterfeits.
•    Reliability and Maintainability Engineering: Counterfeits that get past receipt
     inspection and test can have radically different reliability and failure modes than the
     authentic part.
•    Supportability: Increased failure rates due to counterfeits can have a negative
     impact on supportability, might drive the wrong problem-resolution behaviors, and
     can increase sustainment costs.
•    System Security Engineering (SSE): SSE implements anti-counterfeit protection
     measures as part of a comprehensive plan to protect CPI and mission-critical
     functions and components.
During development of the Systems Engineering Plan (SEP), the Program Manager,
Systems Engineer, and Product Support Manager should consider these relationships
and develop plans to address the threat.
4.3.18.4. Commercial-Off-the-Shelf
The primary benefits of using COTS components in system design are to:
    •    The graphical user interface (GUI) design may not completely support user tasks,
         which can cause inefficient workarounds and improper use of the system by the
         user
The marketplace drives COTS product definition, application, and evolution. COTS
products presume a flexible architecture and often depend on product releases that are
designed to be used "as is" to meet general business needs and not a specific
organization's needs. The commercial product life cycle is usually much shorter than the
equivalent military product life cycle. Programs should consider the potential availability
of suitable replacement and/or alternative items throughout the longer, military life cycle,
and should monitor the commercial marketplace through market research activities and
ongoing alignment of business and technical processes. This necessary activity
imposes additional cost, schedule, and performance risks that the acquisition
community should plan for. COTS products should be evaluated to verify that they meet
all performance and reliability requirements under the environmental conditions and over
the service life specified in the intended application's requirements documents.
The Federal Acquisition Streamlining Act (FASA) of 1994 (Public Law 103-355) and the
Clinger-Cohen Act (Public Law 104-106) both endorse the use of COTS products by the
Federal Government but have slightly different definitions, with the latter allowing for
modifications to COTS.
The Systems Engineer should ensure open system design; identification and mitigation
of Environment, Safety, and Occupational Health (ESOH) and security risks; and
survivable technology insertion or refresh throughout the projected system life cycle.
The Program Manager and Systems Engineer should consider the following when
evaluating use of COTS products:
    •    The intended product use environment and the extent to which this environment
         differs from (or is similar to) the commercial use environment
    •    Integration, documentation, security, Human System Integration, ESOH,
         hardware/software integrity, reliability risk, operational environment, and
         corrosion susceptibility/risk, etc.
    •    Planning for life-cycle activities (including sustainment, supply chain risks,
         obsolescence, and disposal)
    •    Developing relationships with vendors and considering Foreign Ownership,
         Control, or Influence (FOCI) (see Defense Security Service for the latest policy
         regarding COTS from FOCI sources)
    •    Supportability, if vendor or marketplace changes occur
    •    Test and evaluation of COTS items (including early identification of screening,
         functionality testing and usability assessments) (See DAG Chapter 9 Test and
         Evaluation, Chief Development Tester)
    •    Protecting intellectual property rights by being aware of pertinent intellectual
         property right issues associated with commercial items acquisitions, especially
         with the acquisition of commercial software products. When acquiring Intellectual
         Property (IP) license rights, the acquisition community should consider the core
         principles described in the DoD guide: "Intellectual Property: Navigating through
         Commercial Waters."
    •    Ability to modify or interface COTS software with other software even if
         Government generated or owned
    •    Ability to have insight into configuration management, and the features and
         functions of upgrades and changes
    •    Ability to instrument and/or test aspects of COTS products
4.3.18.5. Corrosion Prevention and Control
The corrosion of military equipment and infrastructure within the DoD has been
documented to cost approximately $23 billion annually. In addition to its significant
financial impact, corrosion can also adversely affect system availability and ESOH.
Therefore, it is extremely important to plan for and implement corrosion prevention and
mitigation as early as possible in the acquisition life cycle (even prior to Milestone A) to
minimize the life-cycle impact.
The execution of a program’s Corrosion Prevention and Control (CPC) planning should
contribute to reduced corrosion vulnerability with lower life-cycle costs; and improved
ESOH, maintainability, and availability.
Section 2228 of title 10, United States Code, requires planning and execution of
corrosion prevention and mitigation in DoD systems. Accordingly, DoDI 5000.02 and
DoDI 5000.67 require corrosion prevention and control planning for all acquisition programs
across the life cycle. Elements of good CPC engineering include, but are not limited to,
the following:
engineering expertise, are available throughout the program and that corrosion
performance is considered appropriately during design trades. The Systems Engineer,
supported by CPC subject matter experts, is responsible for identifying corrosion
concerns and developing mitigation strategies within the whole system design and
operational construct.
All designated Acquisition Category (ACAT) programs are required to accomplish CPC
planning across their life cycle, with ACAT I programs required to formally document
this planning in an approved CPC Plan delivered at Milestones B and C. In addition, the
DoD has developed the Corrosion Prevention and Control Planning Guidebook as a
resource to assist the Program Manager, Systems Engineers, and other program staff
in the development of a robust CPC program.
For all ACAT programs, CPC engineering should be reflected in various program
documents, including, but not limited to:
In the contract and RFP, CPC planning should be addressed in some fashion in the
technical content of each contract/RFP Section and subsection, including, but not
limited to the Statement of Work (SOW), IMP/IMS, CDRL, and system performance
specification (see DAG section 4.1.6. SE Role in Contracting).
4.3.18.6. Critical Safety Item
A Critical Safety Item (CSI) is a part, assembly, or support equipment whose failure could
cause loss of life, permanent disability or major injury, loss of a system, or significant
equipment damage. Special attention should be placed on CSIs to prevent the potential
catastrophic or critical consequences of failure. Significant problems have occurred when
DoD purchased CSIs from suppliers with limited knowledge of the item’s design intent,
application, failure modes, failure effects, or failure implications. The definition of CSI is
not to be confused with the MIL-STD-882E definition of a Safety Critical Item (SCI). An
SCI is "a hardware or software item that has been determined through analysis to
potentially contribute to a hazard with catastrophic or critical mishap potential, or that
may be implemented to mitigate a hazard with catastrophic or critical mishap potential."
The purpose of CSI analysis is to ensure that Program Managers for DoD acquisition
programs who enter into contracts involving CSIs do so only with resources approved
by the Design Control Activity (DCA). The DCA is defined by law as the systems
command of a military department. The DCA is responsible for the airworthiness or
seaworthiness certification of the system in which a CSI is used.
The intent of CSI laws, policies, regulations, and guidance is to avoid hazards by
mitigating the receipt of defective, suspect, improperly documented, unapproved, and
fraudulent parts having catastrophic potential. These statutory requirements are
contained in section 802 of Public Law 108-136, enacted to address aviation CSIs, and
section 130 of Public Law 109-364, enacted to address ship CSIs, embedded in section
2319 of title 10, United States Code. The statute addresses three specific areas:
    •    Establish that the DCA is responsible for processes concerning the management
         and identification of CSIs used in procurement, modification, repair, and overhaul
         of aviation and ship systems.
    •    Require that DoD work only with sources approved by the DCA for contracts
         involving CSIs.
    •    Require that CSI deliveries and services performed meet all technical and quality
         requirements established by the DCA.
CSI policies and guidance ensure that items of supply that are most critical to
operational safety are rigorously managed and controlled in terms of:
    •    Supplier capability
    •    Conformance to technical requirements
    •    Controls on changes or deviations
    •    Inspection, installation, maintenance, and repair requirements
DoD 4140.1-R, DoD Supply Chain Materiel Management Regulation, establishes top-level
procedures for the management of aviation CSIs. The Joint Aeronautical
Commanders Group issued the Aviation Critical Safety Items (CSIs) Management
Handbook. This guidance establishes standard user-level operating practices for
aviation CSIs across the Services, the Defense Logistics Agency (DLA), the Defense
Contract Management Agency (DCMA), and other Federal agencies. Appendix I of the
Aviation CSI Management Handbook is a joint Military Service/Defense Agency
instruction on "Management of Aviation Critical Safety Items" issued on January 25,
2006. This instruction (SECNAVINST 4140.2, AFI 20-106, DA Pam 95-9, DLAI 3200.4,
and DCMA INST CSI (AV)) addresses requirements for identifying, acquiring, ensuring
quality, managing, and disposing of aviation CSIs. Similar policies and guidance are
being developed and/or revised to address ship CSIs as defined by public law.
The Defense Federal Acquisition Regulation Supplement (DFARS) was amended to
implement the contractual aspects regarding aviation CSIs. Comparable DFARS
amendments are being developed to address ship CSIs. DFARS 209.270 states that
the DCA is responsible to:
This supplement states that the contracting activity contracts for aviation CSIs only with
suppliers approved by the DCA. Program Managers should coordinate with the
contracting activity to ensure that they contract for aviation CSIs only with suppliers
approved by the DCA and that nonconforming aviation CSIs are to be accepted only
with the DCA’s approval, as required by DFARS 246.407. DFARS 246.407 was
amended to state that DCA authority can be delegated for minor nonconformance.
DFARS 246.504 requires DCA concurrence before certificates of conformance are
issued to accept aviation CSIs.
Because the system developer may uncover problems with products after items are
delivered, DFARS 246.371 and 252.246-7003 require the developer to notify the
procuring and contracting officers within 72 hours after discovering or obtaining credible
information that a delivered CSI may have discrepancies that affect safety. Program
Managers should coordinate with the contracting authority to be kept aware of materiel
recalls and shortfalls that may impact production rates and sustainment.
The CSI list evolves as the design, production processes, and supportability analyses
mature. Program Managers identify and document CSIs during design and development
to influence critical down-stream processes such as initial provisioning, supply support,
and manufacturing planning to ensure adequate management of CSIs throughout a
system’s Operations and Support (O&S) phase. The Program Manager should make
provisions for developers, including original equipment manufacturer (OEM) contractors,
to deliver an initial allocated baseline at the Preliminary Design Review (PDR), to
include an initial list of proposed CSIs and a proposed process for selecting and
approving CSIs as well as addressing the critical characteristics of those items. Prior to
the Critical Design Review (CDR), the program office, with support from the DCA and
developer/OEM contractors, should ensure there is a clear understanding of CSI
processes, terms, and criteria. The initial product baseline is delivered at CDR and at
that time the program should have 100% of drawings completed for the CSIs.
Throughout Low-Rate Initial Production (LRIP) (if applicable), conduct of the Physical
Configuration Audit (PCA), and establishment of the final product baseline, the program
should update the CSI list and review it to ensure the list reflects the delivered system.
Before the Full-Rate Production / Full Deployment Decision Review (FRP/FD DR), a
final CSI list should be documented and approved by the DCA.
4.3.18.7. Demilitarization and Disposal
The incorporation of demilitarization (DEMIL) and disposal requirements into the initial
system design is critical to ensure compliance with:
Program Managers and Program Support Managers should ensure, as an essential part
of systems engineering, that DEMIL and disposal requirements are incorporated in
system design to minimize DoD’s liabilities, reduce costs, and protect critical program
information and technology. This includes integrating DEMIL and disposal into the
allocated baseline approved at the Preliminary Design Review (PDR) and refining
DEMIL and disposal requirements in the initial product baseline at the Critical Design
Review (CDR). DEMIL and disposal requirements are included in the program’s
Systems Engineering Plan (SEP), the Life-Cycle Sustainment Plan (LCSP), and the
contract(s). For munitions programs, DEMIL and disposal documentation needs to be in
place before the start of Developmental Test and Evaluation.
DEMIL renders safe and eliminates functional capabilities and inherent military design
features from both serviceable and unserviceable DoD materiel. It is the act of
destroying the military offensive or defensive advantages inherent in certain types of
equipment or material. DEMIL may include mutilation, scrapping, melting, burning or
alteration designed to prevent the further use of this equipment and material for its
originally intended military or lethal purpose. Systems Engineers integrate DEMIL
considerations into system design to recover critical materials and protect assets,
information, and technologies from uncontrolled or unwanted release, disruption, or
reverse engineering. Program Managers should ensure the DEMIL of materiel is
accomplished in accordance with DoDI 4160.28, DoD Demilitarization (DEMIL)
Program.
Supply Chain Materiel Management Regulation and DoD 4160.21-M, Defense Materiel
Disposition Manual.
The program’s plan for demilitarization and disposal of DoD excess and surplus
property protects the environment and personnel and minimizes the need for
abandonment or destruction. During systems design, the Systems Engineer supports
the Program Manager’s plans for the system’s demilitarization and disposal, through the
identification and documentation of hazards and hazardous materials related to the
system, using MIL-STD-882E, DoD Standard Practice for System Safety. Early,
balanced analyses of ESOH hazards relative to the system’s design enable the
Program Manager to make informed decisions based on alternatives and provide a
clear understanding of trade-offs and consequences, both near term and over the
system’s life cycle.
4.3.18.8. Diminishing Manufacturing Sources and Material Shortages
The Systems Engineer should be aware of and consider DMSMS management during
system design. The following are several practices that the program should consider to
minimize DMSMS risk throughout the life cycle of the system:
    •    Avoid selecting technology and components that are near the end of their
         functional life
    •    During the design process, proactively assess the risk of parts obsolescence
         while selecting parts
    •    When feasible, use an Open Systems Architecture (OSA) to enable technology
         insertion/refreshment more easily than with design-specific approaches
    •    Proactively monitor supplier bases to prevent designing in obsolescence;
         participate in cooperative reporting forums, such as the Government-Industry
         Data Exchange Program (GIDEP), to reduce or eliminate expenditures of
         resources by sharing technical information essential during research, design,
         development, production and operational phases of the life cycle of systems,
         facilities and equipment
    •    Proactively monitor potential availability problems to resolve them before they
         cause an impact in performance readiness or spending
    •    Mitigation of program cost and schedule risks from actions that cause damage to
         people, equipment, or the environment
    •    Reduction of Operations and Support and disposal costs
    •    Provision of a safe, suitable, supportable, and sustainable capability able to
         operate world-wide
Throughout each acquisition phase, programs conduct ESOH analyses to:
    •    Identify and mitigate potential risks to the system and its associated personnel
    •    Manage ESOH design considerations from the beginning of the SE effort
    •    Plan for compliance with the National Environmental Policy Act (NEPA) and
         Executive Order (EO) 12114, Environmental Effects Abroad of Major Federal
         Actions
    •    Ensure compliance with statutory ESOH requirements
Efforts to identify and analyze hazards, and mitigate ESOH risks provide information
needed for informed design decisions and development of ESOH-related
documentation for milestone decisions.
DoD defines ESOH in MIL-STD-882E, DoD Standard Practice for System Safety as "the
combination of disciplines that encompass the processes and approaches for
addressing laws, regulations, EOs, DoD policies, environmental compliance, and
hazards associated with environmental impacts, system safety (e.g., platforms,
systems, system-of-systems, weapons, explosives, software, ordnance, combat
systems), occupational safety and health, hazardous materials management, and
pollution prevention."
ESOH System Design Requirements
The Systems Engineer identifies the ESOH requirements applicable to the system
throughout its life cycle from statutes, regulations, policies, design standards, and
capability documents. From these requirements, the Systems Engineer should derive
ESOH design requirements and include them in capability documents, technical
specifications, solicitations, and contracts.
The Acquisition Program Office’s ESOH-specific documents are the Programmatic ESOH
Evaluation (PESHE) and the NEPA/EO 12114 Compliance Schedule. The Systems
Engineering Plan (SEP) contains the ESOH management planning information. The
SEP, PESHE, and NEPA/EO 12114 Compliance Schedule provide inputs to program
documentation that include, but are not limited to: Technology Development Strategy
(TDS), Test and Evaluation Strategy (TES), Test and Evaluation Master Plan (TEMP),
Life-Cycle Sustainment Plan (LCSP), Corrosion Prevention and Control Plan (CPCP),
system specifications, solicitations, and contracts; and capability documents.
The SEP contains ESOH design considerations as an integral part of the requirements
analysis process, including trade study criteria. ESOH design considerations are
particularly important for Milestone A to ensure SE addresses ESOH during the
Technology Development (TD) phase, which includes a significant amount of the design
development, testing, and the Preliminary Design Review. SEP Table 4.6-1 includes the
information listed in Table 4.3.18.9.T1. Additional ESOH details are provided in SEP
Sections 3.4 and 3.6; Tables 2.2-1, 3.4.4-1, 3.4.4-2, and 4.4-1; and Figure 3.4.1-1.
The PESHE documents the ESOH design consideration data produced by executing
the ESOH planning described in the SEP. The PESHE includes, but is not limited to:
    •    ESOH Risk Matrices (for hardware and software) used by the program with
         definitions for severity categories, probability levels, risk levels, and risk
         acceptance and user representative concurrence authorities.
    •    The following data for each hazard: Hazard Tracking System (HTS) identification
         number, hazard description, potential mishap, initial Risk Assessment Code
         (RAC) and risk level, mitigation measure(s) and funding status, target RAC and
         risk level, current RAC and risk level, and risk acceptance and user concurrence
         status (NOTE: providing an electronic copy of the current data from the HTS
         would satisfy this requirement).
    •    The following data for each hazardous material, hazardous waste, and pollutant
         associated with the system: the specific uses, locations, quantities, and plans for
         their minimization and/or safe disposal (NOTE: providing an electronic copy of
         the current data from either the HTS (if it includes this information) or the
         hazardous materials management data would satisfy this requirement).
    •    Environmental impact information not included in the HTS or hazardous materials
         tracking system needed to support installation and range analyses.
NOTE: The results of the sustainability analysis (see DAG section 4.3.19.2.
Sustainability Analysis) should be used to inform the hazard analysis.
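The per-hazard data fields called for above map naturally onto a record in a Hazard Tracking System. The following is a minimal sketch; the field names and structure are illustrative assumptions on our part, not a mandated HTS schema:

```python
# Minimal sketch of a Hazard Tracking System (HTS) record holding the
# per-hazard data fields the PESHE calls for. Field names are illustrative
# assumptions, not a mandated schema.
from dataclasses import dataclass, field

@dataclass
class HazardRecord:
    hts_id: str                   # HTS identification number
    description: str              # hazard description
    potential_mishap: str
    initial_rac: str              # e.g., "1C" (severity category 1, probability C)
    initial_risk_level: str       # e.g., "High"
    mitigations: list = field(default_factory=list)  # measures and funding status
    target_rac: str = ""
    target_risk_level: str = ""
    current_rac: str = ""
    current_risk_level: str = ""
    risk_accepted: bool = False          # formal risk acceptance status
    user_rep_concurrence: bool = False   # user representative concurrence status

# Hypothetical example record
hazard = HazardRecord(
    hts_id="HTS-0042",
    description="Hydraulic fluid leak near hot surface",
    potential_mishap="Fire during ground maintenance",
    initial_rac="1C",
    initial_risk_level="High",
)
hazard.mitigations.append({"measure": "Shielded hydraulic lines", "funded": True})
```

Keeping these fields current in the HTS also satisfies the reporting need later in this section, since an electronic copy of the current HTS data may be provided in lieu of restating it in the PESHE.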
DoDI 5000.02, Enclosure 12 requires that each program maintain a NEPA/EO 12114
compliance schedule. This schedule includes, but is not limited to:
Because actions occurring during the TD phase may require NEPA/EO 12114
compliance, the program should develop a TD Compliance Schedule for inclusion in the
SEP. DoDI 5000.02, Enclosure 12 also requires programs to support other organizations’
NEPA/EO 12114 analyses involving their systems.
                              Table 4.3.18.9.T2. ESOH Activities by Phase
The Systems Engineer uses the MIL-STD-882E process to identify and assess hazards
(to include software safety), eliminate hazards where possible, and manage ESOH risks
where hazards cannot be eliminated. MIL-STD-882E provides a matrix and defines
probability and severity criteria to categorize ESOH risks. Prior to exposing people,
equipment, or the environment to known system-related hazards, the Systems Engineer
ensures ESOH risks are formally accepted, which includes formal concurrence on high
and serious risks by the designated user representative as defined in MIL-STD-882E (or
by each participating Service user representative in a Joint program). DoDI 5000.02
identifies the appropriate management level authorized to accept ESOH risks.
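The matrix-based categorization can be sketched as a simple lookup. The severity categories, probability levels, and risk-level groupings below follow the MIL-STD-882E risk assessment matrix, but any real implementation should be verified against Table III of the standard:

```python
# Sketch of a MIL-STD-882E style risk lookup. The groupings below follow the
# standard's risk assessment matrix; verify against Table III of MIL-STD-882E
# before relying on them.

SEVERITY = {"Catastrophic": "1", "Critical": "2", "Marginal": "3", "Negligible": "4"}
PROBABILITY = {"Frequent": "A", "Probable": "B", "Occasional": "C",
               "Remote": "D", "Improbable": "E", "Eliminated": "F"}

# Risk Assessment Codes (severity category + probability level) grouped by risk level.
RISK_LEVELS = {
    "High":    {"1A", "1B", "1C", "2A", "2B", "3A"},
    "Serious": {"1D", "2C", "2D", "3B", "3C"},
    "Medium":  {"1E", "2E", "3D", "3E", "4A", "4B"},
    "Low":     {"4C", "4D", "4E"},
}

def risk_level(severity: str, probability: str) -> str:
    """Return the ESOH risk level for a severity category and probability level."""
    rac = SEVERITY[severity] + PROBABILITY[probability]
    for level, codes in RISK_LEVELS.items():
        if rac in codes:
            return level
    return "Eliminated"  # probability level F: the hazard has been removed

print(risk_level("Catastrophic", "Occasional"))  # RAC 1C -> "High"
```

A hazard assessed as 1C (Catastrophic/Occasional) falls in the High band and would therefore require formal acceptance at the management level identified in DoDI 5000.02, with user representative concurrence.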
For Joint programs, the Component Acquisition Executive of the Lead Executive
Component should be the acceptance authority for high risks. The program
documents formal risk acceptances as part of the program record (e.g., Hazard
Tracking System (HTS)). If a risk level increases for a hazard, a new risk acceptance is
required prior to exposing people, equipment, or the environment to the increased risk.
The program also participates in system-related mishap investigations to assess
contributing hazards, risks, and mitigations.
DoDI 5000.02, Enclosure 12 requires programs to report the status of current high and
serious ESOH risks at program reviews and fielding decisions and the status of all
ESOH risks at technical reviews. The purpose of this reporting is to inform the Milestone
Decision Authority (MDA), Program Executive Office (PEO), Program Manager, and end
user about trades being made and ESOH risks that need to be accepted. Each ESOH
risk report includes the following: the hazard, potential mishap, initial Risk Assessment
Code (RAC) and risk level, mitigation measure(s) and funding status, target RAC and
risk level, current RAC and risk level, and risk acceptance / user representative
concurrence status.
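The report fields listed above can be captured in a simple record structure; the sketch below, including its field names and sample values, is illustrative only and not a prescribed format:

```python
from dataclasses import dataclass

@dataclass
class ESOHRiskReport:
    """Illustrative record mirroring the ESOH risk report fields in the text."""
    hazard: str
    potential_mishap: str
    initial_rac: str            # initial Risk Assessment Code, e.g. "1C"
    initial_level: str
    mitigations: list           # mitigation measure(s)
    mitigations_funded: bool    # funding status
    target_rac: str
    target_level: str
    current_rac: str
    current_level: str
    risk_accepted: bool
    user_rep_concurred: bool

# Hypothetical entry for a single hazard.
report = ESOHRiskReport(
    hazard="fuel vapor accumulation in equipment bay",
    potential_mishap="fire/explosion during maintenance",
    initial_rac="1C", initial_level="High",
    mitigations=["redesign bay venting", "revise maintenance procedure"],
    mitigations_funded=True,
    target_rac="1E", target_level="Medium",
    current_rac="1D", current_level="Serious",
    risk_accepted=False, user_rep_concurred=False,
)
print(report.current_level)  # "Serious"
```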
The Systems Engineer manages hexavalent chromium usage in systems to balance the
requirements for corrosion control and prevention and the procedures in DFARS
Subpart 223.73 - Minimizing the Use of Hexavalent Chromium. For more information on
chemicals/materials of evolving regulatory concern, refer to the DENIX website.
The Program Manager, in concert with the user and the T&E community, provides
safety releases (to include formal ESOH risk acceptance in accordance with DoDI
5000.02, Enclosure 12, Section 6) to the developmental and operational testers before
any test using personnel. The safety release addresses each system hazard present
during the test and includes formal risk acceptance for each hazard. The program’s
safety release is in addition to any test range safety release requirements, but it should
support test range analyses required for a range-generated test release. The program
documents safety releases as part of the Program Record.
The Program Manager should provide a transmittal letter to the involved test
organization with a detailed listing of the system hazards germane to the test that
includes the current risk level and documented risk acceptance along with information
on all implemented mitigations.
In an effort to enhance and sustain mission readiness over the system life cycle, reduce
reliance on resources, and reduce the DoD footprint, programs should follow the
policy and procedures identified in the DoD Green Procurement Program (GPP). GPP
benefits include:
Key Resources
4.3.18.10. Human Systems Integration
Systems engineering (SE) addresses the three major elements of each system:
hardware, software, and human. SE integrates human capability considerations with the
other specialty engineering disciplines to achieve total system performance
requirements by factoring into the system design the limitations of the human users.
During system design, the Systems Engineer should apply Human Systems Integration
(HSI) and Human Factors Engineering (HFE) design criteria, principles, and practices
described in MIL-STD-1472, Human Engineering and MIL-STD-46855A, Human
Engineering Requirements for Military Systems, Equipment and Facilities.
The HSI effort minimizes ownership costs and ensures the system is built to
accommodate the human performance characteristics of users who operate, maintain,
and support the total system. The total system includes not only the mission equipment
but also the users, the training and training devices, and the operational and support
infrastructure.
The Program Manager has overall responsibility for integrating the HSI effort into the
system program. These responsibilities are described in DAG Chapter 6 Human
Systems Integration.
The Systems Engineer supports the Program Manager and is responsible for HSI. The
Systems Engineer should work with the manpower, personnel, training, safety, health,
habitability, personnel survivability, and HFE stakeholders to develop the HSI effort. The
Systems Engineer translates and integrates those human capability considerations, as
contained in the capabilities documents, into quantifiable system requirements.
Requirements for conducting HSI efforts should be specified for inclusion in the
Statement of Work and contract and included in the Systems Engineering Plan (SEP),
specifications, the Test and Evaluation Master Plan (TEMP), the Life-Cycle Sustainment
Plan (LCSP), and other appropriate program documentation. The SEP Outline requires
that HSI be addressed as a mandatory design consideration in Table 4.6-1.
4.3.18.11. Insensitive Munitions
Insensitive Munitions minimize the probability of inadvertent initiation and the severity of
subsequent collateral damage to weapon platforms, logistic systems, and personnel
when munitions are subjected to unanticipated stimuli during manufacture, handling,
storage, transport, deployment, or disposal, or due to accidents or action by an
adversary.
The Program Manager and Systems Engineer for munitions acquisition programs,
regardless of the ACAT level, should have safety as a top consideration when
performing trade studies or making program decisions. The term "Insensitive Munitions"
implies that unanticipated stimuli will not produce an explosive yield, in accordance with
MIL-STD-2105D, Hazard Assessment Tests for Non-Nuclear Munitions. The Program
Manager and cognizant technical staff should coordinate harmonized Insensitive
Munitions/Hazard Classification (HC) test plans with the Service Insensitive
Munitions/Hazard Classification (HC) review organizations. The Service organizations
should coordinate the Insensitive Munitions/Hazard Classification (HC) test plans with
the Joint Services Insensitive Munitions Technical Panel (JSIMTP), Joint Service Hazard
classifiers, and the DoD Explosives Safety Board (DDESB), which is chartered by DoDD
6055.9E, Explosives Safety Management and the DoD Explosives Safety Board. Aspects
of Insensitive Munitions also apply to
nuclear weapons but are not addressed herein.
The Acquisition Manager’s Handbook for Insensitive Munitions contains the above-
referenced documents and appendices for each Service’s policy and review board
process.
The Insensitive Munitions Strategic Plan (IMSP) is the primary program output required by USD(AT&L) and the Joint Staff to
provide evidence that the program is in compliance with all applicable laws and
regulations. Both the Component-level and DoD-level insensitive munitions review
organizations can provide additional guidance and can assess the adequacy of the
IMSP. In addition to the IMSP, the Analysis of Alternatives (AOA), Acquisition Strategy
(AS), Systems Engineering Plan (SEP), Test and Evaluation Master Plan (TEMP), Risk
Management Plan, Corrosion Prevention and Control Plan (CPCP), and other JCIDS
documents called for in CJCSI 3170 and the JCIDS Manual (requires Common Access
Card (CAC) to access website), address aspects of explosive ordnance safety,
including Insensitive Munitions.
         Figure 4.3.18.12.F1. Intelligence Mission Data (IMD) Life Cycle Timeline
The Program Manager, Systems Engineer, and Test and Evaluation Manager are the
primary functional program office leads responsible for the identification and
programming of unique IMD to support the program beginning at MS A (see DoDD
5250.01).
DAG Chapter 8 Intelligence Analysis Support to Acquisition provides key linkages to the
System Requirements Document (SRD), Systems Engineering Plan (SEP), and Test
and Evaluation Master Plan (TEMP).
These three products are directly affected by the intelligence signature and mission data
requirements.
Almost all DoD systems operate in a system-of-systems (SoS) context relying upon
other systems to provide desired user capabilities, making it vital that interoperability
needs and external dependencies are identified early and incorporated into system
requirements. When identifying system requirements, it is critical to consider the
operational and SoS context (see DAG section 4.2.1.2. Systems of Systems). These
include, but are not limited to, physical requirements (e.g., size, power limits, etc.),
electronic requirements (e.g., signature, interference, etc.) and information
exchange/management (e.g., network, bandwidth, information needs, etc.). These also
include interdependencies with other systems. For efficiency, systems often rely on
either services provided by other systems during operations or reuse of system
elements developed by other programs.
The Program Manager is responsible for ensuring that the operational and SoS context
for the system are well understood.
The Systems Engineer has the primary responsibility for ensuring all interoperability and
dependency (I&D) impacts are analyzed and coordinated with the appropriate
internal/external stakeholders and are translated into system requirements and design
considerations.
Analysis conducted for the SoS contexts for the system (where the system is dependent
on other systems and where the system needs to interact with other systems) enables
translation of I&D needs into system requirements. I&D requirements call for
collaborative implementation approaches with external organizations, including
identification, management, and control of key interfaces. Areas of dependency and
interoperability should be reviewed for risks to the program and plans made to manage
and mitigate those risks. This review includes system interdependencies (e.g., weapon
may depend on new sensor capabilities provided by another system) and information
exchanges with other systems required to support mission capabilities. For efficiency,
systems may rely on system elements developed by others for key functionality, either
through services (e.g., weather information) provided by other systems or through reuse
of system elements (e.g., engines, radios) developed by other programs. These
contexts are analyzed to identify system requirements and risks, including actions
needed by external parties (e.g., other systems or infrastructure) for the system to meet
user requirements.
Additional DoD policy and guidance regarding I&D, summarized below, are directed at
ensuring that systems work effectively with other systems:
Program Managers and Product Support Managers should budget, plan for, and
implement IUID-enabled SIM as an integral activity within MIL-STD-130 requisite item
identification processes to identify and track applicable major end items and
configuration-controlled items. IUID, implemented in accordance with DoDI 8320.04, and
IUID Implementation Plans are required for all milestone decisions as directed by DoDI
5000.02. IUID-specific design considerations are required in the Systems Engineering
Plans (SEP), and SIM planning and implementation required by DoDI 4151.19 are
addressed in the Life-Cycle Sustainment Plan (LCSP).
The Systems Engineer considers what to mark and how to incorporate the IUID mark
within MIL-STD-130 item marking requirements when formulating design decisions. In
addition, the Systems Engineer considers where product and maintenance information
reside and how the life-cycle data is used within the configuration management and
product support systems - including new and legacy information systems.
The DoD Guide to Uniquely Identifying Items provides guidance on implementing IUID
intended for use by Department of Defense (DoD) contractors and their suppliers who
put unique item identifier (UII) marks on new items during production, as directed in the
contract.
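As a rough illustration of a concatenated unique item identifier (UII), the sketch below joins an issuing agency code, enterprise identifier, original part number, and serial number (the elements of UII Construct #2); the function and sample values are hypothetical, and the authoritative construct rules are in MIL-STD-130 and the DoD guide:

```python
def uii_construct_2(iac: str, enterprise_id: str, part_no: str, serial_no: str) -> str:
    """Hypothetical sketch of a concatenated UII (Construct #2):
    issuing agency code + enterprise identifier + original part number + serial number."""
    return iac + enterprise_id + part_no + serial_no

# Hypothetical values; "D" is the issuing agency code associated with CAGE codes.
print(uii_construct_2("D", "1ABC5", "PN1234", "SN0001"))  # "D1ABC5PN1234SN0001"
```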
OSA benefits warfighters by:
    •    Reducing operator learning curves by using systems that have similar functions
         and are operated in similar ways, thereby reducing costs
    •    Increasing interchangeability
    •    Reducing support and sustainment costs
The engineering trade analyses conducted prior to MS B help determine which system
elements of program architecture can be adapted to OSA in order to reduce program
cost and development time lines. Correct application of OSA principles and practices
results in modular architecture components having well-defined functions and open
standards-based interfaces. Threat analyses, functional criticality analyses, technology
opportunities, and evolved capability assessments are examples of assessments
against the functional architecture to determine what components should be OSA-
enabled. When these architecture components require upgrade, replacement is
competitive, faster, and cheaper because the OSA-enabled components are modular.
Because system functional architecture maps from the higher-level enterprise
architecture, engineering trade analyses and assessments supporting OSA should be
completed and OSA-enabled architecture components specified, before contracts are
let for technology development of those architecture components. Successful
implementation of OSA approaches requires the synchronized acquisition of data rights
for OS and interfacing architecture elements. These data rights are initially structured to
support acquisition of modular open system designs but also should address life-cycle
support.
DoDI 5000.02 identifies the use of OSA as a key systems engineering (SE) approach in
Enclosure 12, paragraph 8. The USD(AT&L) memorandum, "Better Buying Power 2.0:
Continuing the Pursuit for Greater Efficiency and Productivity in Defense Spending,"
November 13, 2012, raises the relevance of OSA along with acquisition of data rights
for appropriate architecture elements. The overarching business case for DoD is
increasing the level of competition by enabling small business. Programs should
develop a business model documenting the strategy for use of OSA and associated
data rights. The OSA-DR Charter signed by USD(AT&L) on February 15, 2012, requires
programs to issue business case guidance to aid programs in developing their business
models.
The DoD Open Systems Architecture Contract Guidebook for Program Managers
contains guidance regarding contract language programs should use to acquire data
rights in support of a program’s OSA strategy. Additional information and supporting
details amplifying each aspect of OSA is available on the DASD(SE) website.
    •    Employ an overall plan for an OSA approach that supports the program functional
         architecture and that uses prescribed USD(AT&L) business case analyses
    •    Ensure the program functional architecture is structured to accommodate OSA
         where feasible, due to the high potential for reduced risk and cost
    •    Assess performance
    •    Balance current implementation of OSA with performance and evolving
         technology at the physical level; OSA establishes a technical baseline that may
         support modular architecture, but formally constrains the interfaces between
         modules, where interfaces close to current performance limits may quickly
         become obsolete
    •    Technically evaluate the appropriateness of an OSA approach by considering
         software constraints, security requirements and procedures, availability and cost
         of data rights, life-cycle affordability, and reliability of open standards, as well as
         other relevant factors such as environmental constraints (e.g., temperature,
         humidity, and ESOH)
Modular open system designs, developed from the system architecture, should be
analyzed at each design review because there is a link between OSA and the level and
type of technical data, computer software, and data rights the Government needs for
life-cycle support. In many cases weapon systems using OSA system elements can
have increased opportunities for competitive sourcing during life-cycle sustainment,
with a correspondingly reduced need for detailed design data and associated data rights.
This benefit enables an incremental approach to capability adaptation in OSA-enabled
systems and is a benefit of the modularity originally specified into the functional
architecture.
technology insertion. A system view such as this one includes a record of the data rights
that are required to enable the planned OSA design. This establishes, and through
life-cycle sustainment maintains, the strong link between the OSA design and the
acquired data rights that enable it. The levels of data rights required for
each OSA-enabled architecture component are determined in order to assert the
requisite contract requirements to obtain them. The data rights strategy ensures that
enterprise-level data rights flow to system architecture components and that they
support the system architecture. Levels of data rights are described in DAG Chapter 2
Program Strategies and in Appendix 9 of the OSA Contract Guidebook.
In addition, the Program Manager should maintain an open systems management plan.
The plan describes the offeror’s approach to:
The open system management plan also should include a statement explaining why
each COTS/NDI was selected for use.
Program products typically used in making decisions regarding OSA include:
    •    System Requirements
    •    Technology Development Strategy (TDS) or Acquisition Strategy (AS)
    •    Program Protection Plan (PPP)
    •    Analysis of Alternatives (AoA)
    •    Enterprise Architecture
See DoD ASSIST homepage for more data item deliverables that may be appropriate
for each specific program and DoD 5010.12-M for data deliverables.
Emerging threats to the logistic resupply of operational forces, the trend toward ever
greater energy demand in the operational forces, and increasing costs to operate and
resupply energy-intensive systems have all put increasing focus on lowering system
and unit energy demand. Reducing the force’s dependence on energy logistics can
improve the force’s mobility and resilience and increase its control over the timing and
conditions of the fight. Focusing on energy as an explicit design consideration and
systems engineering (SE) category is a significant change in practice and thinking that
helps manage emerging operational challenges.
The Program Manager and Systems Engineer can help lower operational energy by
addressing issues associated with the system’s energy logistics support and power
resupply frequency.
This approach should generate informed choices based on the threshold and objective
values of the Energy Key Performance Parameter (KPP) for the system. For liquid
energy-consuming systems, the top-level units of measure for the Energy KPP might be
gallons of fuel demanded (consumed) over a defined set of duty cycles, or to
accomplish a specified mission goal such as a sortie. These measures may be further
decomposed into weight, range, electric power demand, and other relevant measures to
inform the necessary SE trade analysis. The intended result is a comprehensive set of
trade-space choices for industry to consider to deliver solutions that are not only energy
efficient but also mission effective and affordable. See Joint Capabilities Integration and
Development System (JCIDS) Manual (requires Common Access Card (CAC) to access
website) and CJCSI 3170.01H linked at the end of this section.
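As a simple illustration of the top-level measure, fuel demanded over a defined set of duty cycles can be rolled up from per-cycle burn rates; the duty cycles and rates below are hypothetical:

```python
# Hypothetical duty-cycle roll-up for a fuel-demand Energy KPP measure.
# Each tuple: (duty cycle, hours in cycle, fuel burn rate in gallons per hour).
duty_cycles = [
    ("idle",       2.0, 15.0),
    ("transit",    4.0, 90.0),
    ("on-station", 6.0, 60.0),
]

# Gallons of fuel demanded over the defined set of duty cycles.
total_gallons = sum(hours * gph for _, hours, gph in duty_cycles)
print(total_gallons)  # 750.0
```

In practice this top-level figure would be decomposed further (weight, range, electric power demand) to feed the SE trade analysis described above.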
Energy’s relationship to performance arises from the operational context in which the
system is used. Accordingly, the scenarios that illustrate how the system is used, as
part of a unit of maneuver, are essential to understanding the energy supply and
demand constraints to be managed. This is essentially the same approach as balancing
survivability goals against lethality goals in the engineering trade space. Operational
energy issues include:
    •    How the system and combat unit refuel/recharge in the battlespace scenarios,
         and how often
    •    How this refueling/recharging requirement might constrain our forces (limit their
         freedom of action, on-station time, signature, etc.)
    •    How the adversary depicted in the defining scenarios might delay, disrupt, and/or
         defeat our forces by interdicting this system’s refueling/recharging logistics
    •    How much force protection could be diverted from combat missions to protecting
         these refueling/recharging events when and where required
The following documents provide the Program Manager and Systems Engineer with
additional insight into the issue of Operational Energy in the acquisition life cycle:
NOTE: The results of the sustainability analysis (see DAG section 4.3.19.2.
Sustainability Analysis) can be used to inform energy analyses.
4.3.18.17. Packaging, Handling, Storage, and Transportation
The program team employs Packaging, Handling, Storage, and Transportation (PHS&T)
principles/methods to ensure the necessary equipment reaches the warfighter while
minimizing risk of damage to the equipment during handling, storage, and transportation,
frequently in highly challenging and corrosive operational environments.
Program Managers and Systems Engineers should ensure PHS&T is addressed during
the requirements analysis process, and validated throughout each phase of the systems
engineering (SE) development of the weapon system. DoDI 4540.07 identifies specifics
regarding PHS&T as related to program management of weapon systems acquisitions.
In addition, the following documents address PHS&T:
4.3.18.18.1. Producibility
Producibility (the relative ease of manufacturing), like manufacturing and other key
system design functions, is integral to effectively and efficiently delivering capability to
the warfighter. Producible designs are lower risk, more cost-effective, and repeatable,
which enhances product reliability and supportability. Producibility should be assessed
at both a product and enterprise (i.e., organizational) level. The Program Manager
should implement producibility engineering and planning efforts early and should
continuously assess the integrated processes and resources needed to successfully
achieve producibility.
To assess producibility on a product level, both the product and its manufacturing
processes should be measured. Manufacturing processes should be monitored and
controlled, through measurement, to ensure that they can repeatedly produce accurate,
high-quality products, which helps the program meet objectives for limiting process
variability to a tolerable range.
The Program Manager should ensure that the producibility program focuses on the
following five elements to build and maintain a successful producibility system:
    •    Optimize manufacturing plans as the design matures
Producibility should be a Technical Performance Measure (TPM) for the program, and
the program’s strategy for producibility should be contained in paragraph 3.6 of the
program’s Systems Engineering Plan (SEP). Planned producibility engineering activities
for previous and subsequent phases also should be summarized in the SEP. As a key
design accomplishment, producibility should be included in the SEP, mapping key
design considerations into the RFP and subsequently into the contract.
To ensure consistency in applying quality planning and process control, the program
should establish Quality Management Systems (QMS) early (Milestone A). The QMS
should be defined and documented in paragraph 11.2 of the Technology Development
Strategy (TDS) and the Acquisition Strategy (AS). The process should be integrated into
these documents as a systems engineering (SE) practice that supports the successful
transition of capability development to full-rate production and delivery of systems to
support warfighter missions.
The primary focus of the QMS should be to ensure efficiency in processes; when the
QMS is integrated with Statistical Process Control (SPC) to eliminate defects and control
variation, the transition from system development to production helps control life-cycle
cost and reduce the complexities often found when quality is not integrated as a function
of the design. Therefore, to achieve high quality (product characteristics that meet
specification requirements), an end product should be designed so:
The Program Manager and Systems Engineer should take into consideration that
process capability goes beyond machine capability. The process should include the
effects of change in workers, materials, fabrication methods, tooling and equipment,
setup, and other conditions. Process capability data should be collected throughout
process and product development. Data collection efforts should be continuously
refined, using test articles, through production.
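Process capability is commonly summarized with indices such as Cp and Cpk (standard SPC measures, not named in this section); the sketch below uses hypothetical measurements and specification limits:

```python
import statistics

# Hypothetical measurements of a key product characteristic, with
# hypothetical lower/upper specification limits (LSL/USL).
measurements = [10.02, 9.98, 10.01, 9.97, 10.03, 10.00, 9.99, 10.02]
lsl, usl = 9.90, 10.10

mu = statistics.mean(measurements)
sigma = statistics.stdev(measurements)       # sample standard deviation

cp = (usl - lsl) / (6 * sigma)               # potential capability (assumes centered process)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # actual capability (accounts for centering)
print(cp >= cpk)  # True: Cpk never exceeds Cp
```

Tracking such indices as data collection is refined through test articles and into production gives a quantitative basis for judging whether process variability stays within the tolerable range.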
In addition to QMS and SPC, understanding and improving processes may require
common and/or new tools and techniques to eliminate defects and variation in
processes.
Another quality management tool available to the program management team is parts
management. MIL-STD-3018 provides requirements for the implementation of an
effective Parts Management Program (PMP) on Department of Defense (DoD)
acquisitions.
Quality should be a Technical Performance Measure (TPM) for the program, and the
program’s strategy for managing quality should be included in the SEP. Planned quality
engineering and management activities for previous and subsequent phases also
should be summarized in the SEP. As a key design accomplishment, quality should be
included in the SEP table (Table 4.6-1) that maps key design considerations into
contracts.
Two valuable tools to assist in creating quality in design are Six Sigma and Quality
Function Deployment (QFD). Six Sigma techniques identify and reduce all sources of
product variation: machines, materials, methods, the measurement system, the
environment, and the people in the process. QFD is a structured approach to
understanding customer requirements and translating them into products that satisfy
those requirements.
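As a purely illustrative aside (not Guidebook guidance), Six Sigma practitioners commonly summarize defect data as defects per million opportunities (DPMO) and a corresponding sigma level, applying the conventional 1.5-sigma long-term shift. That arithmetic can be sketched as follows, with hypothetical names and values:

```python
from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects * 1_000_000 / (units * opportunities_per_unit)

def sigma_level(dpmo_value):
    """Short-term sigma level implied by a long-term DPMO,
    using the conventional 1.5-sigma shift."""
    process_yield = 1 - dpmo_value / 1_000_000
    return NormalDist().inv_cdf(process_yield) + 1.5
```

Under this convention, a defect rate of 3.4 DPMO corresponds to the familiar "six sigma" performance level.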
    •    Consideration of requirements for efficient manufacture during the design and
         production of the system
    •    The availability of raw materials, special alloys, composite materials,
         components, tooling, and production test equipment
    •    The use of advanced manufacturing technology, processes, and systems
    •    The use of contract solicitations that encourage competing offerors to acquire
         modern technology, production equipment, and production systems (including
         hardware and software)
    •    Methods to encourage investment in advanced manufacturing technology,
         production equipment, and processes
    •    During source selection, increased emphasis on the efficiency of production
    •    Expanded use of commercial manufacturing processes rather than processes
         specified by DoD
    •    Technology and the Industrial Base: assess the capability of the national
         technology and industrial base to support the design, development, production,
         operation, uninterrupted maintenance support, and eventual disposal
         (environmental impacts) of the system
    •    Design: assess the maturity and stability of the evolving system design and
         evaluate any related impact on manufacturing readiness
    •    Cost and Funding: examine the risk associated with reaching manufacturing cost
         targets
    •    Materials: assess the risks associated with materials (including basic/raw
         materials, components, semi-finished parts, and subassemblies)
    •    Process Capability and Control: assess the risks that manufacturing processes
         are able to reflect the design intent (repeatability and affordability) of key
         characteristics
    •    Quality Management: assess the risks and management efforts to control quality
         and foster continuous improvement
    •    Manufacturing Workforce (Engineering and Production): assess the required
         skills, certification requirements, availability, and required number of personnel to
         support the manufacturing effort
    •    Facilities: assess the capabilities and capacity of key manufacturing facilities
         (prime, subcontractor, supplier, vendor, and maintenance/repair)
    •    Manufacturing Management: assess the orchestration of all elements needed to
         translate the design into an integrated and fielded system (meeting program
         goals for affordability and availability)
The Program Manager and Systems Engineer should consider the manufacturing
readiness and manufacturing-readiness processes of potential contractors and
subcontractors as part of source selection for major defense acquisition programs (see
DFARS 215.304).
The Program Manager and Systems Engineer should assess manufacturing readiness
at a minimum of four key points (events) during the acquisition life cycle, as described in
Table 4.3.18.18.3.T1.
        Table 4.3.18.18.3.T1. Minimum Points (Events) to Assess Manufacturing
                      Readiness during the Acquisition Life Cycle
Manufacturing Readiness Assessment Points | Considerations
2. Technology Development, Pre-EMD Review. As the program approaches the Pre-EMD Review and the Milestone B decision, critical technologies should have matured sufficiently for 2366b certification and been demonstrated in a relevant environment. The assessment should consider:
• The program should be nearing acceptance of a preliminary system design
• An initial manufacturing approach has been developed
• Manufacturing processes have been defined and characterized, but there are still significant engineering and/or design changes in the system itself; manufacturing processes that have not been defined or that may change as the design matures should be identified
• Preliminary design, producibility assessments, and trade studies of key technologies and components should have been completed
• Prototype manufacturing processes and technologies, materials, tooling and test equipment, as well as personnel skills, have been demonstrated on systems and/or subsystems in a production-relevant environment
• Cost, yield, and rate analyses have been performed to assess how prototype data compare with target objectives, and the program has in place appropriate risk reduction to achieve cost requirements or establish a new baseline, which should include design trades
• Producibility considerations should have shaped system development plans, and the Industrial Base Capabilities assessment (in the Acquisition Strategy (AS)) for Milestone B has confirmed the viability of the supplier base
3. Production Readiness Review. A production readiness review identifies the risks of transitioning from development to production. Manufacturing is a function of production; in order to transition to production without significant risk, it is important that key processes have been considered and evaluated during the PRR, such as ensuring:
• The detailed system design is complete and stable enough to support low-rate production
• Technologies are mature and proven in a production environment, and manufacturing and quality processes are capable, in control, and ready for low-rate production
• All materials, manpower, tooling, test equipment, and facilities have been proven on pilot lines and are available to meet the planned low-rate production schedule
• Cost, yield, and rate analyses are updated with pilot line results
• Known producibility risks pose no significant challenges for low-rate production
• Supplier qualification testing and first article inspections have been completed
• The industrial base capabilities assessment for Milestone C has been completed and shows that the supply chain is adequate to support LRIP
4. FRP Decision Review. To support FRP, there should be no significant manufacturing process and reliability risks remaining. Manufacturing and production readiness results should be presented that provide objective evidence of manufacturing readiness. The results should include recommendations for mitigating any remaining low (acceptable) risk, based on an assessment of manufacturing readiness for FRP, which should include (but not be limited to):
• LRIP learning curves that include tested and applied continuous improvements
• Meeting all systems engineering (SE)/design requirements
• Evidence of a stable system design demonstrated through successful test and evaluation
• Evidence that materials, parts, manpower, tooling, test equipment, and facilities are available to meet planned production rates
• Evidence that manufacturing processes are capable, in control, and have achieved planned FRP objectives
• Plans are in place for mitigating and monitoring production risks
• LRIP cost target data have been met; learning curves have been analyzed and used to develop the FRP cost model
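The LRIP learning-curve analysis referenced in the table above is conventionally based on Wright's learning-curve model, in which unit cost falls by a fixed percentage each time cumulative quantity doubles. The following sketch is illustrative only; the function name, cost values, and the 85 percent slope are hypothetical, not Guidebook content:

```python
import math

def unit_cost(n, first_unit_cost, learning_rate):
    """Wright learning curve: estimated cost of the n-th production unit.
    learning_rate=0.85 means each doubling of quantity cuts unit cost by 15%."""
    b = math.log2(learning_rate)  # negative slope exponent
    return first_unit_cost * n ** b
```

A program might fit the slope to observed LRIP lot costs and then project FRP unit costs from the same curve.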
DoDI 5000.02 requires Major Defense Acquisition Program (MDAP) Program Managers
to implement a comprehensive R&M engineering program as an integral part of the
systems engineering (SE) process. The Systems Engineer should understand that R&M
parameters have an impact on the system’s performance, availability, logistics
supportability, and total ownership cost. To ensure a successful R&M engineering
program, the Systems Engineer should integrate the following activities across the
program’s engineering organization and processes:
         to the Initial Capabilities Document (ICD)/Capability Development Document
         (CDD)/Capability Production Document (CPD)
    •    Ensuring that R&M engineering activities and deliverables in the Request for
         Proposal are appropriate for the program phase and product type
    •    Integrating R&M engineering activities and reliability growth planning curve(s) in
         the Systems Engineering Plan (SEP) at each milestone
    •    Planning verification methods for each R&M requirement
    •    Ensuring the verification methods for each R&M requirement are described in the
         TEMP, along with a reliability growth planning curve beginning at MS B
    •    Ensuring data from R&M analyses, demonstrations, and tests are properly used
         to influence life-cycle product support planning, availability assessments, cost
         estimating, and other related program analyses
    •    Identifying and tracking R&M risks and Technical Performance Measures
    •    Assessing R&M status during program technical reviews
    •    Including consideration of R&M in all configuration changes and trade-off
         analyses
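The reliability growth planning curve called for above is often drawn from the Duane (or Crow/AMSAA) model, in which planned MTBF grows as a power of cumulative test time. A minimal illustrative sketch; the function name, initial values, and growth rate below are hypothetical, not Guidebook content:

```python
def planned_mtbf(t, t0, mtbf0, alpha):
    """Duane-model planned MTBF at cumulative test time t.

    t0:    initial test time at which mtbf0 was demonstrated
    mtbf0: initially demonstrated MTBF
    alpha: reliability growth rate (commonly in the 0.3 to 0.5 range)
    """
    return mtbf0 * (t / t0) ** alpha
```

A program could plot this curve in the SEP and TEMP and compare the MTBF demonstrated at each test event against the planned value.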
As part of the SE process, the R&M engineer should be responsible for the R&M
activities by acquisition phase outlined in Table 4.3.18.19.T1.
                    Table 4.3.18.19.T1. R&M Activities by Acquisition Phase
Acquisition Phase | R&M Activities
Engineering and Manufacturing Development (EMD) Phase. During the EMD phase, the R&M engineer, as part of the program SE team, should:
• Perform evaluations to assess R&M status and problems
• Ensure that the product baseline design and required testing can meet the R&M requirements
• Ensure the final FMECA identifies failure modes, and their detection methods, that could result in personnel injury and/or mission loss, and ensure they are mitigated in the design
                                          • Ensure that the detailed R&M prediction to assess system potential
                                            to meet design requirements is complete
                                          • Verify through appropriate subsystem/equipment-level tests the
                                            readiness to enter system-level testing at or above the initial
                                            reliability established in the reliability growth-planning curve in both
                                            the SEP and the TEMP
                                          • Verify system conformance to specified R&M requirements through
                                            appropriate demonstration and test
                                          • Implement a FRACAS to ensure feedback of failure data during test
                                            and to apply and track corrective actions
                                          • Coordinate with the Chief Developmental Tester (T&E Lead) and
                                            Operational Test Agencies (OTA) to ensure that the program office
                                            and OTA data collection agree on R&M monitoring and failure
                                            definitions, and that R&M and BIT scoring processes are consistent
                                            in verification of requirements through all levels of testing
                                          • Define contractor R&M engineering activities in the RFP and contract
                                            Statement of Work (SOW) for the P&D phase to ensure adequate
                                            R&M engineering activities take place during P&D, and to ensure the
                                            RFP and contract SOW provide adequate consideration of R&M in
                                            re-procurements, spares, and repair parts
                                          • Verify that parts, materials, and processes meet system
                                            requirements through the use of a management plan detailing
                                            reliability risk considerations and evaluation strategies for the
                                            intended service life. Include flow of requirements to subcontractors
                                            and suppliers. See MIL-STD-1546, Parts, Materials, and Processes
                                            Control Program for Space and Launch Vehicles, and MIL-STD-
                                            1547, Electronic Parts, Materials, and Processes for Space and
                                            Launch Vehicles
Production and Deployment (P&D) Phase. During the P&D phase, the R&M engineer, as part of the program's SE team, should:
• Verify initial production control of R&M degradation factors by test and inspection, production data analysis, and supplemental tests
• Verify R&M characteristics, maintenance concept, repair policies, Government technical evaluation, and maintenance procedures by T&E
• Identify R&M and production-related BIT improvement opportunities via FRACAS and field data assessment
                                          • Review Engineering Change Proposals (ECP), operational
                                            mission/deployment changes, and variations for impact on R&M
                                          • Update R&M predictions and FMECAs based on field results and
                                            apply them to the models previously developed to assess impacts on
                                            spares, manpower, missions, and availability
                                          • Verify that parts, materials, and processes management
                                            requirements for limiting reliability risk and "lessons learned" are
                                            utilized during all design change efforts including change proposals,
                                            variations, substitutions, product improvement efforts, or any other
                                            hardware change effort
Operations and Support (O&S) Phase. During the O&S phase, the R&M engineer, as part of the program SE team, should:
• Assess operational data to determine the adequacy of R&M and BIT characteristics performance, maintenance features and procedures, and provisioning plans
• Identify problem areas for correction through ongoing closed-loop FRACAS and field data assessment
• Monitor availability rates and respond to negative trends and data anomalies
level SSRA. The Systems Engineer should be aware of the worldwide rules for
spectrum management and the need to obtain host nation permission for each
transmitter and frequency assignment.
Program Managers need to ensure that spectrum access is adequate and that it is
granted in the Continental United States (CONUS) and wherever else the equipment is
deployed. The Pre-Milestone A Analysis of Alternatives (AoA) should address spectrum
needs as part of concept formulation. Both the SSRA and DD-1494 are required for
each milestone (see DoDI 4650.01). The SSRA is used within the DoD as the basis for
assessing the feasibility of building and fielding equipment that operates within
assigned frequency bands and for identifying potential de-confliction situations. The
DD-1494, Application for Equipment Frequency Allocation, has four stages, which
reflect the increasing maturity of available spectrum information during development.
The DD-1494 form is submitted to the National Telecommunications and Information
Administration (NTIA) for approval of spectrum allocation, without which emitters
cannot operate within CONUS, and to the International Telecommunications Union
(ITU) for satellites.
NTIA Manual of Regulations and Procedures for Federal Radio Frequency Management
(Redbook) chapter 3 addresses international treaty aspects of the spectrum and chapter
4 addresses frequency allocations.
The Systems Engineer has a lead role in defining spectrum needs, throughput and
power requirements, and other attributes of the signals in space (outside the antenna,
not in the transmission device) and the antenna characteristics and platform mounting
details, as well as the safety aspects of emitters with regard to the Hazards of
Electromagnetic Radiation to Ordnance (HERO), Personnel (HERP), and Fuel (HERF).
The Systems Engineer should be aware that portions of the spectrum previously
assigned to DoD or other Federal users are being sold for commercial use. Thus, a
previously approved DD-1494 can be revoked, requiring modifications to designs and
even to fielded equipment.
Similarly, host nations can alter prior agreements as commercial applications encroach
upon previously available spectrum.
Each nation reserves the right to control emitters operating within its territory; thus,
host nation agreements are essential in support of deployment. Program Managers and
Systems Engineers of platforms that mount multiple emitters and receivers need to
obtain spectrum access for each emitter and ensure that those emitters and receivers
do not produce mutual interference or interact with ordnance (see DoDD 3222.3, MIL-
STD-461, MIL-STD-464, MIL-HDBK-235-1, 237, and 240A, and the "Joint Services
Guide for Development of a Spectrum Supportability Risk Assessment"). The Defense
Information Systems Agency (DISA), Defense Spectrum Organization provides
spectrum support and planning for DoD and can be reached at
http://www.disa.mil/Services/Spectrum. See Figure 4.3.18.20.F1 for spectrum activities
by acquisition phase. This figure summarizes the requirements of DoDI 4650.01.
          Figure 4.3.18.20.F1. Spectrum-Related Activities by Life-Cycle Phase
4.3.18.21. Standardization
eliminating requirements.
    •    Reducing the number of unique or specialized parts used in a system (or across
         systems)
    •    Reducing the logistics footprint
    •    Lowering life-cycle costs
In addition, parts management can enhance the reliability of the system and mitigate
part obsolescence due to Diminishing Manufacturing Sources and Material Shortages
(DMSMS). MIL-STD-3018, Parts Management, requires program offices to apply
standardization processes to:
4.3.18.22. Supportability
Supportability refers to the inherent characteristics of the system and the enabling
system elements that allow effective and efficient sustainment (including maintenance
and other support functions) throughout the system’s life cycle. By addressing
supportability as part of the system design, the Program Manager through the Systems
Engineer and Product Support Manager ensures the system reaches Initial Operational
Capability (IOC) with the required enabling system elements in place. The benefits to
the program are:
    •    Cost savings
    •    A more affordable logistics infrastructure
    •    Improved materiel and operational availability
    •    A reduced footprint
Early consideration of supportability needs during the Requirements Analysis,
Architecture Design, and Implementation processes is critical to ensure the delivered
capability is
operationally suitable, effective, sustainable, and affordable. The system baseline
should incorporate inherent supportability characteristics and should include the design
of the enabling support infrastructure. Details can be found in DAG Chapter 5 Life-Cycle
Logistics, but typical product support infrastructure considerations are listed in Table
4.3.18.22.T1.
The Program Manager is responsible for approving life-cycle cost trades throughout the
acquisition process. It is critical that the design of a program focused on life-cycle
supportability involve the logisticians alongside the end users early in the Stakeholder
Requirements Definition process to support the Reliability Centered Maintenance
(RCM) analysis and to develop the overall performance based product support strategy.
Condition Based Maintenance Plus (CBM+) (see DoD 4151.22-M), an important
support concept and a specific initiative, can be used to perform maintenance based on
evidence of need as provided by RCM analysis and other enabling processes and
technologies.
RCM analysis is a systematic approach to analyzing the functions and potential failures to
identify and define preventive or scheduled maintenance tasks for an equipment end
item. Tasks may be preventive, predictive, or proactive in nature. RCM results provide
operational availability with an acceptable level of risk in an efficient and cost-effective
manner.
Additionally, the Product Support Manager and Systems Engineer should ensure that
supportability activities are documented in the Systems Engineering Plan (SEP) and the
Life-Cycle Support Plan (LCSP), and that the supportability design requirements are
documented in the program’s functional baseline.
The Systems Engineer working with the Product Support Manager should identify and
mitigate the supportability life-cycle cost drivers to ensure a system is affordable across
the life cycle. The streamlined LCSP outline calls out specific phase and milestone
expectations. These expectations include determining supportability design alternatives
along with their associated cost and establishing both the Operational Availability (Ao)
and Materiel Availability (Am) drivers. The derived supportability requirements should be
based on trade studies along with their associated cost and operational and materiel
availability drivers (see DAG Chapter 5 Life-Cycle Logistics). The Cost-Benefit
Analyses, jointly conducted by the Systems Engineer and Product Support Manager in
the supportability analysis, provide insight into supportability drivers and include the
impact of resources on readiness supported by engineering analyses required for
product support (i.e., FMECA, predictions, and diagnostics architecture).
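The two availability drivers named above have simple conventional formulations: operational availability (Ao) as uptime over total time, and materiel availability (Am) as the operationally capable share of the total fielded inventory. A minimal illustrative sketch, with hypothetical function names and inputs:

```python
def operational_availability(uptime, downtime):
    """Ao = uptime / (uptime + downtime), e.g., hours per operating period."""
    return uptime / (uptime + downtime)

def materiel_availability(capable_end_items, total_inventory):
    """Am = operationally capable end items / total fielded inventory."""
    return capable_end_items / total_inventory
```

In a trade study, supportability design alternatives would be compared by their predicted effect on these ratios alongside their life-cycle cost.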
Survivability is the capability of a system and its crew to avoid or withstand a hostile
environment without suffering an abortive impairment of its ability to accomplish its
designated mission. Susceptibility is the degree to which a device, piece of equipment,
or weapon system is open to effective attack as a result of one or more inherent
weaknesses. Man-made and natural environmental conditions, described in MIL-STD-
810 (sand, vibration, shock, immersion, fog, etc.), and electromagnetic environment,
described in MIL-STD-461/464, also should be considered in system design.
Susceptibility is a function of operational tactics, countermeasures, probability of an
enemy threat, etc. Susceptibility is considered a subset of survivability. Vulnerability is
the characteristics of a system that cause it to suffer a definite degradation (loss or
reduction of capability to perform the designated mission) as a result of having been
subjected to a certain (defined) level of effects in an unnatural (man-made) or natural
(e.g., lightning, solar storms) hostile environment. Vulnerability is also considered a
subset of survivability.
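The relationship among these three terms is often framed quantitatively in survivability engineering, though this section does not prescribe a formula: susceptibility maps to the probability of being hit, vulnerability to the conditional probability of mission loss given a hit, and survivability to their complement. The decomposition below is that common framing, not text from this guidebook, and all probability values are illustrative.

```python
# Illustrative survivability decomposition (a common framing, not a DAG
# formula): susceptibility ~ probability the system is hit (p_hit);
# vulnerability ~ probability of mission loss given a hit
# (p_kill_given_hit); survivability is the complement of their product.

def survivability(p_hit: float, p_kill_given_hit: float) -> float:
    """Probability the system survives one engagement."""
    return 1.0 - p_hit * p_kill_given_hit

# Notional values: reducing either susceptibility or vulnerability
# raises survivability.
baseline = survivability(p_hit=0.30, p_kill_given_hit=0.50)   # 0.85
hardened = survivability(p_hit=0.30, p_kill_given_hit=0.25)   # 0.925 (vulnerability reduced)
stealthy = survivability(p_hit=0.15, p_kill_given_hit=0.50)   # 0.925 (susceptibility reduced)
```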
Design and testing ensure that the system and crew can withstand man-made hostile
environments without the crew suffering acute or chronic illness, disability, or death. The
Program Manager, supported by the Systems Engineer, should fully assess system and
crew survivability against all anticipated threats, at all levels of conflict, throughout the
system life cycle. The goal of survivability and susceptibility is to:
If the system or program has been designated by the Director, Operational Test and
Evaluation (DOT&E), for live-fire test and evaluation (LFT&E) oversight, the Program
Manager should integrate test and evaluation (T&E) to address crew survivability issues
into the LFT&E program supporting the Secretary of Defense LFT&E Report to
Congress.
If the system or program has been designated a CBRN mission-critical system, the
Program Manager should address CBRN survivability, in accordance with DoDI
3150.09, The Chemical, Biological, Radiological, and Nuclear (CBRN) Survivability
Policy. The Program Manager should ensure that progress toward CBRN survivability
requirements is documented in the applicable Service CBRN mission-critical report.
All systems, including crew, regardless of acquisition category, should be survivable to
levels anticipated in their projected operating environment as portrayed in their platform-
specific System Threat Assessment Report (STAR) (see DoDI 5000.02, Enclosures 6
and 8), or in lieu of a STAR, the appropriate capstone threat document.
    •    How the design incorporates susceptibility and vulnerability reduction and CBRN
         survivability requirements
    •    How progress toward these requirements is tracked over the acquisition life cycle
System Security Engineering (SSE) activities allow for identification and incorporation of
security design and process requirements into risk identification and management in the
requirements trade space.
SSE is the integrating process for mitigating and managing risks to advanced
technology and mission-critical system functionality from foreign collection, design
vulnerability or supply chain exploit/insertion, battlefield loss, and unauthorized or
inadvertent disclosure throughout the acquisition life cycle. The SSE process captures
SSE analysis in the system requirements and design documents, and SSE verification
in the test plans, procedures, and results documents. The Program Protection Plan (see
DAG Chapter 13 Program Protection) documents the comprehensive approach to
system security engineering analysis and the associated results.
SSE is the functional discipline within systems engineering that ensures security
requirements are included in the engineering analysis with the results being captured in
the Program Protection Plan (PPP), provided at each Systems Engineering (SE)
technical review (SETR) event (see DAG Chapter 13 Program Protection) and
incorporated into the SETR-related SE requirements and the functional, allocated, and
product baselines. The PPP is approved by the Milestone Decision Authority (MDA) at
each milestone decision review and at the Full-Rate Production/Full-Deployment
(FRP/FD) decision, with an approvable draft at the pre-Engineering and Manufacturing
Development (EMD) review. The analysis should be used to update the SE baselines
prior to each SETR and key knowledge point throughout the life cycle.
The Program Manager is responsible for developing a PPP that ensures the program
complies with program protection policy and system requirements. The Systems
Engineer and/or System Security Engineer is responsible for ensuring a balanced set of
security requirements, designs, testing, and risk management are incorporated and
addressed in their respective trade spaces.
The Systems Engineer and/or System Security Engineer is responsible for facilitating
cross-discipline system security working groups and is typically responsible for leading
the SSE analysis necessary for development of the PPP. The cross-discipline
interactions reach beyond the SSE community to the test and logistics communities.
The Test Lead is responsible for incorporating sufficient system security test
requirements into the Test and Evaluation Strategy (TES) and Test and Evaluation
Master Plan (TEMP). The logistics community is responsible for continuing the
protections and risk management activities initiated in acquisition throughout the
Operations and Support (O&S) phase.
SSE processes inform the development and release of each request for proposal (RFP)
(see DAG Chapter 13 Program Protection) by incorporating SSE process requirements
into the Statement of Work (SOW) and the system security requirements into the
RFP requirements document. Contractor responsibilities
include developing plans to ensure that the system security protections are
implemented in the development environments, system designs, and supply chains.
The early and frequent consideration of SSE principles reduces rework and expense
resulting from late-to-need security requirements (e.g., anti-tamper, exportability
features, supply chain risk management, secure design, defense-in-depth, and
information assurance implementation).
Systems engineering (SE) tools support the performance of activities and the
development of products. SE techniques use tools and methods to complete specific
tasks. SE tools and techniques support the Program Manager and Systems Engineer in
performing and managing the SE activities and processes to improve productivity and
system cost, schedule, capabilities, and adaptability. The program should begin
applying SE tools and techniques during the early stages of program definition to
improve efficiency and traceability and to provide a technical framework for managing
the weapon system development.
Collaboration tools allow the program office and developer to exchange data and
analyses easily. Analytical tools and techniques also can assist in the development and
validation of system designs. It is critical that the Systems Engineer understand the
constraints and limitations of any particular analysis tool or technique, and apply this
understanding when making assessments or recommendations based on its output.
• Needs and constraints of the program (e.g., complexity, size, and funding)
    •    Applicability to required tasks and desired products
    •    Computer system requirements, including peripheral equipment
    •    Licensing and maintenance costs
    •    Technical data management (see DAG section 4.3.8. Technical Data
         Management Process)
    •    Integration with other SE tools in use within the program, by the developer, and
         by externally interfacing programs
    •    Cost to train the user to apply the tool or technique
    •    Number and level of expertise of Government and contractor staff (both users of
         the tool and users of the tool outputs)
    •    Feasibility of implementing the tool or technique throughout the acquisition life
         cycle
Table 4.3.19.T1 lists general capabilities and features of SE tools and the SE processes
they might support.
   SE Process                   Tool Capabilities / Features
   Configuration Management     • Assists in the identification of configuration items
                                • Assists in baseline/version control of all configuration
                                  items
                                • Assists in ensuring configuration baselines and changes
                                  are identified, recorded, evaluated, approved,
                                  incorporated, and verified
   Technical Data Management    • Assists in identification of data requirements
                                • Assists in storage, maintenance, control, use, and
                                  exchange of data
                                • Assists in document preparation, update, and analysis
   Transition                   • Assists in planning and executing delivery and
                                  deployment of the system to the end user for use in the
                                  operational environment
Models and simulations are SE tools used by multiple functional area disciplines during
all life-cycle phases. Modeling is essential to aid in understanding complex systems and
system interdependencies, and to communicate among team members and
stakeholders. Simulation provides a means to explore concepts, system characteristics,
and alternatives; open up the trade space; facilitate informed decisions; and assess
overall system performance.
The DoD Acquisition Modeling and Simulation Working Group Systems Engineering
Modeling, Simulation, and Analysis Fundamentals (located on the DASD(SE) website)
recommends that all programs identify and maintain "a collection of related information,
representing all necessary viewpoints on the design, and capturing all relevant system
interactions." The Program Manager and Systems Engineer should consider directing
the use of such a collection when planning for the development, use, and application of
models, simulations, and analyses on their program. This collected information can help
drive consistency and integration among SE and analytical tools, and provide the
program with a capability to assess potential design changes as well as system
upgrades throughout the life cycle. Figure 4.3.19.1.F1. shows some benefits of using
modeling and simulation throughout the acquisition life cycle. This figure is adapted
from a 2010 National Defense Industrial Association (NDIA) Systems Engineering
Division "Model-Based Engineering (MBE)" study and is used with permission.
Modeling and simulation should take advantage of opportunities for reuse (see DoD
Modeling and Simulation Catalog). Models and simulations developed in early
acquisition phases may be repurposed for other activities during later phases (e.g.,
engineering models can be used in training simulations).
SE requires use of models and simulations from many disciplines and across a
hierarchy of perspectives that range from an engineering/technical level up to the
campaign/strategic level in order to effectively analyze requirements, design, cost,
schedule, performance, and risk. These models and simulations often exist, but
sometimes need to be newly developed, which can be costly. An option for new
development is to consider federating existing models and simulations, using any of
various interoperability standards, in order to create needed capability. Program
Managers and Systems Engineers should consider how to leverage M&S
interoperability as they plan for M&S use throughout a program's life cycle. Modeling
and simulation is also used to support developmental test and evaluation (DT&E) and
operational test and evaluation (OT&E).
Roles, Responsibilities, and Activities
To make effective and appropriate use of modeling and simulation, the Program
Manager and Systems Engineer should ensure that planned modeling and simulation
activities are:
    •    Complete and comprehensive, including all efforts anticipated throughout the life
         cycle, to include planning, development, and acceptance through proper
         verification, validation, and accreditation (VV&A); (see DAG Chapter 9 Test and
         Evaluation)
    •    Reflected in the program’s technical planning (Work Breakdown Structure
         (WBS), schedules, budgets, Systems Engineering Plan (SEP), and other
         program documentation; see DAG section 4.3.2. Technical Planning Process)
    •    Appropriately resourced, including a properly skilled workforce
The Program Manager and Systems Engineer should establish and maintain a
repository of all relevant modeling and simulation data products that describe what the
system is and does. This repository also should contain descriptive system information
that could be used to feed other modeling and simulation efforts. They should ensure
that all modeling and simulation products are established, maintained, controlled, and
resourced to achieve an efficient and effective acquisition program.
The Program Manager and Systems Engineer should ensure that the program’s
modeling and simulation activities are coordinated, managed, and controlled such that
products are consistent with the system and architecture design at all levels. Modeling
and simulation planning should be part of, and integrated with, the overall program
plan. The program may choose to integrate the modeling and simulation
planning details into the program plan or create a separate modeling and simulation
planning document. If the documents are separate, the program must ensure the
modeling and simulation planning is kept up to date as the program plan adjusts.
Program Managers should follow their local modeling and simulation organization's
standards for planning, managing, and controlling such activities.
Modeling and modeling artifacts should be evident in the contents of the required
program technical reviews and in the baselined technical data needed to support major
program reviews and program decisions.
The sustainability analysis, using a Life Cycle Assessment (LCA) method, is a tool to
assist the Systems Engineer in designing more sustainable systems - those which use
fewer resources over the life cycle, have fewer impacts on human health and the
environment, and thus have a lower total ownership cost (TOC). The Program Manager
should make sustainability considerations an integral part of both a robust trade space
analysis and a comprehensive supportability analysis. These sustainability analyses can
help reduce system TOC by uncovering previously hidden or ignored life-cycle costs,
leading to more informed decisions earlier in the acquisition life cycle. They can also
help make systems more affordable and improve the accuracy of life-cycle cost
estimates.
Large military systems and platforms can have a life cycle of 30 years or more. To meet
evolving mission needs far into the future, the system design should incorporate long-
term sustainability considerations in order to reduce life-cycle costs. Without a full
understanding of life-cycle impacts, significant costs may be unintentionally inserted
during acquisition and later exposed by the logistics and operational communities.
"Sustainability" differs from "sustainment" in that it relates to the use of resources, and
the associated impacts and costs over the system’s life cycle. In contrast, sustainment
is more concerned with the end user’s ability to operate and maintain a system once it is
in inventory and deployed. Both aspects need to be addressed in the design process.
The Streamlined Life Cycle Assessment Process for Sustainability in DoD Acquisitions
is specifically for use in the DoD acquisition process. It combines LCA with multi-
attribute analysis. It integrates a number of trade space and design considerations and
provides a procedure to compare conceptual or detailed design alternatives. The
streamlined LCA can be applied in a qualitative mode even when data are lacking, and
can be accomplished with minimal resources. It is intended to ensure consideration of
important downstream impacts and costs in trade-off and design decisions. The method
is consistent, without duplication, with other considerations such as operational energy,
supportability, and environment, safety, and occupational health (ESOH).
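The multi-attribute analysis that the streamlined LCA combines with life-cycle assessment can be sketched as a weighted comparison of design alternatives. The sketch below is illustrative only: the attribute names, weights, and 1-to-5 qualitative scores are hypothetical, not part of the streamlined LCA procedure itself.

```python
# Hypothetical multi-attribute comparison in the spirit of the
# streamlined LCA: each alternative is scored qualitatively
# (1 = worst, 5 = best) on sustainability attributes, then ranked by
# weighted sum. All names, weights, and scores are illustrative.

weights = {"energy_use": 0.3, "hazmat": 0.2,
           "maintenance_burden": 0.3, "disposal_cost": 0.2}

alternatives = {
    "Design A": {"energy_use": 3, "hazmat": 4,
                 "maintenance_burden": 2, "disposal_cost": 3},
    "Design B": {"energy_use": 4, "hazmat": 3,
                 "maintenance_burden": 4, "disposal_cost": 2},
}

def weighted_score(scores: dict) -> float:
    """Weighted sum of an alternative's attribute scores."""
    return sum(weights[attr] * scores[attr] for attr in weights)

ranked = sorted(alternatives,
                key=lambda name: weighted_score(alternatives[name]),
                reverse=True)
# Design B (3.4) outranks Design A (2.9) under these notional weights.
```

Because the scores can be qualitative, this kind of comparison can be applied even when detailed cost data are lacking, which is the point of the streamlined method.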
4.3.19.3. Value Engineering
By following the Value Engineering (VE) process, the Program Manager can analyze
the functions of an item or process to determine best value, identify and reduce
unnecessary costs, increase productivity, enhance quality, and improve system and
program performance. VE supports most aspects of the Better Buying Power initiative:
    •    Affordability and cost growth: VE critically compares the cost and value of every
         requirement to focus the program on providing only necessary functions at a
         minimum overall cost. This represents a systematic approach for attaining return
         on investment.
    •    Promote competition: Program Managers can employ VE to identify technical
         data describing required functions of system elements, enabling multiple
         suppliers to bid.
    •    Provide incentives for productivity and innovation: VE provides industry with an
         incentive to reduce costs; the developer receives a share in the savings if the
         Government implements a VE change.
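The savings-sharing incentive in the last bullet can be illustrated with simple arithmetic. Actual sharing rates depend on contract type and the applicable FAR Part 48 provisions; the 50% rate and dollar figures below are hypothetical.

```python
# Illustrative VE change proposal savings-sharing arithmetic. The
# sharing rate and all dollar figures are hypothetical; actual rates
# are set by contract type under FAR Part 48.

def vecp_shares(unit_cost_before: float, unit_cost_after: float,
                units: int, contractor_share_rate: float = 0.5):
    """Split acquisition savings from an accepted VE change between
    the Government and the contractor."""
    total_savings = (unit_cost_before - unit_cost_after) * units
    contractor = total_savings * contractor_share_rate
    government = total_savings - contractor
    return government, contractor

# Notional change: unit cost drops from $12,000 to $10,500 across 400
# units, yielding $600,000 in savings split evenly.
gov, ktr = vecp_shares(unit_cost_before=12_000.0,
                       unit_cost_after=10_500.0, units=400)
```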
Roles, Responsibilities, and Activities
Program Managers and Systems Engineers should encourage both in-house VE and
Value Engineering Change Proposal (VECP)-based studies and trade-offs on every
activity or contract with a value exceeding
the simplified acquisition threshold. While a common misconception is that VE applies
only to production, successful introduction of VE may occur at any point in the life cycle.
The most opportune time to apply VE is early in the life cycle, before production begins,
before preparation of field or technical manuals, and before finalizing logistics support
plans.
The following examples are potential areas in which the application of VE and VECP
may provide a benefit:
Additional resources available to the Program Manager and Systems Engineer to learn
more about VE as a tool to reduce costs include:
  • Defense Acquisition University (DAU) Continuous Learning Module CLE001
  • DoD VE information as provided by the Institute for Defense Analyses
  • SD-24, Value Engineering: A Guidebook of Best Practices and Tools
  • Office of Management and Budget Circular A-131
Lessons learned and case studies generally describe areas of risk, pitfalls encountered
in programs, and strategies employed to mitigate or fix problems when they arose. Best
practices are proven techniques and strategies that can avoid common problems and
improve quality, cost, or both.
Best practices and lessons learned are applicable to all aspects of a program -
technical, managerial, and programmatic - and at any point in the acquisition life cycle.
However, they are not universal or "one-size-fits-all" solutions. The greatest benefits
occur when Program Managers and Systems Engineers judiciously select successful
practices or strategies from analogous programs/systems and tailor them to meet
current program needs.
Design, build, test, and certification standards are an implementation of lessons learned
over time. Program Managers and Systems Engineers should be aware that standards
are not ad hoc requirements developed by a single engineer or program office. They
result from years of engineering, manufacturing, or sustainment knowledge that
eventually migrates to a standard that should be followed.
    •    DoD Systems Engineering community of practice websites
    •    Other Departments and Agencies such as National Aeronautics and Space
         Administration (NASA) or Department of Energy (DoE)
    •    Professional organizations such as the International Council on Systems
         Engineering (INCOSE) or the Institute of Electrical and Electronics Engineers
         (IEEE)
    •    Industry organizations such as National Defense Industrial Association (NDIA) or
         Aerospace Industries Association (AIA)
DEFENSE ACQUISITION GUIDEBOOK
Chapter 5 -- Life-Cycle Logistics
5.0. Overview
Within the Defense Acquisition Management System, DoDD 5000.01 requires that:
"Planning for Operation and Support and the estimation of total ownership costs shall
begin as early as possible. Supportability, a key component of performance, shall be
considered throughout the system life cycle."
5.0.1. Purpose
This chapter provides the associated guidance the Program Manager (PM), Product
Support Manager (PSM), and Life-Cycle Logisticians can use in influencing the design
and providing effective, timely product support capability to achieve the system's
materiel readiness and sustain operational capability. Emphasis is placed on integrating
life-cycle management principles by using performance-based life-cycle product support
strategies to provide effective support. This approach, synchronized with the systems
engineering process, results in affordable materiel readiness at an optimal life-cycle cost (LCC) by
reducing the frequency, duration, and related costs of availability degrader events to
reduce manpower and logistics footprint. An executive summary of key chapter
principles is provided below.
The PM, as the life-cycle manager, is responsible for accomplishing program objectives
across the life cycle, including the operations and support (O&S) phase. Employing
performance-based life-cycle product support tied to sustainment metrics is the
overarching Department of Defense (DoD) concept for providing materiel readiness to
the user. This logistics aspect of the life-cycle management approach is depicted in
Figure 5.0.1.F1 and discussed in subsequent sections.
There are three DoD Decision Support Systems - Joint Capabilities Integration and
Development System (JCIDS), Defense Acquisition System, and Planning,
Programming, Budgeting and Execution (PPBE) process - that frame the environment
for implementing life-cycle management. In addition, there are three related but distinct
communities, with corresponding reporting chains, within the DoD -- the acquisition,
user, and sustainment chains involved in implementing the decision support systems.
Working in tandem, these communities share responsibilities that vary depending on
the life-cycle phase. Consequently, the PM needs to be involved with each chain. The
Defense Acquisition Guidebook focuses on the acquisition chain (e.g. the OSD, Service
Secretariat, Program Executive Officer chain, etc.). Chapter 5 addresses the acquisition
chain and highlights interfaces with the user chain (e.g. the type commander, Theater
Commanders, etc.) and sustainment chain (e.g. supply chain (including the
transportation system, maintenance facilities and depots, industrial base), in-service
engineering organizations, etc.).
During acquisition the focus is primarily through the acquisition community with
requirements input from the user and sustainment communities. These requirements
include:
                            Figure 5.0.1.F1. Life-Cycle Logistics Overview
During operations the focus is primarily through the user and sustainment communities
with support from the acquisition community. The PM's focus is on supporting the user’s
ability to effectively meet mission requirements through the application of systems
engineering to implement continuous process improvement initiatives. This involves
monitoring performance to identify major readiness degraders (e.g., reliability, cycle
time and cost) and to:
    •    Align and refine the product support package (e.g. the product support elements)
         and sustainment processes to achieve the sustainment metrics
    •    Engage the various communities to achieve optimum materiel readiness
    •    Optimize or reduce the logistics demand (including the logistics footprint) and
         support processes (e.g., training, technical data, supply chain, maintenance, etc.)
         based on actual conditions
    •    Reduce operating and support costs
    •    Identify and implement design changes to address evolving requirements,
         technological obsolescence, diminishing manufacturing sources, or materiel
         availability shortfalls.
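Identifying the "major readiness degraders" mentioned above is often approached as a Pareto ranking of downtime contributors. The sketch below shows that idea; the subsystem names, downtime hours, and the 80% cutoff are all notional, not program data.

```python
# Notional Pareto ranking of readiness degraders: sort subsystems by
# their contribution to fleet downtime and flag those that together
# account for roughly 80% of it. All names and hours are illustrative.

downtime_hours = {"radar": 420.0, "hydraulics": 310.0, "avionics": 150.0,
                  "engine": 90.0, "airframe": 30.0}

total = sum(downtime_hours.values())
ranked = sorted(downtime_hours.items(), key=lambda kv: kv[1], reverse=True)

top_degraders, cumulative = [], 0.0
for subsystem, hours in ranked:
    top_degraders.append(subsystem)
    cumulative += hours
    if cumulative / total >= 0.8:
        break
# Radar and hydraulics cover 730 of 1000 h (73%); adding avionics
# reaches 88%, so the top-degrader list is radar, hydraulics, avionics.
```

In practice, the same ranking would feed the design-change and process-improvement decisions listed in the bullets above, focusing resources on the few subsystems driving most of the downtime.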
To accomplish this life-cycle product support concept, outcomes are estimated in the
design phase, then measured during testing and operations, and become the basis for
actions to achieve materiel readiness. The sustainment metrics, including the
Sustainment Key Performance Parameter (KPP) with its supporting Key System
Attributes (KSAs), provide the common thread to integrate the product support elements
and align the behaviors required to achieve the desired materiel readiness outcome
across the entire enterprise. The goal is to use consistent outcome metrics as the basis
for actions to provide and sustain affordable materiel readiness across the entire life
cycle.
5.0.2. Contents
Sections 5.1 through 5.3 present information applicable across the life cycle, while the
information in Section 5.4 has been tailored to specific portions of the life cycle.
Section 5.5, References , provides references for further explanation and information.
5.1. Life-Cycle Sustainment in the Defense Acquisition Management System
    Figure 5.1.1.F1. Sustainment Thread in the Defense Acquisition Management
                                      System
    •    Understanding industrial base capabilities and service capabilities;
    •    Ensuring competition, or the option of competition, at both the prime and
         subcontract level throughout the program life cycle;
    •    Performance-based life-cycle product support strategies to project and sustain
         the force with minimal footprint that support the Sustainment KPP, its associated
         KSAs, and overall affordability goals;
    •    Continuous process improvement, including assessing the life-cycle product
         support strategies, to include end-to-end sustainment chain planning,
         assessment, and execution.
Developing the Product Support Strategy that defines the overall end state is the first
step in achieving product support. In developing the support strategy, each program
should develop an affordable strategy that:
    •    Positions and delivers materiel to satisfy highly variable readiness and combat
         sustainment needs in a variety of unique and demanding environments.
    •    Meets all materiel management and maintenance statutory requirements.
    •    Supports rapid power projection.
    •    Improves readiness through performance-based sustainment strategies.
    •    Establishes end-to-end processes focused on outcomes.
    •    Implements contemporary business systems and practices that enable the
         integration of people, information, and processes.
    •    Protects critical program information, including as it moves through the supply
         chain, as required by DoD Instruction 5200.39.
The support concept has to address the hardware and its associated technical data and
computer software (including Commercial Off The Shelf (COTS) software), since
software can be a major sustainment issue as systems become more software intensive.
Programs need to plan for technology refreshment and maintaining the software after
production. This includes how changes (for obsolescence/ technology refreshment and
maintaining the software) will be budgeted and executed along with the necessary
computer software documentation required to sustain the software throughout the
system life. In addition to sustaining the software, aspects such as customer support,
systems administration, and help desk support need to be considered.
Achieving the support concept and sustaining operational capability requires the
involvement of the logistics, engineering, testing, program management, contracts,
supply chain, and financial management experts. The overall support strategy,
documented in the Life-Cycle Sustainment Plan, should include life-cycle support
planning and address actions to assure sustainment and continually improve product
affordability for programs in initial procurement, re-procurement, and post-production
support. A performance-based product support plan will be used to align the support
activities necessary to meet these objectives.
5.1.1.2. Sustainment Metrics
Minimum Reporting: The specific metrics should be tailored to the program and its
operational and sustainment needs. At a minimum, they should consist of four
interrelated metrics: an outcome metric meaningful to the user in achieving and
sustaining the operating tempo; a materiel metric to measure the system's quality; a
response metric to measure the quality of the logistics system; and a cost metric. They
should be consistently defined within the program and traceable to the operational
need. At the top level, the sustainment metrics should focus on providing an effective
system that is available and reliable with minimal down time at a reasonable cost. Exact
definitions and details can be found in the JCIDS Manual. However, programs have the
flexibility to tailor the metrics (including adding sustainment metrics such as footprint
or manning levels) as long as the intent is met. The following describes the general
intent of each of the metrics:
    •    Materiel Availability - the percentage of the total inventory (not just the
         operationally assigned assets) operationally capable at a given time based on
         materiel condition. This "total inventory" aspect is critical because it not only
         measures the ability to execute "today's" missions but also provides an indication
         of the "surge" ability. Materiel availability is primarily an indication of the
         percentage of time a system is operationally capable of performing an assigned
         mission. In addition to the planned missions/scenarios, operating tempo, and
         sustainment concept of operations (CONOPS), this metric is dependent on
         system reliability and the mean downtime resulting from, but not limited to,
         failures, scheduled downtime, and general maintenance or servicing actions.
    •    Materiel Reliability - the probability the system will perform without failure over a
         specific interval. This metric focuses on reliability of the entire system and should
         not be confused with the mission success rate. Defining the criteria for measuring
         relevant failures (including consistent definitions for failures (e.g., criteria for
         counting assets as "up" or "down") and mission critical systems) and clearly
         defining how time intervals will be measured are important and must be
         consistent with the other metrics.
    •    Mean Down Time - the average time an end item is unavailable to perform its
         assigned mission after it experiences unscheduled or scheduled maintenance
         actions. It includes all time where the system is not at the disposal of the Force
         Provider to initiate missions. In addition to the projected supply chain approach
         with its resultant logistics footprint, the impact of surge/deployment acceleration
         requirements should be determined for this and the Materiel Availability metric.
    •    Ownership Cost KSA - a subset of the operating and support costs, excluding
         manpower, training and indirect support cost. However, to address affordability it
         is important to use operations and support costs to influence program design,
         acquisition, and sustainment alternative decisions. Consequently, pending the
         official JCIDS Manual change, OSD is now requiring programs to report the O&S
         costs along with the Ownership Cost KSA because the program's cost model
         must be consistent with the design specifications as well as the assumptions and
         conditions used for the Materiel Availability, Materiel Reliability, and Mean Down
         Time metrics. In all cases it is critical that the cost structure being used be clearly
         defined (along with the cost estimating relationships/models and assumptions)
         and that all relevant costs for the trade-off decisions are included regardless of
         funding source (see Chapter 3).
The selection of the specific performance metrics should be carefully considered and
supported by an operationally oriented analysis, taking into account technology
maturity, fiscal constraints, and the timeframe in which the capability is required. In
implementing performance-based life-cycle product support strategies, the metrics
should be appropriate to the scope of the product support integrators' and providers'
responsibilities and should be revisited as necessary to ensure they are motivating the desired
behaviors across the enterprise. During operations the program can consider measuring
additional metrics for configuration control, training effectiveness, overall user
satisfaction, etc. The specific metrics selected should tie to existing user performance
measures and reporting systems. In addition, existing logistics and financial metrics
should be related to these top level user performance metrics and considered as
supporting metrics to help provide confidence they can be met as well as identify risk
areas.
5.1.1.3. Performance-Based Life-Cycle Product Support Implementation
Building on the best features of the public and private sectors is a key component of the
support strategy. The Performance-Based Life-Cycle Product Support Implementation
Framework (Figure 5.1.1.3.F1) captures the range of capability solutions that could be
employed. The framework is incremental, in that each alternative builds on the previous
category. In all cases the systems sustainment parameters are projected and measured
during the design process and then re-assessed once the system is operational so
appropriate actions can be taken to achieve the Materiel Availability objective. Within
each category, the program manager is responsible for working with the stakeholders to
ensure the appropriate actions are taken to meet the user's needs. The difference is the
amount of financial risk shared with the product support integrator or provider and the
sustainment aspects covered. The categories do not imply a level of "goodness" but
only provide a means to illustrate the wide range of implementation options available to
the program. Each category is described below.
Category 2: At Level 2, fiscal risks begin to transition, but only in narrow, critical
supply chain functional areas. Typical functions falling within this level include providing
materiel, inventory management, transportation, and/or maintenance where the provider
is accountable for the responsiveness required to meet customer requirements. This
level generally concentrates on providing parts with the government making design
decisions. Part availability, mean down time (MDT), or logistics response time (LRT) are
the typical metrics for Level 2 implementations, where the time it takes the supplier to
deliver the part, commodity, or service to the user determines its payment. In using this
approach, care must be given to the requirements and contract terms to ensure they
drive the supplier's behavior so the government achieves an affordable materiel
readiness outcome.
The PM is still responsible for taking the appropriate actions with the providers;
however, more risks are shared because there are fewer providers with whom to
coordinate. The PM still procures many of the individual product support elements and
manages the system's configuration. The program has to develop performance
requirements, integrate, procure, and balance the elements not included in the
Performance-Based Agreement (PBA) to achieve an affordable materiel availability
outcome.
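To illustrate how a Level 2 metric can determine supplier payment, the sketch below computes a notional award-fee fraction from a measured logistics response time against a Performance-Based Agreement target. The target, decay band, and fee percentage here are invented for illustration; actual incentive structures are defined by the contract terms.

```python
def pba_fee_fraction(measured_lrt_days, target_lrt_days,
                     max_fee=0.10, floor_lrt_multiple=2.0):
    """Notional award-fee fraction: full fee at or under the target LRT,
    decreasing linearly to zero at floor_lrt_multiple x the target."""
    if measured_lrt_days <= target_lrt_days:
        return max_fee
    floor = target_lrt_days * floor_lrt_multiple
    if measured_lrt_days >= floor:
        return 0.0
    # Linear decay between the target and the zero-fee floor.
    slope = max_fee / (floor - target_lrt_days)
    return max_fee - slope * (measured_lrt_days - target_lrt_days)

# Supplier averaged a 6-day response against a 5-day target (floor at 10 days).
print(round(pba_fee_fraction(6.0, 5.0), 3))  # 0.08
```

A structure like this is what the caution above is about: the bands and slopes chosen determine whether the supplier is rewarded for the responsiveness the government actually needs.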
Category 3: This level expands the provider's fiscal risk by transferring life-cycle
support activities to the product support integrator (PSI), making the PSI accountable
for sustaining overall system materiel availability. Category 3 typically focuses on
maintaining the required availability of key components or assemblies, such as a wing
flap or auxiliary power unit, but can include the entire system. In Category 3, there is an
additional PSI focus on life-cycle support, training, maintenance, repair and overhaul
including logistics planning and execution, in-service engineering, configuration
management and transportation. In Category 3, the PSI may also make repair or
replace decisions. The preferred metric is materiel availability.
At this level the product support integrator is assigned specific life-cycle responsibility,
solely or in partnership, for the breadth of processes affecting materiel availability. This
includes aspects of sustainment engineering and configuration control, since the
reliability and maintenance of equipment and the effectiveness of the supply chain
influence continually affordable operational availability.
Contracting for performance-based logistics is a multi-step process that can be
applied to new, modified, or legacy systems. The process is detailed on the web-based
PBL Toolkit as a best practice. It is a proven process, focused on legacy programs, that
can be tailored and adapted to individual systems, subsystems, or components to meet
their needs and their business and operational environments.
Conditions change over the life of any system, so it is critical that performance be
measured against a plan and corrective steps be taken as conditions warrant. These
steps can range from corrective actions anywhere within the program or its supply
chain to re-baselining the metrics. Care should be taken to ensure the appropriate
stakeholders are involved in any requirements change decisions and that the baseline
is not changed so often that it becomes a "rubber baseline."
Monitoring actual performance (or projected performance during design) and then taking the
appropriate corrective actions when needed is critical in achieving and sustaining
performance. During testing, monitoring allows early corrective actions before the
system is deployed. During operations, it can help the PM determine whether the
metrics are driving the desired behaviors, or whether different metrics are needed to
achieve the desired performance. Consequently, the PM should have a strong monitoring and
assessment program structured to fit the unique program conditions. Representatives
from each of the functional areas that drive the metrics should be involved in the
process.
Condition Based Maintenance Plus (CBM+) is a specific initiative that can be
useful in cost effectively sustaining performance. It is the application and integration of
appropriate processes, technologies, and knowledge-based capabilities to improve the
reliability and maintenance effectiveness of DoD systems and components. At its core,
CBM+ is maintenance performed based on evidence of need provided by Reliability
Centered Maintenance (RCM) analysis and other enabling processes and technologies.
CBM+ uses a systems engineering approach to collect data, enable analysis, and
support the decision-making processes for system acquisition, sustainment, and
operations. CBM+ policy is established in DoD Instruction 4151.22.
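As a simplified illustration of the CBM+ "evidence of need" principle, the sketch below triggers maintenance when measured condition indicators cross degradation limits, rather than on a fixed calendar interval. The indicator names and limits are hypothetical, invented for this example rather than drawn from DoDI 4151.22 or any actual RCM analysis.

```python
def maintenance_due(condition_indicators, thresholds):
    """Return the indicators whose measured condition has crossed its
    degradation limit, i.e., the evidence-of-need maintenance triggers."""
    return [name for name, value in condition_indicators.items()
            if value >= thresholds[name]]

# Notional sensor readings against assumed degradation limits
# (higher reading = worse condition in this simplified model).
readings = {"vibration_mm_s": 7.2, "oil_debris_ppm": 35.0, "bearing_temp_c": 95.0}
limits   = {"vibration_mm_s": 6.0, "oil_debris_ppm": 50.0, "bearing_temp_c": 120.0}

print(maintenance_due(readings, limits))  # ['vibration_mm_s']
```

The design point is that only the vibration channel has crossed its limit, so only that maintenance action is generated; a scheduled-interval policy would have serviced (or ignored) all three regardless of condition.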
The program team can often be too close to the day-to-day decisions, so independent
program reviews can be useful in helping ensure the system will be able to maintain or
improve performance. The DoD components each have their own structures to do this,
usually tied to formal program reviews, but the PM should consider bringing in their own
independent reviewers to help in the process and gain lessons learned from other
programs.
changes are made as necessary, to ensure an affordable materiel readiness strategy.
Statutory, Policy, and Guidance Factors. While the PM has latitude in developing the
acquisition strategy, there are statutory requirements that must be taken into account.
Congress has enacted a number of statutes to assure the availability of a ready
and controlled (i.e., government-owned) source of technical competence and resources
to ensure effective and timely response to a national defense contingency requirement
(10 USC 2464) and to ensure that there is a balance between the private and the public
sector industrial base (10 USC 2466 and 10 USC 2474). The support strategy must
ensure compliance with all statutory and regulatory requirements. These requirements
must be considered as an integral and evolving aspect of all Life-Cycle Management
decisions. The PM must also follow Federal Acquisition Regulation (FAR) and Defense
Federal Acquisition Regulation Supplement (DFARS) guidance, as well as appropriate
DoD Directives and Instructions, including DoDD 4151.18 (Maintenance of Military
Materiel), DoDI 4151.19 (Serialized Item Management (SIM) for Materiel Maintenance),
DoDI 4151.22 (Condition Based Maintenance Plus (CBM+) for Materiel Maintenance),
and DoDI 8320.04 (Item Unique Identification (IUID) Standards for Tangible Personal
Property).
Support strategy. PMs must balance multiple objectives in designing the strategy to
achieve operational effectiveness while maintaining affordability. PMs accomplish this
by laying out and executing a support strategy so every part of the product support
package is integrated and contributes to the user's mission capability. To ensure there is
a means to assess performance the PM and product support provider(s) should
redefine and augment system sustainment metrics used to meet system capability
requirements. (Support providers may be public, private, or a mix, to include public
private partnerships. Examples of public support providers include DoD maintenance
depots, DoD Component and Defense Logistics Agency (DLA) inventory control points
and distribution depots.) The PM and the support provider(s) should enter into
Performance-Based Agreements that define the sustainment metrics necessary to meet
the system performance requirements.
A program manager's best means of ensuring a system will meet its sustainment
objectives and satisfy user sustainment needs is to ensure sustainment considerations
are infused in all phases of the program's life cycle. It is especially important that
sustainment considerations are included in Pre-Systems Acquisition and Acquisition
activities, including the Joint Capabilities Integration and Development System (JCIDS)
process, structured program reviews, and tracking sustainment performance drivers
during Test and Evaluation. Even after the Initial Operational Capability (IOC) date, the
support strategy should be periodically reviewed and revised when sustainment metrics
are not being met or requirements change. These actions should be defined in the Life-
Cycle Sustainment Plan (LCSP) and other appropriate program documents.
                       Figure 5.1.2.F1. Key Sustainment Activities by Phase
This section addresses the sustainment aspects that should be included in key program
acquisition documents that cut across life-cycle phases. (Phase-unique documents and
focus areas are addressed in subsequent sections.) To help ensure a shared
understanding of the program's intent, it is important the documents used by the PM in
the acquisition process and program reviews be updated during subsequent phases,
especially prior to milestone decisions.
         categories relative to the logistics support infrastructure (remote sites, organic
         depots, commercial facilities, air bases or ship yards, etc. without naming specific
         locations)
    •    Expected durations of support
    •    Support or maintenance effectiveness metrics and key enablers, such as
         diagnostics/ prognostics
    •    Conditions conducive to joint sustainment and to performance-based support
         strategies
Analysis of Alternatives (AoA). The AoA should describe and include the results of
the supportability analyses and trade-offs conducted to determine the optimum support
concept as part of the preferred system concept. It should also include the assumptions
used in the analyses.
Technology Development Strategy (TDS). The TDS should also include the specific
new sustainment-related technologies required to achieve the Sustainment KPP/KSAs.
Specific emphasis should be placed on technologies required to achieve logistics
performance (including reliability) over what is currently achieved in today's operational
environment.
Acquisition Strategy. The Acquisition Strategy describes the PM's approach for
acquiring the system and its support. The program manager must include the
acquisition strategy for achieving the sustainment metrics and acquiring the product
support package. The Acquisition Strategy should include the key upcoming contracting
actions and the timeline to acquire the product support elements necessary to maintain
the system's readiness and operational capability. Specifically, it should address how
the product support package required to support the materiel management, distribution,
technical data management, support equipment, maintenance, training, configuration
management, engineering support, supply support, and failure reporting/analysis
functions will be acquired. It should also include a summary of the approach for
acquiring key enablers for achieving the sustainment metrics (e.g., using diagnostics,
prognostics, modular open systems approach, reliability growth).
Test and Evaluation Master Plan. Proper testing is critical to achieving the sustainment
metrics' thresholds and objectives. The program manager should therefore ensure the
TEMP includes a description of the requirements and test points/methods for each of
them as well as any appropriate enabler or logistics consideration.
Plan (SEP) Outline.) Accordingly, in developing and updating the SEP, the PM should
integrate sustainment into the program's technical approach by addressing how the:
                                    Figure 5.1.2.1.F1. Sustainment Chart
DoD Instruction 5000.02 requires that an LCSP be developed and provided as part of the
program approval process to document how the sustainment strategy is being
implemented. The LCSP documents the Program Manager's plan for formulating,
implementing, and executing the sustainment strategy so that the system's design as
well as the development of the product support package (including any support
contracts) are integrated and contribute to the Warfighter's mission requirements by
achieving and maintaining the Sustainment KPP/KSAs. The LCSP is a living document
describing the approach and resources necessary to develop and integrate sustainment
requirements into the system's design, development, testing and evaluation, fielding,
and operations. The LCSP should be tailored to meet program needs, documenting the
current program plan in the following areas:
         and timely acquisition, product support, and availability throughout the life cycle,
         including the Program Manager's role in planning for and executing sustainment
    •    The funding required and budgeted by year and appropriation for the main
         sustainment cost categories including operating & support costs
    •    The plan for identifying and selecting sources of repair or support
    •    The sustainment risk areas and mitigation plans
    •    Product support implementation status
    •    Results and recommendations from DoD Component Independent Logistics
         Assessments (ILA)
Figure 5.1.2.2.F1 provides the outline that will be used to document the PM's plan for
how the Product Support Manager will implement the sustainment strategy. Details for
each section and additional information, including mandated content, can be found at
the LCSP web site.
1 Introduction
4.1 Contracts
7 Integrated Schedule
8 Funding
9 Management
9.1 Organization
9.2 Management Approach
10 Supportability Analysis
LCSP Annexes
                                      Figure 5.1.2.2.F2. LCSP Evolution
By Milestone C, the LCSP describes the implementation status of the product support
package (including any sustainment related contracts, e.g. Interim Contractor Support,
Contractor Logistics Support) to achieve the Sustainment KPP/KSAs. In addition to
sustaining the system performance capability threshold criteria and meeting any
evolving user readiness needs, the LCSP details how the program will manage O&S
costs and reduce the logistics footprint. After the Full Rate Production Decision Review
update, the LCSP describes the plans for sustaining affordable materiel availability as
well as accommodating modifications, upgrades, and re-procurement. It should be
updated for any Post-IOC Sustainment Reviews and shall be updated at a minimum
every 5 years, or when:
    •    Subsequent increments are approved and funded to reflect how the support
         strategy will evolve to support multiple configurations.
    •    Significant changes are required to the product support package to achieve the
         objective sustainment metrics including major support provider changes.
As the program matures, the LCSP is updated to reflect increasing levels of detail as
they become available. The detail and focus will vary depending on the life-cycle phase
but in all cases the information should be in sufficient depth to ensure the acquisition,
design, sustainment, and user communities have an early common understanding of
the sustainment requirements, approach, and associated risks. Section 5.4 expands on
the primary focus areas for each life-cycle phase.
LCSP Development. The Program Manager is responsible for the content and
preparation of the Life-Cycle Sustainment Plan. The Product Support Manager is the
PM's focal point for developing this document to function as the program's tool in
managing all sustainment efforts. (Note: If this objective is achieved, then the same
LCSP should effectively serve the needs of decision reviews.)
The Product Support Manager must capitalize on the product support expertise of the
program's Sustainment Integrated Product Team (IPT) to produce a plan that is
useful and credible to all stakeholders charged with executing it. Specifically, in
developing and executing the LCSP, the PM should work with the user, the Product
Support Manager, Product Support Integrator(s), and Product Support Providers to
document performance and sustainment requirements specifying objective outcomes,
resource commitments, and stakeholder responsibilities. Once developed, to help
ensure an integrated team approach, the LCSP should be approved by the Program
Manager, Product Support Manager, Contracting Officer, lead financial analyst and lead
engineer. Last but not least, the best way to ensure that the secondary purpose of
supporting decision reviews is satisfied is to include a representative (Action Officer)
from the appropriate Milestone Decision Authority as a member of the Sustainment IPT.
An effective LCSP serves as the nexus of critical thinking not only for logisticians and
sustainment stakeholders, but for all functional disciplines required to
comprehensively deliver effective and affordable product support. The PSM Guidebook
(section 4) addresses the process that should be used to generate the Product Support
Strategy and the associated plan to implement the strategy. In addition, each section of
the LCSP may require the integration of multiple sections of this chapter, and indeed
multiple sections of the broader Defense Acquisition Guidebook. The following table
provides a mapping of the individual LCSP sections to key relevant sections of this
guidebook that the PSM will find useful for both context and direct guidance in
formulating planning content.
Table 5.1.2.2.T1. Mapping of LCSP Sections to Relevant Guidebook Sections

  LCSP Section                                      Relevant Guidebook Sections
                                                    2.2.15, 2.3.15
  2 Product Support Performance                     5.3
  2.1 Sustainment Performance Requirements          5.1, 5.3, 5.4; 1.3, 2.1, 2.2, 2.3, 3.3, 4.3
  2.2 Demonstrated (tested) Sustainment Performance 5.3, 5.4; 2.0, 2.2, 2.3, 9.1, 9.4, 9.7
  3 Product Support Strategy                        5.1, PSM Guidebook (section 1); 2.2, 2.3
  4.1 Contracts                                     5.1, 5.4; 1.2, 2.2, 2.3, 3.1, 3.2, 3.3,
                                                    3.4, 3.5, 6.5, 10.5, 10.9
  9 Management                                      5.1, PSM Guidebook (all)
  9.1 Organization                                  5.1; 2.2
  9.1.1 Government Program Office Organization      5.1
  9.1.2 Program Office Product Support Staffing     5.1
  Levels
  9.1.3 Contractor(s) Program Office Organization   5.1, 5.4
  9.1.4 Product Support Team Organization           5.1, 5.4; 4.1.4, 6.2, 9.1, 9.6
  9.2 Management Approach                           5.1; 11.3
  9.2.1 Product Support Manager Roles and           5.1
  Responsibilities
  9.2.2 Sustainment Risk Management                 5.4; 4.3.18.22
  10.1 Design Interface                             5.3, 5.4; 2.0.3, 2.1
  10.1.1 Design Analysis                            5.2, 5.4; 4.3.18.19
  10.1.2 Technical Reviews                          5.4; 4.2.8
  10.2 Product Support Element Determination        5.2, 5.4; 4.3.18.22, 4.3.19, 6.3, 9.3,
                                                    11.3.3.1, 11.13
  10.3 Sustaining Engineering                       5.1, 5.4
  11 Additional Sustainment Planning Factors        5.4, 5.5, PSM Guidebook (section 2)
  LCSP Annexes                                      5.1, 5.2, 5.3, 5.4, 5.5
Once a decision has been made that a new system will replace an existing system that is
still required, the Service Secretary sponsoring the new Major Defense Acquisition Program
(MDAP) (or the Commander of the United States Special Operations Command) shall prepare a
Replaced System Sustainment Plan for the existing system. ( 10 USC 2437 ) It will
include, at a minimum, the following, which will require close coordination between any
affected programs:
    •    The budget estimates required to sustain the existing system until the new
         system assumes the majority of mission responsibility. Consequently, it is critical
         that once a program is operational, its LCSP contain the current and required
         funding levels through the FYDP so that the additional funding through disposal
         can be easily added.
    •    The milestone schedule for developing and fielding the new system, including the
         scheduled dates for low-rate initial production, initial operational capability, full-
         rate production, full operational capability, and the date when the new system
         is scheduled to assume the majority of the mission responsibilities of the existing
         system.
    •    An analysis of the ability of the existing system to maintain mission capability
         against relevant threats including:
             o Anticipated funding levels necessary to ensure acceptable reliability and
                availability rates and maintain mission capability against the relevant
                threats.
             o The extent to which it is necessary and appropriate to transfer mature
                technologies from the new system or other systems to enhance the
                mission capability against relevant threats and provide interoperability with
                the new system during the period from initial fielding until the new system
                assumes the majority of responsibility for the existing system mission.
(See sections 10.3, 11.8, and the IPPD Handbook.)
Per DoD Directive 5000.01 , the Program Manager (PM) is accountable for
accomplishing program objectives over the life cycle, including during sustainment.
Consequently, the PM is responsible for the implementation, management, and/or
oversight of activities associated with the system's development, production, fielding,
sustainment, and disposal. Life-cycle management places early and continuing
emphasis on translating performance objectives into an operationally available and
affordable capability over the program life cycle.
The PM's responsibility is to provide the user with a sustainable system and product
support that meets specified performance effectiveness and affordability requirements.
PMs should continually measure, assess, and report program execution in terms of
performance, schedule, sustainment, and cost outcomes. In addressing affordability,
PMs should continuously perform Should-Cost analysis that scrutinizes every element
of government and contractor costs. This includes driving productivity improvements
into the program during contract negotiations and throughout program execution
including sustainment as directed by the implementation of Will-Cost and Should-Cost
Management. These efforts are critical both for establishing budgetary requirements
and for tracking execution success over time for both new and legacy programs. In
accomplishing this, the PM should examine and implement appropriate, innovative,
alternative logistics support practices, including the best public sector and commercial
practices and technology solutions. PMs should determine specific discrete and
measurable items or initiatives that can achieve savings against the Will-Cost estimate.
These actionable items will be presented via the Should-Cost estimate and will be
tracked and managed as part of Should-Cost estimate progress reporting. (See Chapter
2.8.8.3 for additional information.) The choice of logistics support practices is based on
the PM's documented assessment that they can satisfy users in a manner that meets
statutory requirements; is fully interoperable within DoD's operational and logistics
systems and enterprise; will improve schedules, performance, or support; or will reduce
LCC. Regardless of the chosen support strategy, PMs should collaborate with other key
stakeholders, especially the user, to refine and establish logistics support program goals
for cost, customer support, and performance parameters over the program life cycle.
The resultant decisions and planned actions are critical components in the Acquisition
Strategy and the Acquisition Program Baseline.
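The Should-Cost discipline described above lends itself to a simple tracking computation. The sketch below is illustrative only; the initiative names and dollar figures are hypothetical and are not drawn from any actual program:

```python
# Hypothetical sketch: tracking discrete Should-Cost initiatives against a
# Will-Cost baseline. All names and figures are illustrative only.

will_cost = 120.0  # independent Will-Cost estimate, $M

# Discrete, measurable initiatives with targeted and realized savings ($M)
initiatives = [
    {"name": "Negotiate multi-year parts buy", "target": 6.0, "realized": 4.5},
    {"name": "Depot repair turnaround improvement", "target": 3.0, "realized": 3.2},
    {"name": "Obsolescence redesign deferral", "target": 2.0, "realized": 0.0},
]

target_savings = sum(i["target"] for i in initiatives)
realized_savings = sum(i["realized"] for i in initiatives)

should_cost = will_cost - target_savings      # the Should-Cost estimate
current_track = will_cost - realized_savings  # execution position to date

print(f"Should-Cost estimate: ${should_cost:.1f}M")
print(f"Tracking position:    ${current_track:.1f}M")
for i in initiatives:
    pct = 100 * i["realized"] / i["target"] if i["target"] else 0
    print(f'  {i["name"]}: {pct:.0f}% of target realized')
```

The same structure supports the progress reporting described above: each initiative remains discrete and measurable, so realized savings can be rolled up against the Will-Cost baseline at any point in execution.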
During acquisition, the PM's focus is to base major decisions on system-wide analyses
with the full understanding of the life-cycle consequences of those decisions on system
performance and affordability. The emphasis should be on reducing system downtimes
and reducing Life-Cycle Costs through deliberate use of systems engineering analysis
to design out the maintenance burden, reduce the supply chain, minimize mission
impacts and reduce the logistics footprint.
The day-to-day oversight and management of the product support functions are
delegated to a product support manager who is responsible for managing the package
of support functions required to field and maintain the readiness and operational
capability of major weapon systems, subsystems, and components. This includes all
functions related to weapon system readiness including:
    •    Providing weapon systems product support subject matter expertise . The PSM
         shall provide weapon systems product support subject matter expertise to the
         PM for the execution of his or her duties as the total life cycle system manager, in
         accordance with DoDD 5000.01. In support of this PM responsibility, the PSM
         shall have a direct reporting relationship and be accountable to the PM for
         product support consistent with Public Law 111-84 National Defense
         Authorization Act for Fiscal Year 2010.
    •    Developing and implementing a comprehensive product support strategy . The
         product support strategy is designed to assure achievement of warfighter
         capability-driven life cycle product support outcomes documented in
         performance-based agreements, generally expressed in the preferred terms of
         weapon system materiel availability, reliability, and operations and support cost
         affordability. The strategy should identify the execution plan to deliver integrated
         product support (IPS) elements to the warfighter, producing the best value
         balance of materiel readiness and life-cycle costs.
    •    Promoting opportunities to maximize competition while meeting the objective of
         best-value long-term outcomes to the warfighter . Tradeoffs between the benefits
         of long-term relationships and the opportunity for cost reductions through
         competitive processes should be considered together with associated risk.
    •    Seeking to leverage enterprise opportunities across programs and DoD
         Components . Joint strategies are a top priority where more than one DoD
         Component is the user of the respective major weapon system or variant of the
         system. Likewise, product support strategies should address a program's product
         support interrelationship with other programs in its respective portfolio and joint
         infrastructure, similar to what is performed for operational interdependencies.
    •    Using appropriate analytical tools to determine the preferred product support
         strategy . Analytical tools can take many forms (analysis of alternatives,
         supportability analysis, sustainment business case analysis, life cycle impact
         analysis), dependent upon the stage of the program's life cycle. These analytical
         tools shall incorporate the use of cost analyses, such as cost-benefit analyses as
         outlined in Office of Management and Budget Circular A-94, Guidelines and
         Discount Rates for Benefit-Cost Analysis of Federal Programs, as well as other
         appropriate DoD and Service guidance consistent with Public Law 111-84. These
         tools are used to help identify the best possible use of available DoD and
         industry resources at the system, subsystem, and component levels by analyzing
         all alternatives available to achieve the desired performance outcomes.
         Additionally, resources required to implement the preferred alternative should be
         assessed with associated risks. Sensitivity analyses should also be conducted
         against each of the IPS elements and tracked to determine those IPS elements
         where marginal changes could alter the preferred strategy.
    •    Developing appropriate product support arrangements for implementation .
         Development and implementation of product support arrangements should be a
         major consideration during strategy development to assure achievement of the
         desired performance outcomes. These arrangements should take the form of
         performance-based agreements, memorandums of agreements, memorandums
         of understanding, and partnering agreements or contractual agreements with
         product support integrators (PSIs) and product support providers (PSPs),
         depending on the best-value service integrators or providers.
    •    Periodically assessing and adjusting resource allocations and performance
         requirements to meet warfighter needs during strategy implementation .
         Planning, programming, budgeting, and execution of the product support strategy
          need to be accomplished and aligned to the warfighter's performance-based
         agreements with the PM and PSM. PSMs, working in concert with the PM, users,
         resource sponsors, and force providers, should adjust performance levels and
         resources across PSIs and PSPs as necessary, but not less than annually, to
         optimize implementation of the strategy based on current warfighter requirements
         and resource availability.
    •    Documenting the product support strategy in the LCSP . The LCSP describes the
         plan for the integration of sustainment activities into the acquisition strategy and
         operational employment of the support system. The PSM prepares the LCSP to
         document the plan for formulating, integrating, and executing the product support
          strategy (including any support contracts) to meet the warfighter's mission
         requirements. In accordance with Public Law 111-84 and DoDI 5000.02, the
         LCSP shall be updated to reflect the evolving maturity of the product support
         strategy at each milestone, full rate production (FRP), and prior to each change
         in the product support strategy or every 5 years, whichever occurs first. The
         LCSP is approved by the milestone decision authority at each milestone and
         FRP decision. Updates to the LCSP for all major weapons systems after the FRP
         decision shall be approved by the CAE, in coordination with the Deputy Assistant
         Secretary of Defense for Materiel Readiness.
    •    Conducting periodic product support strategy reviews . The product support
         strategy evolves with the maturation of the weapon system through its various life
         cycle phases. At FRP, the LCSP should describe how the system is performing
         relative to the performance metrics and any required corrective actions to ensure
         the metrics are achieved. Reviews and revalidations of the strategy should be
         performed at a minimum of every 5 years or prior to each change in the strategy
         to ensure alignment across system, subsystem, and component levels in support
         of the defined best-value outcomes. In those situations where a support strategy
          is at the weapon system level, the PSM's reassessment should explore potential
         opportunities for evolving toward a portfolio approach. In those situations where
         an LCSP is based on a collection of outcome-based product support strategies at
         the subsystem or component level, the periodic review should explicitly address
          integrated performance at the weapon system level. In all situations, the
         reassessment should consider opportunities to make better use of industry and
         DoD resources. (See the Logistics Assessment Guidebook for additional
         information.)
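The discounted cost comparison called for by OMB Circular A-94 can be illustrated with a small sketch. The discount rates, cost streams, and alternatives below are hypothetical, chosen only to show the mechanics of a present-value comparison and a simple sensitivity check:

```python
# Hypothetical sketch of a discounted cost comparison between two support
# alternatives, in the spirit of OMB Circular A-94. Costs and rates are
# illustrative only, not the Circular's published rates.

def present_value(costs, rate):
    """Discount a stream of year-end costs to present value."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(costs, start=1))

# Annual O&S costs ($M) over a 10-year horizon for two notional alternatives
organic_costs = [12.0] * 10         # steady organic support
pbl_costs = [15.0] * 3 + [9.0] * 7  # PBL: higher start-up, lower steady state

for rate in (0.02, 0.03, 0.05):     # simple sensitivity across discount rates
    pv_organic = present_value(organic_costs, rate)
    pv_pbl = present_value(pbl_costs, rate)
    preferred = "PBL" if pv_pbl < pv_organic else "Organic"
    print(f"rate={rate:.0%}: organic=${pv_organic:.1f}M, "
          f"PBL=${pv_pbl:.1f}M -> {preferred}")
```

In practice the analysis would use the discount rates published with Circular A-94 and would extend the sensitivity check across the IPS elements, as described above.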
Specific guidance in accomplishing these functions can be found in the Product Support
Manager Guidebook . In developing and implementing the performance-based product
support strategy the PSM can delegate responsibility for delivering specific outcomes. In
doing so, while remaining accountable for system performance, the PM and PSM may
employ any number of subsystem PSMs or product support integrator(s) to integrate
support from all support sources to achieve the performance outcomes specified in a
performance-based agreement. They can be further supported by product support
providers (PSPs) who provide specific product support functions.
In accomplishing the outcomes, PSIs should have considerable flexibility and latitude in
how the necessary support is provided. The activities coordinated can include functions
provided by organic organizations, private sector providers, or partnerships. The
following, or any combination of partnerships between them, are candidates for the role:
5.1.3.4. Stakeholders
Stakeholders are those organizations and individuals with a responsibility that is directly
related to the outcome of an action or result. Generally speaking, they can influence the
outcome or are the recipients of the results. The range of
personnel selected to participate as stakeholders is based on the outcome and
processes involved. Typical stakeholders are: users or operators, acquisition
commands, test communities, depots, manpower, personnel & training communities,
maintainers, and suppliers (e.g., DLA, the Inventory Control Point (ICP), US
Transportation Command (TRANSCOM), industry, and other organizations associated
with the sustainment chain).
PBAs formally document the agreed-to level of support and the associated funding required
to meet performance requirements. The PBA with the user states the objectives that
form the basis of the performance-based product support effort. They establish the
negotiated baseline of performance and corresponding support necessary to achieve
that performance, whether provided by commercial or organic support providers. The
PM negotiates the required level of support to achieve the users desired performance at
a cost consistent with available funding. Once the performance and cost are accepted
by the stakeholders, the PM enters into PBAs with the user community which specify
the level of support and performance. Likewise, PMs enter into performance-based
agreements with organic sources and/or contracts with commercial sources which focus
on supporting the users in terms of cost, schedule, and performance. Consequently,
PBAs can describe agreements between 1) user and PM, 2) PM and support
integrator(s), or 3) support integrator and support provider(s). The agreements should
maintain flexibility to facilitate execution year funding and/or priority revisions and spell
out the 1) objective outcomes, 2) performance measures, 3) resource commitments,
and 4) stakeholder responsibilities. (See figure 5.1.4.F1.)
                          Figure 5.1.4.F1. Performance-Based Agreements
Sustainment metrics should provide the objectives that form the basis of the PBAs. The
PBA performance metrics should reflect the highest level of metrics that are most critical
in producing the desired performance outcome(s). Generally, a focus on a few properly
incentivized performance-based outcome metrics such as materiel availability, materiel
reliability, etc. will lead to more effective solutions. However, in developing the
agreements, it may not be possible to directly state these high level performance
objectives as metrics due to the support provider's lack of control over the support activities
necessary to produce the user performance (e.g., availability). This is because some
DoD Component logistics policies and/or guidance mandate a preference for DoD
Component performed maintenance and retail supply functions that cut across multiple
organizations. Accordingly, the PM may select the next echelon of metrics for which the
support provider can be held accountable and which most directly contribute to the
sustainment metrics.
The outcome metric to achieve the user requirements (e.g., materiel availability) should
be a balance between a quality metric (e.g., materiel reliability), a response metric (e.g.,
turnaround time), and a cost metric that is appropriate for the outcome needed. Many
existing logistics and financial metrics can be related to top level user performance
outcomes. These include, but are not limited to, logistics footprint, not mission capable
supply (NMCS), ratio of supply chain costs to sales, maintenance repair turnaround
time, depot cycle time, and negotiated time definite delivery. In structuring the metrics
and evaluating performance, it is important to clearly delineate any factors that could
affect performance, but are outside the control of the support providers.
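One common way to see this balance is the operational availability relationship, in which a reliability metric (mean time between maintenance, MTBM) and a response metric (mean downtime, MDT) jointly determine the availability outcome. The figures below are illustrative only:

```python
# Illustrative only: how a reliability metric and a response metric combine
# into an availability outcome via the common Ao = MTBM / (MTBM + MDT) relation.

def operational_availability(mtbm_hours, mdt_hours):
    """Ao as uptime over total time: MTBM / (MTBM + mean downtime)."""
    return mtbm_hours / (mtbm_hours + mdt_hours)

baseline = operational_availability(mtbm_hours=400.0, mdt_hours=50.0)

# Two different ways to buy an availability improvement:
better_reliability = operational_availability(mtbm_hours=500.0, mdt_hours=50.0)
faster_turnaround = operational_availability(mtbm_hours=400.0, mdt_hours=40.0)

print(f"baseline Ao:            {baseline:.3f}")
print(f"improved reliability:   {better_reliability:.3f}")
print(f"improved response time: {faster_turnaround:.3f}")
```

In this illustration a 25 percent reliability improvement and a 20 percent downtime reduction buy the same availability, which is exactly the kind of trade a balanced set of quality, response, and cost metrics is meant to expose.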
While objective metrics form the bulk of the evaluation of a provider's performance,
some elements of product support might be more appropriately evaluated subjectively
by the user and the PM team. This approach allows some flexibility for adjusting to
potential support contingencies. For example, there may be different customer priorities
to be balanced with overall objective measures of performance.
For support provided by commercial organizations, the contract is the PBA reflecting the
agreed-to user performance requirements. Note that source of support decisions do
not favor either organic or commercial providers. Non-core source of support decisions
should optimize the best public and private sector competencies based on a best value
determination of the provider's capability to meet set performance objectives. The major
shift in the performance-based environment from the traditional approach is how
programs acquire support, not from whom it is obtained. The Sustainment Strategy
normally results in a blend of commercial and organic product support providers built on
the strengths of each to achieve an affordable strategy.
The strategy also relies on an optimum blend of organic and commercial providers. For
example, when executing commercial product support strategies, the use of on-hand
and due-in government inventory should be standard practice of all Performance-Based
Logistics and partnering agreements.
The preferred contracting approach is the use of long-term, firm-fixed-price contracts with
incentives tied to outcome performance to fulfill the product support and integrated
sustainment chain management responsibilities. Consequently, the contract should
provide support over a specific period of time for a predetermined fixed cost per
operating measure. Sustainment contracts should require the delivery of a capability to
the user using a Statement of Objectives or a Performance Work Statement approach.
(Level of effort or labor hour type contracts are not preferred because they limit the
contractor's ability to make necessary trade-offs to meet and/or exceed the threshold
performance outcomes within the funding profile.)
A sustainment contract may take many forms and the degree to which the outcome is
defined varies. It should purchase support as an integrated performance package
designed to optimize system readiness. It must specify performance requirements;
clearly delineate roles and responsibilities on both sides; specify metrics and their
definitions; include appropriate incentives; maximize the use of government-owned
inventory before procuring the same parts from private contractors; and specify how
performance will be assessed. The contract should cover the procurement of a
capability to support the user versus the individual parts or repair actions and provide
the ability to manage support providers.
Award term contracts should be used where possible to incentivize industry to provide
optimal support. Incentives should be tied to metrics tailored to reflect the DoD
Component's specific definitions and reporting processes. Award and incentive
contracts should include tailored cost reporting to enable appropriate contract
management and to facilitate future cost estimating and price analysis. Sustainment
contracts should strive to specify a fixed cost per outcome, whether per operating
measure (e.g., hour, mile, or cycle) or per event (e.g., launch), rather than a
cost-plus contract. However, lack of
data on systems performance or maintenance costs or other pricing risk factors may
necessitate cost type contracts until sufficient data is collected to understand the risks.
Full access to DoD demand data should be incorporated into any contracts. The
contracts should be competitively sourced wherever possible and should make
maximum use of small and disadvantaged businesses.
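The fixed-cost-per-operating-measure arrangement described above can be sketched as follows; the rate, availability threshold, and incentive structure are hypothetical, intended only to show how an incentive fee can be tied to an outcome metric:

```python
# Hypothetical sketch: fixed cost per operating hour with a simple incentive
# tied to achieved availability. Rates and thresholds are illustrative only.

FIXED_RATE_PER_HOUR = 4_200.0   # $ per operating hour, agreed in the contract
AVAILABILITY_THRESHOLD = 0.85   # contract threshold for the outcome metric
INCENTIVE_POOL = 250_000.0      # maximum incentive fee per period, $

def period_payment(operating_hours, achieved_availability):
    """Base payment plus an incentive fee earned in proportion to performance
    above the availability threshold (capped at the pool)."""
    base = operating_hours * FIXED_RATE_PER_HOUR
    if achieved_availability <= AVAILABILITY_THRESHOLD:
        fee = 0.0
    else:
        # linear fee from the threshold up to 1.0 availability
        earned = ((achieved_availability - AVAILABILITY_THRESHOLD)
                  / (1.0 - AVAILABILITY_THRESHOLD))
        fee = min(earned, 1.0) * INCENTIVE_POOL
    return base, fee

base, fee = period_payment(operating_hours=1_000, achieved_availability=0.91)
print(f"base payment: ${base:,.0f}, incentive fee: ${fee:,.0f}")
```

A cost-plus arrangement, by contrast, would reimburse actual costs regardless of the availability achieved, which is why the fixed-rate structure is preferred once pricing risk is understood.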
Contracts must follow Federal Acquisition Regulation (FAR) and Defense Federal
Acquisition Regulation Supplement (DFARS) guidance, as appropriate, for the
acquisition of logistics services and support throughout the program life cycle. In
addition, competition over the entire life cycle can be a valuable tool for achieving
affordable sustainment. Consequently, early in the program, the PM should consider the
cost versus benefit of the data and other support elements required to achieve
competitive versus sole source contracting for sustainment functions (e.g. parts, repairs
and other supply chain processes).
The contracting methodology is a multiple step process that can be applied to new,
modified, or legacy systems at the system, subsystem, or major assembly level
covering a range of functions depending on program unique circumstances. Additional
guidance is contained at https://acc.dau.mil/pbl but the steps can be summarized in the
following general areas:
Define the Requirements: The initial step is to determine the performance outcome
metric that meets the user's needs rather than to rely on discrete transactional
logistics functions. Care should be taken to ensure the cost and performance metrics
selected focus on measurable contractor behavior.
Defining and documenting the requirement involves answering three key questions:
Who are the key stakeholders? What are the operations and sustainment environment
and infrastructure? What are the total system cost and performance objectives? To
develop an effective contracting strategy, the PM needs to identify the risks and benefits
to achieve the desired outcome. Evolving system sustainment requirements should be
continually balanced against long-term financial resources.
In determining the extent to which contracts will be used, the PM should determine the
best mix of public and private sector capabilities to meet evolving user requirements,
joint sustainment opportunities, and the statutory requirements. ( DoD Directive
5000.01, E1.1.17 ) This involves identifying the best mix in terms of: capability, skills,
infrastructure, opportunities for partnering, compliance with Title 10, public/private
flexibility, and affordability for each support function. As operating scenarios and
technologies change, supportability related performance requirements may change.
Thus, refining system requirements and the resources to meet them is a continual
management process.
Develop and Award the Contract: From a sustainment perspective, contracts should
be structured to balance three major objectives throughout the life cycle of the system:
1) delivering sustained materiel readiness; 2) minimizing the requirement for logistics
support through technology insertion and refreshment; and 3) continually improving the
cost-effectiveness of logistics products and services. Careful balancing of investments
in logistics and technology to leverage technological advances through the insertion of
mature technology is critical. In addition, the PM should ensure the contract addresses
user requirements during peacetime, contingency operations, and war and provides for
the capability for the government to compete or take over the sustainment responsibility
in the future.
Contract development is a lengthy, complex process, led by the PM, involving multiple
stakeholders. No two contracts are exactly the same; each must be tailored to the
unique requirements of the system considering, at minimum, the factors and criteria
listed below:
    •    Statutory requirements: Title 10 U.S.C. 2460 , 2464 , 2466 , 2469 , and 2474
         (Core, 50/50, public/private partnering, etc.). Depot maintenance partnerships
         can be effective tools to implement performance-based product support
         arrangements if properly structured. The contracts should allow partnering with
         public depot maintenance activities to satisfy the requirements. (Examples and
         further discussion of public private partnerships can be found on the Acquisition
         Community Connection web site.)
    •    Regulatory requirements: DoD (e.g., DFARS) and DoD Component policy
         (including contractors on the battlefield, service performance of organizational
         level support functions, etc.).
    •    Financial Enablers: Ensuring the financial enablers are commensurate with the
         risks.
reasons sustainment strategies need to be considered early in a program's life cycle.
The PSM needs to pay particular attention to the following areas of the Technical Data
Rights Strategy, as well as its execution, to ensure that all data and software required to
successfully sustain the system are available throughout the system's life cycle:
Program Managers establish and maintain a configuration control program, and are
required to "base configuration management decisions on factors that best support
implementing performance-based strategies throughout the product life cycle" ( DoD
Directive 5000.01 ). An effective configuration management program should include
configuration control over the functional and allocated baselines as well as the physical
baseline. The approach and responsibility for maintaining configuration control will
depend on a number of program specific factors such as design rights, design
responsibility, support concept, and associated costs and risk. Nominally the
government maintains configuration control of the system design specification and
retains the authority/responsibility for approving design changes impacting the system’s
ability to meet specification requirements. The contractor(s) has the right to access
configuration data at any level required to implement planned or potential design
changes and support options.
plan drives the level of government configuration control and support element
requirements. During the maintenance planning process, factors such as reliability and
volatility of the design technology are used to determine how the system/component will
be supported, e.g., throwaway or repair, and commercial or organic repair.
Also, in PBL contracts, provisions should be made to protect the government in the
event the contractor is unable to provide the contracted performance and at contract
conclusion. Technical Data and Computer Software are important components of the
configuration management process so it is vital the PM understand the level of access
to Technical Data Packages (TDPs) and the appropriate levels of computer software (to
include source code when necessary) required to successfully procure, compete, and
sustain the system over its entire life cycle. This level will vary from system to system
and often down to the component or part level. Specific clauses must be included in the
contract to ensure the government retains access to or takes control of the necessary
TDP(s) and software and their corresponding updates. This ensures the government will
have the data necessary to duplicate the existing configuration with little to no
interruption in the support provided to the user if the support provider changes or the
contract is re-competed. Without this exit ramp, the government will not be able to cost
effectively re-compete a system and/or component.
Figure 5.2.F1 depicts the Life-Cycle Management System and relates key sustainment
design and systems engineering activities. (Figure 5.2.F1 provides an overview
roadmap during the acquisition process. Expanded versions are shown by phase in
section 5.4 .) These system engineering processes are not carried out in a strictly linear
progression; they are typically carried out iteratively, expanding into lower levels of
detail as the design evolves. Incremental acquisition presents challenges in both
acquisition and sustainment activities. An obvious example is the cost and
configuration management burden that can arise with multiple configurations of end
items as well as of the support system. This should be addressed early in development
and evolution of the acquisition strategy. If planned correctly, configuration management
efforts combined with rapid prototypes can provide the PM the opportunity to observe
and evolve tentative support strategies. Conversely, poor management
of multiple system configurations can create a significant sustainment burden.
Short term pressures to achieve system performance and schedule imperatives are
very real, and cannot be ignored in a financially and statutorily constrained environment.
However, system sustainability and affordability are also important program elements to
be considered. Consequently, CJCS Instruction 3170.01 established the Sustainment
Key Performance Parameter and KSAs to reinforce the total life-cycle approach to
program decisions. This is because a system that meets performance requirements but
saves acquisition dollars by not expending the resources to make it reliable,
maintainable, or supportable is a liability to the user. Ultimately, over the system life
cycle, balancing this composite of long term objectives will provide greater benefit.
                       Figure 5.2.F1. Supportability Analysis in Acquisition
between the design and the process efficiencies used to operate and support the
system, including the product support package and the supply chain. In addition, each
of its elements directly influences the life-cycle cost. The key is to ensure mission
effectiveness is defined in terms meaningful to the Warfighter over a meaningful
timeframe (e.g., the number of systems required to move X ton-miles in a 30-day
period, or the number of systems required to provide continuous surveillance coverage
over a 60,000 square mile area for a 6-month period).
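To make such a mission-level metric concrete, the fleet-sizing arithmetic it implies can be sketched as follows (the demand, capacity, and availability figures here are purely hypothetical illustrations, not values from this Guidebook):

```python
import math

def systems_required(mission_demand, per_system_capacity, materiel_availability):
    """Estimate the number of end items needed to meet a mission-level demand.

    mission_demand: total output required over the period (e.g., ton-miles).
    per_system_capacity: output one fully available system delivers in the period.
    materiel_availability: fraction of the fleet expected to be operational (0-1].
    """
    if not 0 < materiel_availability <= 1:
        raise ValueError("availability must be in (0, 1]")
    effective_capacity = per_system_capacity * materiel_availability
    return math.ceil(mission_demand / effective_capacity)

# Hypothetical: move 900,000 ton-miles in 30 days; each truck delivers
# 10,000 ton-miles in that period when available; fleet availability is 0.75.
fleet_size = systems_required(900_000, 10_000, 0.75)
print(fleet_size)  # 120
```

The point of the sketch is that the supportability parameter (availability) enters the mission-effectiveness calculation on equal footing with technical performance (capacity).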
The design effectiveness reflects key design features - technical performance and
supportability features. These system aspects should be designed-in synergistically and
with full knowledge of the expected system missions in the context of the proposed
system operational, maintenance, and support concepts. To be effective, technical
performance and supportability objectives should be defined in explicit, quantitative,
testable terms. This is important to facilitate trade-offs as well as the selection and
assessment of the product and process technologies. Each of the major elements
controlled by the program manager in the design process is addressed below.
                  Figure 5.2.F2. Affordable System Operational Effectiveness
In this context, supportability (see sections 5.3 and 4.3.18.22 ) includes the following
design factors of the system and its product support package:
         contribute to an optimum environment for sustaining an operational system.
Process efficiency reflects how well the system can be produced, operated, serviced
(including fueling) and maintained. It reflects the degree to which the logistics processes
(including the supply chain), infrastructure, and footprint have been balanced to provide
an agile, deployable, and operationally effective system. While the program manager
does not fully control this aspect, the program directly influences each of the processes
via the system design and the fielded product support package. Achieving process
efficiency requires early and continuing emphasis on the various logistics support
processes along with the design considerations. The continued emphasis is important
because processes present opportunities for improving operational effectiveness even
after the "design-in" window has passed via lean-six sigma, supply chain optimization
and other continuous process improvement (CPI) techniques. Examples of where they
can be applied include supply chain management, resource demand forecasting,
training, maintenance procedures, calibration procedures, packaging, handling,
transportation and warehousing processes.
The relationships illustrated in Figure 5.2.F2 are complex and not as clean as shown in
the figure. Figure 5.2.F3 more accurately depicts how the basic system operational
effectiveness elements interact. For example, each of the supportability elements
influences the process aspects, which in turn can impact supportability (e.g., while
reliability drives the maintenance requirements, the implemented maintenance
processes and the quality of the spare and repair parts, as reflected in the producibility
features, can impact the resultant reliability). In addition, how the system is operated will
influence the reliability and both can be influenced by the logistic processes. Last but
not least, each of the design and process aspects drives the life-cycle costs. Achieving
the optimal balance across these complex relationships requires proactive, coordinated
involvement of organizations and individuals from the requirements, acquisition,
logistics, and user communities, along with industry. Consequently, because of the
complexity and overlapping interrelationships, full stakeholder participation is required
activities related to achieving affordable mission effectiveness. Models that simulate the
interactions of the elements, as depicted in Figure 5.2.F3, can be helpful in developing a
balanced solution.
   Figure 5.2.F3. Affordable System Operational Effectiveness Interrelationships
Each of the elements reflected in Figure 5.2.F2 contributes to achieving the top-level
affordable operational effectiveness outcome and has associated metrics that can be
measured to assess efficiency and effectiveness. However, they don't mathematically
add up as implied in Figure 5.2.F2. This is because, in addition to the complex
interrelationships between the elements, the various stakeholders only measure
portions of the supply chain and often use different metric definitions. Consequently,
DoD has adopted four key sustainment metrics (including the Sustainment KPP and two
KSAs) for projecting and monitoring key affordable operational effectiveness
performance enablers to:
Figure 5.2.F4 indicates the minimum set of sustainment metrics the PM should use to
facilitate communication across the stakeholders and the elements affecting them. The
color code indicates the elements measured by Materiel Availability, Materiel Reliability
and Mean Down Time metrics. The metrics are interrelated and, along with the
CONOPS, impact the LCC.
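As a hedged illustration of how these metrics are commonly computed (the fleet figures below are hypothetical, and the authoritative KPP/KSA definitions are established in the program's JCIDS documents), the basic arithmetic can be sketched as:

```python
def materiel_availability(operational_end_items, total_inventory):
    """Sustainment KPP: fraction of the total fleet that is operationally capable."""
    return operational_end_items / total_inventory

def materiel_reliability(operating_hours, failures):
    """KSA: mean operating hours between failures (an MTBF-style measure)."""
    return operating_hours / failures

def mean_down_time(total_down_hours, downing_events):
    """Average elapsed time a system is down per downing event."""
    return total_down_hours / downing_events

# Hypothetical fleet of 200 aircraft, 150 operational on an average day:
am = materiel_availability(150, 200)        # 0.75
mtbf = materiel_reliability(120_000, 400)   # 300.0 operating hours per failure
mdt = mean_down_time(24_000, 400)           # 60.0 hours down per event
print(am, mtbf, mdt)
```

The interrelationship the text describes is visible in the sketch: for a given CONOPS, driving failures down (higher Materiel Reliability) or restoring systems faster (lower Mean Down Time) both raise the achievable Materiel Availability.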
This overarching perspective provides context for the trade space available to a PM and
for articulation of the overall objective of maximizing the operational effectiveness. This
is critical because trade-offs outside the trade space (i.e., program parameter changes)
can require approval of both the Milestone Decision Authority and Validation Authority
since validated KPP threshold values cannot be reduced without Validation Authority
approval. Consequently, it is critical that the design trade space defined by the values
selected for the sustainment metrics be established early and be acceptable to the user
and acquirer communities. As a result, the user and sponsor should be involved with the
determination of the design trade space. Finally, to help ensure the metrics goals are
met, the program should establish supporting metrics for key drivers (e.g., logistics
footprint, manning levels, ambiguity rates for diagnostics) uniquely tailored for the
system and the projected operating environment as the design requirements are
allocated.
5.2.1. Supportability Analysis
Section 5.4 provides areas of focus for each acquisition phase. In general, however,
life-cycle management can be thought of in terms of three broad periods.
         should be used to ensure the preferred System Concept & Support CONOPS
         are consistent with the projected Operational CONOPS, taking into account "real
         world" constraints including "core", statutory requirements, existing supply chain,
         etc. Generally this is done by considering the sustainment effectiveness and
         O&S affordability of systems currently providing the same or similar capabilities.
         These are analyzed and serve as benchmarks to assess alternatives; with the
         intent of incremental improvement over current (legacy) system capability
         readiness and cost.
    •    Evaluate product support capability requirements using a notional Support
         CONOPS for trades and LCC estimates in evaluating the alternatives.
    •    Identify enabling sustainment technology needed to meet life-cycle sustainment
         goals especially when the risk of achieving the incremental improvements is high
         (e.g., a robust software architecture, health management, diagnostics,
         prognostics, etc.).
    •    Assess the operational and life-cycle risks associated with sustainment
         technologies, especially those requiring development.
    •    Assess the intellectual property considerations, to include the technical
         data rights and computer software needed to sustain and support a system.
    •    Integrate supportability performance into systems engineering, initial acquisition
         strategic planning, and as benchmark criteria for test and evaluation.
    •    Refine associated performance requirements based on technology development
         results (positive and negative) to achieve the preferred system concept &
         Support CONOPS.
    •    Refine supportability performance requirements and life-cycle sustainment
         concepts, based on evolving technology and changes in the CONOPS.
Acquisition: Here, supportability analysis helps reduce risk and create/field the
system and its supply chain by providing feedback into the design process. This is
accomplished by assessing the effect of system plans, development, and production on
sustainment effectiveness, readiness, and O&S affordability. The intent is to act early to
mitigate evolving circumstances that may adversely impact deployed readiness. This
includes using systems engineering in designing the system and its supply chain;
producing both concurrently; and testing to verify the total system requirements have
been achieved. Specifically, systems engineering is used in designing for support and:
              o LCC drivers for the system, its support concept, and maintenance
                concept/plan.
            o The optimum mix of driver values to meet KPPs and their corresponding
                confidence levels.
            o Effectiveness (KPP/KSA Outcomes) if the supply chain performs at
                today's levels (as well as if current trends continue or with anticipated
                trends).
    •    Taking the test/initial operations results and predicting likely mature values for
         each of the KSA and enabler drivers.
    •    Providing integrated Sustainment KPP/KSA estimates into the Defense
         Acquisition Management Information Retrieval (DAMIR) system.
During this period, more realistic and detailed data are used in the models/simulations
to reduce risk and define achievable performance and sustainment requirements.
Consequently, a mix of design estimates/contract requirements, sustainment, and
Maintenance Plan metrics are used when conducting sustainment trades/analyses,
depending on the period and objective. In addition, expected trends for system, enabler,
and supply chain metrics and their confidence levels are also needed, requiring the use
of data models. This requires that:
During this period, the system program measures and tracks the supply chain and its
effectiveness and uses models that include the driver metrics to determine root causes
of problems or anticipate future problems.
                            Figure 5.2.1.2.F1. Supportability Relationships
standards for this method.
The technical input and maintenance task analysis provide a detailed understanding of
the necessary logistics support element requirements to sustain required materiel
availability. The MTA process identifies support tasks and the physical location where
they will be accomplished, considering the costs, availability implications, and statutory
requirements. (The Depot Source of Repair (DSOR) process is key in determining
location.) This in turn produces a product support package that identifies support
element requirements and associated product data based on the system reliability and
maintainability. The product support package provides descriptions of the following
topics:
The steps shown in figure 5.2.1.2.F1 are not necessarily carried out in a linear
progression. Design increments and the continuous assessment of test results and in-
service system performance will identify needs for system improvements to enhance
reliability and maintainability and to overcome obsolescence, corrosion, or other
sustainment problems. Additional information including a detailed process description,
considerations in implementing the process and data element definitions, can be found
in MIL-HDBK-502 . (Note: This document is currently in the update process.)
DoD Instruction 4151.20 .)
Core Logistics Capability. Title 10 U.S.C. 2464 and DoDI 4151.20 require core
logistics capability that is government-owned and government-operated (including
government personnel and government owned and operated equipment and facilities)
to ensure a ready and controlled source of technical competence with the resources
necessary to ensure effective and timely response to mobilization, national defense
contingency situations, or other emergency requirements. These capabilities must be
established no later than 4 years after achieving IOC. These capabilities should include
those necessary to maintain and repair systems and other military equipment that are
identified as necessary to fulfill the strategic and contingency plans prepared by the
Chairman of the Joint Chiefs of Staff. (Excluded are special access programs, nuclear
aircraft carriers, and commercial items, as defined by Title 10 U.S.C. 2464.) Core
logistics capabilities should be performed at government owned-government operated
(GO-GO) facilities of a military department. Such facilities should be assigned sufficient
workload to maintain these core capabilities and ensure cost efficiency and technical
competence in peacetime while preserving the surge capacity and reconstitution
capabilities necessary to fully support strategic and contingency plans.
Depot Source of Repair (DSOR) Analysis. The process to help the PM select the best
value in depot maintenance support is implemented through the Depot Source of Repair
(DSOR) analysis. The Depot Source of Repair Guide provides additional information for
accomplishing the required Core Logistics Analysis/Source of Repair Analysis in
determining the source of repair for depot level workload. The DSOR decision process
is an integral part of sustainment planning and mandatory for systems/equipment
requiring depot maintenance. DoD Directive 4151.18 , Maintenance of Military Materiel,
requires DSOR assignments be made by the PM using the DSOR assignment decision
logic. The process should be completed before entering into firm commitments or
obligating funds for other than interim depot support. The DSOR decision is typically
made during the Engineering & Manufacturing Development and the Production and
Deployment phases.
The DSOR decision process consists of two major elements, normally performed
sequentially: The first is the organic versus contract source of repair determination. This
determination is made by the PM using a DoD Component approved analysis process
that gives consideration to core requirements. Title 10 USC 2464 , Core Logistics
Capabilities; Title 10, USC 2466 , Limitations on the Performance of Depot Level
Maintenance of Materiel, and DoD Directive 4151.18 provide further guidance for this
process.
133(I), MCO P4790.10B, and DLAD 4151.16 . All new acquisitions, equipment
modifications, and items moving to or from contract depot maintenance support are to
be reviewed for interservice potential in accordance with this regulation.
The DSOR decision process has the potential to reduce program costs by effectively
using commercial and organic depot maintenance resources. The process helps ensure
the DoD Components maintain the core depot maintenance capability, as required by
statute, that meets military contingency requirements and considers interservice depot
maintenance support and joint contracting. In performing this analysis, the PM should
ensure that maintenance source of support decisions comply with the following statutory
requirements.
5.2.2. Life-Cycle Costs (LCC) and Product Support Business Case Analysis (BCA)
LCC is the cost to the government to acquire and own a system over its useful life. LCC
includes all life-cycle management costs (e.g., development, acquisition, operations,
support, and disposal). As such it consists of the elements of a program's budget, as
well as supply chain or business processes costs that logically can be attributed to the
operation of a system. Section 3.1.5 provides additional information but generally LCC
includes direct program costs as well as any allocable "indirect cost" elements. This can
include such costs as delivering fuel/batteries, recruiting/accession training of new
personnel, individual training, environmental and safety compliance, management
headquarters functions, etc.
Early program decisions ultimately determine and drive the LCC, the majority of which is
incurred after a system is deployed. Consequently, beginning with the requirements
determination and during each life-cycle phase, LCC estimates should play a major role
in the program decision process for evaluating affordable alternatives during the design
and trade-off processes. (See DoD Directive 5000.01, E1.1.4, E1.1.17, and E1.1.29 .)
As a result, the operating and support portion of the LCC is now treated as a military
requirement via the JCIDS Ownership Cost KSA. For this reason, LCC analysis should
be performed to the level appropriate for the decision and alternatives considered.
However, since projections are based on assumptions, cost estimates shall include an
assessment of confidence levels and should also include the associated cost drivers.
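As a minimal sketch of such an estimate (the cost elements and dollar figures below are hypothetical; a real estimate would follow the program's approved cost element structure and include confidence levels), an LCC roll-up that surfaces its cost drivers might look like:

```python
# Hypothetical LCC elements in then-year $M; a real estimate would use the
# program's cost element structure and attach confidence levels to each figure.
lcc_elements = {
    "development": 800,
    "acquisition": 3_200,
    "operations": 2_500,
    "support": 4_100,   # O&S typically dominates after a system is deployed
    "disposal": 150,
}

total_lcc = sum(lcc_elements.values())

# Rank the elements to highlight the cost drivers the analysis should focus on.
drivers = sorted(lcc_elements.items(), key=lambda kv: kv[1], reverse=True)

print(f"Total LCC: ${total_lcc}M")
for name, cost in drivers:
    print(f"  {name}: ${cost}M ({cost / total_lcc:.0%})")
```

In this hypothetical case the operating and support elements dominate the total, which is why early design decisions that shape O&S cost carry so much weight.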
The Product Support Business Case Analysis (BCA) is used to assist in identifying the
product support strategy that achieves the optimal balance between Warfighter
capabilities and affordability. (Other names for a BCA are Economic Analysis, Cost-
Benefit Analysis, and Benefit-Cost Analysis. Regardless of the name, it is a structured
analytical process that aids decision making in identifying and comparing alternatives by
examining the mission and business impacts (both financial and non-financial), risks,
and sensitivities.) The PSM should prepare a Product Support BCA for major product
support decisions, especially those that result in new or changed resource
requirements. The BCA should consider organic and commercial alternatives when
determining the optimal support solution (e.g., DLA, TRANSCOM, Service activities, and
commercial options). Each of the key stakeholders should be informed of the BCA
process and support the analysis by providing the information needed to make an
informed decision. To aid this process, the Product Support BCA Guidebook provides
an analytic, standardized, and objective foundation upon which credible decisions can
be made.
Life-cycle cost analysis can be very effective in reducing the LCC of the system and its
support strategy. (Within DoD, reduction and control of LCC is also pursued through a
variety of initiatives, including Will-Cost and Should-Cost Management.) However,
one cost model is not sufficient to address all of the alternatives a PM must consider.
The level of detail, analysis process used, and LCC elements considered should be
tailored to the decision being made, focusing on cost drivers and costs that will be
incurred by the government, not just on direct program office costs. The objective is
to seek out and eliminate low-value-added elements of program costs.
For most decisions, sunk costs, costs that will not be affected by the alternatives,
and the absolute values of the alternatives can be ignored. The analysis should focus
instead on the relative cost element differences between the alternatives considered
and the cost drivers for each. Consequently, the cost analysis should include
appropriate key performance measures, such as O&S cost per operating hour, cost per
pallet-mile, cost per seat-mile, etc., when assessing alternative solutions. The Cost
Analysis Requirements Description (see section 3.4.4.1 ) reflects the life-cycle
sustainment requirements for preparing the LCC estimate and the Cost Analysis
Improvement Group Operating and Support Cost Estimating Guide also provides useful
information relative to the cost estimating process, approach, and other considerations.
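The focus on relative differences and per-unit measures can be sketched as follows; the alternatives, annual costs, and operating hours are hypothetical:

```python
# Sketch: compare support alternatives on relative cost differences and a
# cost-per-operating-hour measure, ignoring sunk costs common to both.
# All figures are hypothetical.

alternatives = {
    #                  (O&S cost $M/yr, operating hours/yr)
    "organic_depot":      (520.0, 260_000),
    "contractor_support": (475.0, 260_000),
}

baseline = "organic_depot"
base_cost, _ = alternatives[baseline]
for name, (cost, hours) in alternatives.items():
    delta = cost - base_cost       # relative difference, not absolute value
    cph = cost * 1e6 / hours       # O&S cost per operating hour ($)
    print(f"{name}: delta ${delta:+,.0f}M/yr, ${cph:,.0f}/operating-hour")
```

The deltas against a baseline carry the decision-relevant information; the per-operating-hour figure normalizes the comparison across alternatives with different usage profiles.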
M&S can be an effective tool in the supportability analysis and evaluation process in
implementing life-cycle management principles because all the sustainment/materiel
readiness driver metrics can be considered in parallel (also see section 4.3.19.1 ).
Consequently, the sustainment M&S objective should be to use validated models to
consider materiel availability/readiness implications when assessing the merits of
alternatives throughout the life cycle. M&S should be used in assessing the alternatives
for major decisions affecting the design and deployment of both the end item and its
support system. Properly applied M&S encourages collaboration and integration among
the varied stakeholders (including the test and transportation communities) facilitating
materiel availability and system effectiveness.
The models should be used throughout the life cycle and should include the multiple
materiel availability stakeholder contributions and funding streams for the supply chain
components. (The level of detail used varies based on several factors including, but not
limited to, the system's complexity, criticality to the user, program phase, and risk.) In all
cases, M&S efforts should consistently and credibly look at/trade off life-cycle
alternatives in a repeatable fashion. In addition, the underlying assumptions and drivers
for the values of each of the sustainment metrics should be documented as thresholds,
objectives, and estimates evolve through the life cycle. (See the RAM-C Guide for
additional information.)
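As a minimal illustration of such a model (a sketch, not a validated model), the Monte Carlo loop below estimates operational availability from assumed exponential failure and restore times; both parameters are hypothetical:

```python
# Sketch: Monte Carlo estimate of operational availability from assumed
# failure and restore behavior. Distributions and parameters are illustrative,
# not a validated sustainment model.
import random

random.seed(1)
MTBF = 400.0          # assumed mean time between failures (hours)
MEAN_DOWNTIME = 24.0  # assumed mean downtime per failure (hours)

def simulate(total_hours=100_000.0):
    """Alternate up/down periods and return the fraction of time up."""
    up = down = t = 0.0
    while t < total_hours:
        uptime = random.expovariate(1.0 / MTBF)
        downtime = random.expovariate(1.0 / MEAN_DOWNTIME)
        up += uptime
        down += downtime
        t += uptime + downtime
    return up / (up + down)

ao = simulate()
print(f"Simulated Ao ~ {ao:.3f} (analytic MTBF/(MTBF+MDT) = {400/424:.3f})")
```

Even this toy version shows why the metrics must be considered in parallel: availability moves with both the reliability assumption (MTBF) and the support-system assumption (downtime per failure).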
The Supply Chain Operations Reference (SCOR) model, figure 5.2.4.F1, captures a
consensus view of the supply chain processes (plan, source, maintain/make, deliver,
and return) in a framework linking business process, metrics, best practices, and
technology features into a unified structure for effective supply chain management and
for improving related supply chain activities. In this context, the supply chain includes
the transportation and maintenance chains as well as the spare/repair parts chain
required to provide the user flexible and timely materiel support during peacetime,
crises, and joint operations. Most of these supply chain activities are governed by DoD
Regulation 4140.1-R, Supply Chain Materiel Management Regulation, which provides
further DoD guidance and information. Maintenance requirements within the supply
chain are governed by DoD Directive 4151.18 , Maintenance of Military Materiel.
Building off the SCOR efforts, the Design Chain Operations Reference (DCOR)
model links business process, metrics, best practices and technology features into a
unified structure to support communication among design chain partners and to improve
the effectiveness of the extended supply chain. The model is organized around five
primary management processes focused on product development and research &
development. As in the case of SCOR, this consensus model can be used to describe
design chains, whether simple or complex, using a common set of definitions.
5.3. Supportability Design Considerations
Figure 5.3.1.F1 lists key system architecture attributes which can provide a solid
sustainment foundation. The focus on openness, modularity, scalability, and
upgradeability is critical to implementing an incremental acquisition strategy. In addition,
the architecture attributes that expand system flexibility and affordability can pay
dividends later when obsolescence and end-of-life issues are resolved through a
concerted technology refreshment strategy. However, trade-offs are required relative to
the extent to which each attribute is used, as illustrated in the Commercial Off-the-Shelf
(COTS) case.
or minimize single source options.
Modular Open Systems Approach (MOSA) . Open system architectures help mitigate
the risks associated with technology obsolescence and promote subsequent technology
infusion. MOSA can also help to provide interoperability, maintainability, and
compatibility when developing the support strategy and follow-on logistics planning for
sustainment. It can also enable continued access to cutting edge technologies and
products and prevent being locked into proprietary technology. Applying MOSA should
be considered as an integrated business and technical strategy when examining
alternatives to meet user needs. PMs should assess the feasibility of using widely
supported commercial interface standards in developing systems. Closely related to
MOSA is the Open System Architecture (OSA) approach to software development. This
concept, which relies upon the sharing of software code, can significantly enhance
affordability. For a detailed discussion of OSA, see the Open Systems Architecture
Guide. MOSA should be an integral part of
the overall acquisition strategy to enable rapid acquisition with demonstrated
technology, incremental and conventional development, interoperability, life-cycle
sustainment, and incremental system upgradeability without major redesign during initial
procurement and re-procurement.
5.3.2. Reliability
a focus will play a significant role in minimizing the necessary logistics footprint, while
maximizing system survivability and availability.
Reliability estimates evolve over time. Generally, the initial estimates are based on
parametric analyses and analogies with like or similar systems operating in the same
environment and adjusted via engineering analysis. As the design evolves and as
hardware is prototyped and developed, the engineering analysis becomes more
detailed. In addition to estimates and modeling, testing at the component, subsystem, or
system level may be necessary to assess or improve reliability. Approaches such as
accelerated life testing, environmental stress screening, and formal reliability
development/growth testing, should be considered and incorporated into program
planning as necessary. To assure the delivery of a system that will achieve the level of
reliability demanded in field use, a methodical approach to reliability assessment and
improvement should be a part of every well-engineered system development effort. The
Reliability Availability and Maintainability (RAM) Guidance provides a structure,
references, and resources to aid in implementing a sound strategy. It is crucial that the
reliability approach be planned to produce high confidence that the system has been
developed with some margin beyond the minimum (threshold) reliability. This will allow
for the inevitable unknowns that result in a decrease between the reliability observed
during development and that observed during operational testing and in-service. In
addition to reliability, the Reliability, Availability, Maintainability & Cost (RAM-C)
Rationale Report Manual provides guidance on how to develop and document realistic
sustainment Key Performance Parameter (KPP)/Key System Attribute (KSA)
requirements with their related supporting rationale; measure and test the requirements;
and manage the processes to ensure key stakeholders are involved when developing
the sustainment requirements.
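The idea of planning margin beyond the threshold can be expressed as simple arithmetic; the 20% margin and 0.8 development-to-operational retention factor below are illustrative assumptions, not policy values:

```python
# Sketch: setting a development reliability goal with margin above the user
# threshold, anticipating the typical drop between developmental and
# operational testing. The 20% margin and 0.8 retention factor are
# illustrative assumptions, not policy values.

threshold_mtbf = 250.0   # user (threshold) requirement, hours
retention = 0.8          # assumed fraction of DT reliability observed in OT
margin = 1.20            # assumed design margin above threshold

# MTBF to demonstrate in development so that, after the expected decrease,
# the fielded system still exceeds the threshold with margin:
dev_goal = threshold_mtbf * margin / retention
print(f"Development MTBF goal: {dev_goal:.0f} hours")  # 375 hours
```

The exact factors vary by system and should come from program data and the RAM-C analysis; the arithmetic only shows why the development goal must sit well above the threshold.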
5.3.3. Maintainability
Condition Based Maintenance Plus. When they can support materiel availability,
prognostics and diagnostics capabilities/technologies should be embedded within the
system when feasible (or off equipment if more cost-effective) to support condition
based maintenance and reduce scheduled and unscheduled maintenance. Health
management techniques can be very effective in providing maintainers with knowledge,
skill sets, and tools for timely maintenance and help reduce the logistics footprint.
Condition based maintenance plus (CBM+) (the application of technologies, processes,
and procedures to determine maintenance requirements based, in large part, on real-
time assessment of system condition obtained from embedded sensors), coupled with
reliability centered maintenance, can reduce maintenance requirements and system
down time. (CBM+ references include DoDI 4151.22, the CBM+ Guidebook
, and the CBM+ DAU Continuous Learning Module (CLL029) .) The goal is to perform
as much maintenance as possible based on tests and measurements or at pre-
determined trigger events. A trigger event can be physical evidence of an impending
failure provided by diagnostic or prognostic technology or inspection. An event can
also be operating hours completed, elapsed calendar days, or another periodically
occurring situation (i.e., classical scheduled maintenance). Key considerations in
implementing this concept include:
    •    Tools: integrated electronic technical manuals (IETMs) (i.e., digitized
         data); automatic identification technology (AIT); item-unique
         identification (IUID); portable maintenance aids (PMAs); embedded,
         data-based, interactive training.
    •    Functionality: low-ambiguity fault detection, isolation, and prediction;
         optimized maintenance requirements and reduced logistics support
         footprints; configuration management and asset visibility.
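The trigger-event logic described above can be sketched as a simple check; the sensor threshold, operating-hour limit, calendar interval, and field names are hypothetical:

```python
# Sketch: evaluating CBM+ maintenance trigger events. Sensor thresholds,
# hour/day limits, and parameter names are hypothetical.
from datetime import date

def maintenance_due(vibration_g, operating_hours, last_service):
    """Return the trigger reasons, if any, for a hypothetical component."""
    reasons = []
    if vibration_g > 3.5:                  # prognostic evidence of impending failure
        reasons.append("sensor threshold exceeded")
    if operating_hours >= 600:             # usage-based trigger
        reasons.append("operating-hour limit reached")
    if (date.today() - last_service).days >= 365:  # calendar-based (scheduled) trigger
        reasons.append("calendar interval elapsed")
    return reasons

print(maintenance_due(4.1, 480, date(2013, 1, 15)))
```

A real CBM+ implementation ties such rules to embedded sensor streams and maintenance information systems; the sketch only shows how condition-based and periodic triggers coexist in one decision.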
Program managers can minimize life-cycle cost while achieving readiness and
sustainability objectives through a variety of methods in the design of the system and its
maintenance / sustainment program. Below are technologies that should be considered
to improve maintenance agility and responsiveness, increase materiel availability, and
reduce the logistics footprint:
    •    Serialized Item Management (SIM). SIM ( DoDI 4151.19 ) can be used to aid
         asset visibility and the collection and analysis of failure and maintenance data.
         (Also see section 4.3.18.14 ) The SIM program should be structured to provide
         accurate and timely item-related data that is easy to create and use. While SIM is
         a DoD-wide initiative, the primary function of the program is to ensure the
         marking of the population of select items (parts, components, and end items) with
         a universal item unique identifier (IUID) ( DoDI 8320.04 ). IUID should be used on
         tangible property, including new equipment, major modifications, and re-
         procurement of equipment and spares. At a minimum, populations from the
         following categories should be considered for marking:
             o Repairable items down to and including sub-component repairable unit
                level;
             o Life limited, time controlled, or items with records (e.g., logbooks,
                equipment service records, Safety Critical Items); and
             o Items that require technical directive tracking at the part number level.
         technologies from which to choose, ranging from simple bar codes to radio
         frequency identification technology. In choosing the specific technology, the PM
         should consider that the technology will change over the life cycle both for the
         program and the supply chain management information systems using the
         information. Consequently, it is important that the PM take into account the need
         to plan for and implement an iterative technology refreshment strategy. In
         addition, since AIT is used by supply and maintenance management information
         systems, it is important that items selected for serialized item management be
         marked in conformance with MIL-STD-129.
    •    Need for special handling or supportability factors. This includes the need for
         special facilities or packaging, handling, storage, and transportation (PHS&T)
         considerations. This is usually driven by physical needs (e.g., size, weight,
         special materials) but can also include eliminating excessive set-up and teardown
         times or the inability to transport systems without disassembly and reassembly.
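A minimal screening of item populations against the marking categories above might look like the following; the field names and example items are hypothetical:

```python
# Sketch: screening item populations for IUID marking per the categories
# above (repairables, life-limited/record items, technical-directive-tracked
# items). Field names and example items are hypothetical.

def candidate_for_marking(item):
    return (item.get("repairable", False)
            or item.get("life_limited", False)
            or item.get("has_service_record", False)
            or item.get("td_tracked", False))

items = [
    {"name": "actuator", "repairable": True},
    {"name": "gasket"},                               # consumable, no criteria met
    {"name": "rotor_blade", "life_limited": True, "td_tracked": True},
]
print([i["name"] for i in items if candidate_for_marking(i)])  # ['actuator', 'rotor_blade']
```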
5.4.1. Developing the Support Concept and Establishing Requirements
Effective sustainment begins with the supportability analysis used to form CDD
specifications for each supportability parameter to be designed, developed, or procured
as proven commercial technology. It is these analysis-driven supportability parameter
specifications, once integrated through systems engineering with all other technical
parameters, that drive deployed system operational availability, sustainment
effectiveness, and operator ownership affordability. As discussed below, supportability
analyses establish supportability performance capability KPP/KSA parameters for
Sustainment in the Joint Capabilities Integration and Development System (JCIDS)
requirements documentation and are central to the systems engineering process of
identifying and refining all system technical performance capabilities.
are led by the sponsor and linked into the Life-Cycle Management System at each
phase and milestone.
The JCIDS Instruction ( CJCS Instruction 3170.01 ) and Manual require that key
considerations for sustainment be addressed early in the analysis as indicated below:
    •    A Key Performance Parameter for Sustainment has been mandated which treats
         logistics supportability as a performance capability inherent to the system's
         design and development.
    •    A Sustainment Key Performance Parameter (Materiel Availability) and two
         mandatory supporting KSAs (Materiel Reliability and Ownership Cost) are
         required for all JROC Interest programs involving materiel solutions.
    •    Logistics supportability becomes an inherent element of operational
         effectiveness.
    •    The Capability Development Document and Capability Production Document
         (CPD) must state the operational and support-related/sustainment performance
         attributes of a system that provides the desired capability required by the
         Warfighter -- attributes so significant that they must be verified by testing and
         evaluation.
    •    The DOTMLPF includes analysis of the entire life cycle, including
         sustainment; environment, safety, and occupational health (ESOH); and all
         Human Systems Integration (HSI) domains.
    •    The process to identify capability gaps and potential materiel and non-materiel
         solutions must be supported by a robust analytical process that objectively
         considers a range of operating, maintenance, sustainment, and acquisition
         approaches and incorporates innovative practices -- including best commercial
         practices, HSI, systems engineering (including safety and software engineering),
         collaborative environments, modeling and simulation, and electronic business
         solutions.
    •    The approaches identified should include the broadest possible range of joint
         possibilities for addressing the capability gaps. For each approach, the range of
         potential sustainment alternatives must be identified and evaluated as part of
         determining which approaches are viable.
    •    Duration of support.
The purpose of this phase is to assess potential materiel solutions and to develop a
Technology Development Strategy (TDS). This includes identifying and evaluating
affordable product support alternatives with their associated requirements to meet the
operational requirements and associated risks. Consequently, in describing the desired
performance to meet mission requirements, the sustainment metrics should be defined
in addition to the traditional performance design criteria (e.g., speed, lethality). This is
because reliability, reduced logistics footprint, and reduced system life-cycle cost are
most effectively achieved through inclusion from the beginning of a program and
therefore should be addressed in the AoA Plan.
Along with articulating the overall system operational effectiveness objective, this phase
is critical for establishing the overarching trade space available to the PM in subsequent
phases. User capabilities are examined against technologies, both mature and
immature, to determine feasibility and alternatives to fill user needs. Once the
requirements have been identified, a gap analysis should be performed to determine the
additional capabilities required to implement the support concept and its drivers within
the trade space.
5.4.1.3. Activities/Processes
During this phase, various alternatives are analyzed to select the materiel solution and
develop the TDS to fill any technology gaps. Key activities involve identifying and
evaluating alternatives and their system sustainment and product support implications.
This process is critical because the resulting details guide the acquisition community on
refining the concept selected and identifying potential operating and support resource
constraints.
functional sustainment performance and associated life-cycle cost analysis expertise to
help ensure the AoA assesses the ability of each materiel alternative candidate to meet
and sustain the system's JCIDS performance sustainment capability parameters. It is
important that the analysis of alternatives includes alternative maintenance and
sustainment concepts consistent with the physical and operational environment of the
proposed system. Specific consideration should be given to the associated performance
metrics to achieve the required effectiveness goals and the overall ability to accomplish
a mission, including the ability to sustain the system. Consequently, during this phase
the focus is on determining the system level sustainment metrics and values that
provide the balance between mission effectiveness, LCC, logistics footprint, and risk
that best represents Warfighter needs. This needs to be done for each system
alternative analyzed and for their associated sustainment and maintenance strategies.
The strategies must then be broken down to their respective drivers to determine the
gaps between what is needed to achieve the mission capability and what is currently
achievable. The drivers then become performance-based metrics for sustainment
enablers. The gaps indicate risk areas and become candidates for potential technology
development initiatives. Since operational suitability is the degree to which a system can
be used and sustained satisfactorily in the field (in wartime and peacetime), consideration
should be given to reliability, availability, maintainability, compatibility, transportability,
interoperability, sustainment, documentation, and all the HSI domains (Manpower,
Personnel, Training, HFE, Environment, Safety, Occupational Health, Survivability, and
Habitability).
Data collected and analyzed during the analysis of alternatives should be retained
because it can be useful for subsequent performance-based product support analysis
including providing the baseline for logistics footprint and other factors affecting the in-
theater operations concept. (See section 3.3.3 ) As a result, the sustainment related
data should be maintained in a manner to make it easy to update program deliverables
during subsequent phases, especially prior to milestone decisions.
The focus should be on ensuring the metrics are traceable to the ICD, CDD, other
JCIDS analysis, or agreement with the user community on the values for each metric
and on documented analyses. The analyses should use the most appropriate data
sources and include comparisons of corresponding values for analogous existing
systems. Where there is a wide difference between values being achieved by today's
systems and those needed for the projected environment, further analysis should be
done to determine the enabler technologies (e.g., diagnostics, prognostics) required to
achieve the sustainment metrics. The analysis should identify the corresponding
performance requirements for key enabling technologies. The results should be
included in the TDS and Draft CDD.
Many of the actions and subsequent results in this phase are reviewed during technical
reviews. The actions and results discussed in this section should be accomplished even
if the specific referenced reviews do not occur. The actions and results are tied to the
reviews to reflect the relative timeframe in which the actions should be accomplished.
The ASR helps ensure the preferred system and product support solution satisfies the
Initial Capabilities Document. Generally, the review assesses the evaluated alternative
systems to ensure that at least one of the alternatives has the potential to be cost
effective, affordable, operationally effective and suitable, and can be developed to
provide a timely solution at an acceptable level of risk. See section 4.2.9 for additional
information on how the ASR ensures the requirements agree with the customer's needs
and expectations.
For this review to be fully effective, the support concept should be addressed as an
integral part of the system concept. During the review, the system concept should be
assessed with particular attention to understanding the driving requirements for
reliability, availability, maintainability, down time, life-cycle costs, and the enabling
technologies required to meet user requirements. Completion of the ASR should
provide:
The focus of this phase is on identifying the initial concept and any critical product
support capability requirements. Affordable operational effectiveness is the overarching
sustainment objective that should be considered during the JCIDS process.
Implementing the process contained in figure 5.4.1.3.1.F1 results in the preferred
system concept and the planning to mature the enabling technologies. The conclusion
of this phase produces the initial acquisition strategy (including the sustainment
strategy) and the contractual documents required to continue into the Technology
Development Phase, and includes the initial support & maintenance concepts as well as
LCC and manpower estimates for the system concept.
Table 5.4.1.4.T1 identifies the most critical documents that should incorporate or
address sustainment/logistics considerations. Entry documents should be complete
when the phase is initiated and include the specific product support issues to be
addressed in the phase along with a notional Maintenance & Sustainment Concept of
Operations (CONOPS) consistent with the projected Operational CONOPS. Exit
documents are completed or, in the case of the Maintenance & Sustainment CONOPS,
updated based on the analysis of alternatives results. The key sustainment elements to
be addressed in the next phase should be included in the Acquisition Strategy, the
Technology Development Phase RFP, and Source Selection Plan.
             Entry Documents:
             Initial Capabilities Document
             Analysis of Alternatives Plan
             Alternative Maintenance & Sustainment Concept of Operations
             Exit Documents:
             Analysis of Alternatives (including Market Research results)
             Draft Capability Development Document
             Test and Evaluation Strategy
             Technology Development Strategy
             SEP
             Life-Cycle Sustainment Plan
The Analysis of Alternatives Report should describe the alternative maintenance and
sustainment concepts for each alternative analyzed, along with the support capability
drivers and any gaps.
The exit documents should contain the following sustainment related information for the
preferred system concept:
Life-Cycle Sustainment Plan - In this phase, and in preparing for MS-A, the LCSP focuses
on the approach for developing the sustainment metrics and product support strategy.
Emphasis is on the:
During this phase, support considerations should address the degree to which a
system's design and planned logistics resources support its readiness requirements and
wartime utilization. This includes consideration of activities and resources (such as fuel)
necessary for system operation as well as real world constraints and environment. It
also includes all resources that contribute to the overall support cost (e.g., personnel;
equipment; technical support data; and maintenance procedures to facilitate the
detection, isolation, and timely repair/replacement of system anomalies).
Modeling and simulation combined with LCC analysis are critical best practices and
should be included in the AoA Plan. In addition, both should be used as source
selection factors in the Technology Development Phase selection process and to define
the desired ranges for the sustainment metrics thresholds and objectives.
During this phase, both acquisition and O&S costs need to be considered in evaluating
affordable alternatives. Also during this phase, key sustainment related cost
performance criteria, such as O&S cost per operating hour or cost per ton-mile, can be
considered when conducting design trade-off analyses.
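As a toy illustration of the cost-per-operating-hour and cost-per-ton-mile criteria mentioned above, the sketch below compares two notional design alternatives. All figures, names, and function signatures are invented for illustration; they are not DAG-prescribed formulas.

```python
# Hypothetical design trade-off comparison using sustainment-related
# cost performance criteria. All dollar and usage figures are notional.

def cost_per_operating_hour(annual_os_cost, annual_operating_hours):
    """Annual O&S cost divided by annual operating hours."""
    return annual_os_cost / annual_operating_hours

def cost_per_ton_mile(annual_os_cost, tons_moved, miles_traveled):
    """Annual O&S cost divided by total ton-miles delivered."""
    return annual_os_cost / (tons_moved * miles_traveled)

# Two notional alternatives evaluated on the same criteria.
alternatives = {
    "Alt A": {"os_cost": 12_000_000, "hours": 4_000, "tons": 150, "miles": 2_000},
    "Alt B": {"os_cost": 10_500_000, "hours": 3_500, "tons": 140, "miles": 2_000},
}

for name, a in alternatives.items():
    cpoh = cost_per_operating_hour(a["os_cost"], a["hours"])
    cptm = cost_per_ton_mile(a["os_cost"], a["tons"], a["miles"])
    print(f"{name}: ${cpoh:,.0f}/operating hour, ${cptm:,.2f}/ton-mile")
```

Expressing each alternative on a common per-unit-of-output basis is what makes the trade-off analysis comparable across designs with different usage profiles.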
During this phase M&S supports the requirements determination efforts by analyzing
the impact of various alternatives to determine an achievable range of the sustainment
metrics values to meet the functional requirements. M&S should be used to assess the
alternatives, ensuring all sustainment metrics are considered in parallel and not at the
expense of the others. In addition, sensitivity analyses should be used to determine the:
    •    Optimum mix of key metric values (e.g., LCC and readiness drivers) required to
         meet the requirements and identify corresponding confidence levels for each of
         the alternatives
    •    Impact on sustainment, LCC, and readiness drivers if the supply chain performs
         at today's performance levels.
    •    Associated sustainment/maintenance concepts for each of the alternatives to be
         used as the baseline in subsequent phases
Combining these factors will help identify specific areas where new technology is
required to achieve the requirements, or to reduce risks and increase the probability of
achieving them.
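The sensitivity analyses described above can be sketched as a small Monte Carlo exercise: vary a readiness driver (here MTBF) and supply chain response time under uncertainty, and report a percentile band on operational availability. This is a minimal illustration only; the availability formula is the standard steady-state approximation, and every parameter value is hypothetical.

```python
import random

# Toy Monte Carlo sensitivity sketch: how a readiness driver (MTBF) and
# supply chain delay together shape operational availability (Ao).
# All parameter values below are invented for illustration.

def availability(mtbf, mttr, logistics_delay):
    """Steady-state operational availability: uptime / (uptime + downtime)."""
    return mtbf / (mtbf + mttr + logistics_delay)

def simulate(mtbf_mean, delay_mean, trials=10_000, seed=1):
    """Return (10th percentile, median, 90th percentile) of Ao."""
    random.seed(seed)
    results = []
    for _ in range(trials):
        mtbf = random.gauss(mtbf_mean, mtbf_mean * 0.10)   # +/-10% uncertainty
        delay = random.expovariate(1.0 / delay_mean)        # variable supply delay
        results.append(availability(max(mtbf, 1.0), 4.0, delay))
    results.sort()
    n = len(results)
    return results[n // 10], results[n // 2], results[9 * n // 10]

# Compare today's supply chain performance against an improved one.
for label, delay in [("current supply chain", 24.0), ("improved supply chain", 8.0)]:
    p10, p50, p90 = simulate(mtbf_mean=200.0, delay_mean=delay)
    print(f"{label}: Ao p10={p10:.3f} median={p50:.3f} p90={p90:.3f}")
```

Reporting the percentile band rather than a point value is one way to attach the "corresponding confidence levels" called for above to each alternative.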
5.4.2.1. Overview
5.4.2.2. Activities/Processes
5.4.2.1. Overview
The purpose of this phase is to reduce technology risks (including required sustainment
technologies to achieve the needed materiel availability) and determine the
technologies to be integrated into the system. The focus is on developing the
preliminary design (down to the subsystem/equipment level), reducing integration and
manufacturing risk, and, from a sustainment perspective:
This phase is the most critical for optimizing system sustainment through designed-in
criteria to help ensure sustainability. Particular attention should be paid to reducing the
logistics footprint, implementing human systems integration, and designing for support
to help ensure life-cycle affordability. Also, during this phase detailed plans for
organizing to manage the implementation of the product support package should begin.
The support concept should be defined going into this phase. The phase should be
used to define the design-to requirements and to design the product support package.
Technology demonstrations and prototyping should be conducted to help determine
mature, affordable technologies to be included in the system and support system
designs. The demonstration results, coupled with analysis, should be used to refine
requirements and the LCC estimate, narrow the ranges of all program metrics, and
increase confidence that the values can be met at an affordable cost.
5.4.2.2. Activities/Processes
    •    Perform a market analysis (both public and private) for the needed system and
         product support capabilities to fill the gaps. The analysis should address the
         extent and scope of opportunities for using commercial items and processes. It
         should consider and assess the:
            o Elements of support currently provided (for any legacy systems to be
                replaced).
            o Current measures used to evaluate support effectiveness.
            o Current effectiveness of required support.
            o Existing support data across the logistics support elements.
            o Existing technologies and associated support that impact the new system.
    •    Develop the functional characteristics and performance specification of the
         system and its support system based on the best balance between mission
         performance, life-cycle cost, logistics infrastructure and footprint, and risk. An
         analysis should be conducted to identify key performance and related support
         parameters for inclusion in the CDD. The analysis should form the basis of
         design requirements for subsequent phases and will affect the KPPs/KSAs and
         the overall capability of the system to perform and endure in the required mission
         environment. ROM LCC estimates should be developed and included in the
         analysis results based on the following key elements:
            o Preliminary manpower and personnel requirements estimates. This should
                also include an assessment of any constraints in both quantity and skill
                levels and the use of contractor support.
            o Operational effectiveness, reliability, maintainability, supportability and
                interoperability drivers. This should include embedded and external
                diagnostics, prognostics, and other maintenance enabler technologies that
                will be required based on suitably mature new design technology. In
                identifying the drivers and their threshold and objective values,
                performance histories of similar systems should be examined to determine
                the feasibility/risks of achieving the required levels and to develop a risk
                mitigation plan. If one has to be developed, the corresponding benefits
                and resource requirements for each of the drivers should be identified.
            o Logistics footprint metric estimates, deployment requirements, and other
                factors affecting the in-theater operational concept. This should include
                the elements the program will be responsible for and the supply chain
                performance requirements upon which the program will rely to meet
                operational effectiveness objectives.
Depot Maintenance: During this phase, the following actions are required:
    •    Program the resources for the technical data, facilitation, and equipment
         requirements.
    •    Summarize the results of these actions in the Acquisition Strategy submitted for
         Milestone B approval.
During this phase, and in preparing for MS-B, the program focuses on finalizing the
sustainment metrics, integrating sustainment requirements into the design, expanding
on the sustainment strategy and maintenance concept, and developing an execution
plan describing the design, acquisition, fielding, and competition of sustainment
activities to deliver the product support package. The LCSP documents the
maintenance & support concepts
based on the results of any technology demonstrations and analyses performed to date.
It should describe the envisioned sustainment capabilities as viewed by the user and
major support providers (e.g., the maintainer, supplier and transportation providers).
Taking into account the real world constraints and limitations (including "core"
requirements, statutory requirements, etc.), it should include the:
The maintenance & sustainment strategy should be refined from the projected system
reliability and the preliminary sustainment concept of operations to meet the operational
requirement in the planned environment. These are then used to determine the supply
chain performance requirements, along with the key enabling features needed to
implement the strategy. These enablers can range from system design features (e.g.
condition based maintenance) to supply chain features (e.g., rapid distribution of
tailored support packages, just in time training / distance support, total asset visibility
anywhere in the support chain, dedicated rapid response support teams analyzing real
time data). The enablers should be described in sufficient detail to provide assurance
that risks are understood and the gaps can be filled.
Core logistics and repair sources are critical elements in establishing appropriate repair
and support capability. New and emerging systems may lack mature data at this stage,
but by using data from similar current systems and subsystems, planning for a
sustainment strategy can evolve. Key activities should include establishing the baseline
for trade studies by identifying notional maintenance levels and activities for major
subsystems, taking into account system/subsystems with a core capability.
The gaps between the current state of the art and current sustainment/maintenance
capabilities versus what is required (along with the risk) should be used to identify
technologies needing to be developed and demonstrated in subsequent phases. They
should also be used in developing the implementation plan for proceeding with the best
value alternative and summarized in the LCSP. The following are key considerations in
developing the performance/cost/schedule/sustainment and risk tradeoff analysis:
Core Capability Planning and Analysis. The requirement for determining core
requirements and applying this methodology extends to all weapon systems and
equipment operated by each DoD Component, regardless of where depot-level
maintenance is actually performed ( DoDI 4151.20 , "Depot Maintenance Core
Capabilities Determination Process"). The following depot maintenance core capability
requirements determination methodology is used to determine essential DoD depot
maintenance capability requirements for each DoD Component, and the workloads
needed to sustain those capabilities.
         workload differences.
             o If not, will the system be used or is it planned to be used in support of a
                JCS contingency scenario? If the answer is "Yes" then Section 2464 of
                Title 10, United States Code requirements for the establishment of organic
                core capability apply.
             o If the answer to either question is "yes", an initial core capability
                requirement determination analysis must be conducted and candidate
                Depot Source of Repair (DSOR) depot-level maintenance facilities
                identified by the DoD Component(s).
    •    After core requirements have been determined, the PM/JPO shall take
         appropriate steps to assure that the requirements for the establishment of
         organic capability are included in all product support acquisition requirements
         (e.g. need for tech data, peculiar support equipment, facilities and/or Public
         Private Partnership).
    •    While not part of the core determination process, it is at this stage that any
         requirement to assign a portion of the proposed workload to an organic depot
         (to provide reasonable assurance of future compliance with the 50/50
         requirement) should be identified by the DoD Component(s) MMOs and provided
         to the PM/JPO, along with justification and documentation, for use in designing
         the product support strategy.
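The 50/50 requirement referenced above (10 U.S.C. 2466) limits the share of a Component's annual depot-level maintenance funding that may be obligated for contract performance to 50 percent. A minimal sketch of that bookkeeping follows; the funding figures and function names are hypothetical, for illustration only.

```python
# Hypothetical 50/50 compliance check: no more than 50 percent of depot
# maintenance funding may go to contract (non-organic) performance.
# The workload split below is invented, not from any real program.

def contract_share(organic_funds, contract_funds):
    """Fraction of total depot maintenance funding performed by contract."""
    total = organic_funds + contract_funds
    return contract_funds / total if total else 0.0

def compliant_50_50(organic_funds, contract_funds):
    """True if the contract share does not exceed the 50 percent ceiling."""
    return contract_share(organic_funds, contract_funds) <= 0.50

planned = {"organic": 420.0, "contract": 380.0}  # $M, notional fiscal year
share = contract_share(planned["organic"], planned["contract"])
print(f"contract share: {share:.1%}, "
      f"compliant: {compliant_50_50(planned['organic'], planned['contract'])}")
```

Running the check against the planned workload split early in the phase is what gives "reasonable assurance of future compliance" before source-of-repair decisions are locked in.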
Many of the actions and subsequent results in this phase are reviewed during technical
reviews. The actions and results discussed in this section should be accomplished even
if the specific referenced reviews do not occur. The actions and results are tied to the
reviews to reflect the relative timeframe in which they should be accomplished.
The SRR is conducted to ascertain the results of the prototyping and demonstrations
relative to the system technical requirements. It determines the direction and progress
of the systems engineering effort and the degree of convergence upon a balanced and
complete system baseline. (See section 4.2.10 for additional information.) The purpose
is to ensure all system requirements (performance requirements and sustainment
requirements) derived from the Initial Capabilities Document (ICD) or draft CDD are
defined, consistent and achievable within cost, schedule and any other constraints.
Generally the SRR assesses the prototyping results with respect to the system
requirements captured in the system specification and support strategy to ensure they
are consistent with the system and support solution as well as available technologies.
A successful review is predicated on determining that the system and support element requirements are based on
available technology, a sustainable support concept, and program resources (e.g.,
funding, staffing, processes and schedule). Logistics and product support subject matter
experts should participate to ensure the critical sustainment system and support
element enabler technologies required to implement the support strategy and achieve
the needed materiel availability are included in the planning and performance
specifications. Understanding and accepting the program risk inherent in the system
specification, Systems Engineering Plan and Life-Cycle Sustainment Plan is key to a
successful review. The SRR should provide:
The SFR ensures the system functional baseline has a reasonable expectation of
satisfying the CDD requirements within the allocated budget and schedule. A critical
SFR aspect is the development of representative operational and product support use
cases for the system. System performance and the anticipated functional requirements
for operations, maintenance, and sustainment are assigned to sub-systems and to
support system hardware and software after analysis of the operational and support
environments. The SFR determines whether the system's functional definition is fully
decomposed to its lowest level, forming the functional baseline, and that IPTs are
prepared to start preliminary design. Additional information for this review can be found
in section 4.2.11 .
         adequate processes for achieving the sustainment metrics are in place.
    •    Defining the detailed support concept functionality requirements for system and
         subsystem elements to ensure system functional requirements are sufficiently
         detailed and understood to enable system design supportability analyses to
         proceed.
    •    Ensuring program sustainment development efforts (including system and
         software critical path drivers), with corresponding schedules, are included in
         LCSP updates.
    •    Ensuring the updated Cost Analysis Requirements Description (CARD) (or a
         CARD-like document) based on the system functional baseline, captures the key
         program sustainment cost drivers, development costs, production costs, and
         operation & support costs for all aspects of sustainment and human system
         integration.
The PDR helps ensure the system's allocated baseline and its associated support
system have a reasonable expectation of satisfying the CDD requirements within the
allocated budget, staffing, and schedule and have an acceptable risk level. Details can
be found in section 4.2.12 but in summary the PDR assesses the preliminary design
captured in the preliminary subsystem product specifications for each configuration item
(hardware and software) and ensures each function, in the functional baseline, has
been allocated to one or more system configuration items. The PDR evaluates the
subsystem requirements to determine whether they correctly implement all system
requirements allocated to the subsystem. The Integrated Product Team (IPT) should
review the results of peer reviews of requirements, preliminary design documentation
(including Interface Control Documents) along with the plans for development and
testing for both system performance and supportability aspects to ensure the system is
ready to proceed into detailed design and test procedure development.
    •    Addressing the supportability requirements to support the CDD and ensuring the
         supportability functionality is allocated to each system or subsystem and
         can be achieved within the budget and schedule. This includes ensuring the
         Failure Mode Effects and Criticality Analysis, Maintainability Analysis, and
         Reliability Centered Maintenance Analysis results have been factored into the
         allocated requirements, preliminary design, and risk assessment. In addition to
         ensuring adequate processes for achieving the sustainment metrics are in place,
         this includes ensuring the HSI design factors have been reviewed and included in
         the overall system design.
    •    Setting the allocated baseline for any system and/or major subsystem product
         support package elements. This includes defining the detailed support concept
         functionality requirements for subsystem product support package elements to
         ensure system functional requirements are sufficiently detailed and understood to
         enable more detailed supportability analyses to proceed.
    •    Defining the test success criteria for development testing and operational testing
         (for both operationally effective and suitable) requirements and the general test
         approach for key sustainment enablers or drivers.
    •    Ensuring program sustainment development efforts (including system and
         software critical path drivers with corresponding schedules) are included in LCSP
         updates.
    •    Ensuring the updated Cost Analysis Requirements Description (CARD) (or a
         CARD-like document) based on the system allocated baseline, captures the key
         program sustainment cost drivers, development costs, production costs, and
         operation & support costs for all aspects of sustainment and Human Systems
         Integration (HSI).
The TRA is a metrics-based process that assesses the maturity of critical technology
elements, including sustainment drivers; it is conducted concurrently with technical
reviews. From a sustainment perspective, the process should be used for assessing risk
and the adequacy of technology maturation planning when the support concept or
sustainment drivers depend on specific new or novel technologies to meet system
threshold requirements in development, production, or operation. If a key enabler or
sustainment driver (e.g., reliability, turnaround time) does not meet required
performance levels, or significant performance advances are required over what is
currently achieved with existing technology, then a plan for maturing the critical
technology should be developed, explaining in detail how the performance level will be
reached within the program's schedule and resources. See section 10.5.2 for additional
information.
IBRs are used throughout the program whenever earned value management is used.
IBRs establish a mutual understanding of the project performance measurement
baseline. While they have a business focus, IBRs can also be useful in ensuring
sustainment is considered in the acquisition process when the efforts required to
achieve the Sustainment KPP, KSAs and any other key sustainment enabler metrics
are included in the reviews. These reviews and resultant understanding also provide for
a plan of action to evaluate the risks inherent in the program measurement baseline and
the management processes during project execution. Additional information can be
found in section 11.3.1.3 .
The focus of this phase is on reducing risk and defining achievable performance and
sustainment requirements. This begins with the analysis of alternatives that include
examining alternative operating and system support concepts, with specific
consideration of performance-based requirements. Success is demonstrated by
identifying key performance and related sustainment metrics (with their basis) as design
requirements that affect the overall capability of the system to perform and endure in the
required mission environment. (In addition to the Sustainment KPP/KSAs, the metrics
can include other supportability, maintainability, interoperability, manpower or footprint
measures.) Implementing the process contained in figure 5.4.2.2.F1 produces the
refined supportability objectives and, in some cases, anticipated constraints based on
the technology assessments. The conclusion of this phase results in the contractual
documents required to continue (including the related sustainment requirements and
actions) and updated system baseline support & maintenance concepts, LCC, and
manpower estimates.
Table 5.4.2.3.T1 identifies the most critical documents that should incorporate or
address sustainment/logistics considerations. The key sustainment elements to be
addressed in the next phase should be included in the Acquisition Strategy and the
materiel availability enabler requirements should be included in the Engineering and
Manufacturing System Development RFP as well as the Source Selection Plan. The exit
documents from this phase should focus on the materiel availability driver metrics
(including drivers for the enablers) and the baseline support strategy. They should also
contain the following sustainment related information:
             Entry Documents:
             Analysis of Alternatives
             Technology Development Strategy
             Draft Capability Development Document (including sustainment
             technology issues)
             Test and Evaluation Strategy
             Life-Cycle Sustainment Plan
             Exit Documents:
             Analysis of Alternatives (including Market Research results)
             System Performance Specification
             Capability Development Document
             Preliminary Design Review Results
             Test and Evaluation Master Plan (TEMP)
             Information Support Plan
             Acquisition Strategy
             Cooperative Opportunities
             Technical Data Rights Strategy
             Core Logistics Analysis/Source of Repair Analysis
             Industrial Capabilities
             Life-Cycle Sustainment Plan
             Life-Cycle Cost Estimate and Manpower Estimate
             Preliminary Maintenance Plans
             Acquisition Program Baseline (APB)
             Affordability Assessment (including DoD Component Cost Analysis &
             ICE)
    •    AoA - the sustainment driver metrics and product support strategies for each
         alternative considered along with any gaps and major assumptions
    •    System Performance Specification - objectives and thresholds for the
         sustainment driver metrics including the corresponding enabler drivers
    •    CDD - the information necessary to deliver an affordable and supportable
         capability using mature technology. The following sustainment drivers information
         should be included:
             o System maintenance/support profiles and use case scenarios
             o The corresponding support and maintenance effectiveness measures
             o Description of the specific capabilities required to achieve the support
                 concept and/or to reduce risks in achieving the values required to meet
                 the operational requirements. It should include metrics for each of the key
                 enabling technologies (e.g., reliability/ maintenance rates,
                 diagnostics/prognostics effectiveness measures)
    •    Preliminary Design Review Results - the description and status of the
         sustainment driver design features
    •    Technology Readiness Assessment - approach for achieving the required
         enabling sustainment technologies (including design criteria for each of the
         drivers in the preliminary system design specification) (see section 10.5.2 )
    •    Test and Evaluation Master Plan (TEMP) - identification of the metrics and
         enabling/driver technologies to be evaluated in subsequent phases, the approach
         for evaluating them, and test points (see section 9.5.5 )
    •    Data Management Strategy - the long term strategy integrating data
         requirements across all functional disciplines
    •    Information Support Plan - the approach for acquiring and managing the data
         required to execute the support concept in the operational environment (see
         section 7.3.6 )
    •    Acquisition Strategy - containing the LCSP executive summary
    •    Life-Cycle Sustainment Plan (LCSP) - summary of the maintenance &
         sustainment concepts including the support locations and duration. It should
         focus on the support strategy, including the contracting strategy to acquire the
         major elements of the support concept and the specific incentives being used to
         help achieve the sustainment drivers and enablers
    •    Life-Cycle Cost Estimate and Manpower Estimate - the major assumptions and
         values being used for the sustainment drivers and enablers (see Chapters 3 and
         6 ). It should also include the confidence level of the values being achieved
    •    Acquisition Program Baseline (APB) - description of the sustainment metrics,
         criteria, and logistics funding requirements (see section 10.9 )
    •    Affordability Assessment - an assessment based on the likelihood of the key
         sustainment metrics being achieved (also see section 3.2.2 )
During this phase, the focus should be on refining the threshold and objective range
value estimate for each sustainment metric based on more detailed analysis identifying
the technical capabilities, risks, and limitations of the alternative concepts and design
options. Analysis should also be performed to identify the impacts the sustainment
metrics will have on mission success and materiel availability. The key enabling
requirements to achieve the sustainment metrics should be allocated to the major
system level and included in the system specification. Even this early, it is important to
establish the reliability requirements and assess the extent to which the system will
likely meet the requirements. Consequently, the reliability of the technology or system
should be included in the technology readiness assessments.
Detailed plans for monitoring, collecting and validating key metrics should be
established to provide empirical data to evaluate technical performance, system
maturity, and the projected logistics burden. Detailed test criteria should be developed
for each metric (including any key dependent enabling technologies) to provide
information about risk and risk mitigation as the development and testing continue. The
test strategy/requirements to provide data and analysis support to the decision process
should be documented in the TEMP.
M&S combined with LCC analysis are important best practices to help assess the
success in reducing program risk. In addition, both should be used in the Engineering &
Manufacturing Development Phase source selection process and to define the
sustainment objectives and thresholds to be placed on contract. The data used for the
assessments and analysis (including the projected sustainment demand) should be
compiled and saved for analyses in subsequent phases.
The analyses are iterative, evolving and expanding as more specific design and other
technical information on the actual equipment is identified. While the focus is high level
for the system at the beginning of this phase, it should also consider requirements for
key enablers in terms of "what is required" vice "how it is accomplished". As the phase
progresses, the analysis should determine the relative cost vs. benefits of different
support strategies (including potential source of support decisions). The impact and
value of performance/cost/schedule/sustainment trade-offs based on the preliminary
design should continue expanding to the lowest level of the work breakdown structure
as the design evolves across this and subsequent life-cycle phases.
A complete supportability analysis should be performed for any parts of the system for
which the government is going to provide the product support package vice using a
contracted approach with materiel availability as the performance measure. Figure
5.4.2.5.1.F1 shows the key system reliability, maintainability and supportability system
engineering processes. The affordable system operational effectiveness analysis
process, coupled with available tools and opportunities - such as modeling and
simulation, performance testing, supportability testing/demonstration, technical data
validation, and maintenance assessments - should be proactively applied and
integrated with the systems engineering process. For example, system requirements
can be used to develop a system reliability/availability block diagram as a basis for
modeling and analysis. This approach can identify opportunities for targeted system
redundancy, ease of reconfiguration, and derating, etc., and can thereby enhance
system level reliability and availability. In addition, reliability, maintainability
(BIT/prognostics), and supportability/ logistics demonstrations can provide the data to
assess achievement of RAM requirements.
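The reliability/availability block diagram modeling described above can be sketched in a few lines. This is a minimal illustration, not a program tool; the component names and availability values are hypothetical placeholders:

```python
# Minimal reliability block diagram (RBD) evaluation sketch.
# Component availabilities are illustrative placeholders, not program data.

def series(*avails):
    """Availability of blocks in series: all must be up."""
    result = 1.0
    for a in avails:
        result *= a
    return result

def parallel(*avails):
    """Availability of redundant blocks: at least one must be up."""
    unavail = 1.0
    for a in avails:
        unavail *= (1.0 - a)
    return 1.0 - unavail

# Hypothetical example: a sensor in series with a dual-redundant power
# supply and a processor. Redundancy lifts the power branch from 0.95
# to 0.9975, illustrating targeted redundancy as a design lever.
sensor, power, processor = 0.98, 0.95, 0.99
system_availability = series(sensor, parallel(power, power), processor)
print(f"Projected system availability: {system_availability:.4f}")
```

Modeling candidate architectures this way makes it easy to see where redundancy, reconfiguration, or derating buys the most system-level availability.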
The level of detail performed by the government team will vary with the extent to which
a performance-based product support contract is used, but this does not change the
general process, including the program's major events. As a result, the supportability
analysis
process should take advantage of, and be an integral part of, the major engineering events
and processes, including but not limited to the System Requirements Review (SRR) and
Preliminary Design Review (PDR).
As illustrated in Figure 5.4.2.5.1.F1, a FMECA helps identify the ways in which systems
can fail and the performance consequences, and serves as a basis for identifying Critical
Safety Items as well as potential areas for preventive maintenance for the system.
When conducted in a timely fashion, the FMECA can be used to support trade-offs
between performance and life-cycle costs to drive design improvements. A Fault Tree
Analysis (FTA) assesses the safety-critical functions within the systems architecture and
design. A Maintainability Analysis and Prediction (MAP) assesses the maintenance
aspects of the systems architecture, including maintenance times and resources. This
analysis identifies strategic opportunities for focused diagnostics, prognostics, and
performance monitoring/fault localization, leading to reduced system maintenance times
and cost drivers. A level of repair analysis optimally allocates maintenance functions for
maximum affordability and materiel availability.
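One common (though not the only) convention for prioritizing FMECA results is a Risk Priority Number (RPN), the product of severity, occurrence, and detection ratings. The sketch below uses hypothetical failure modes and 1-10 ratings purely for illustration:

```python
# Illustrative FMECA ranking sketch using a Risk Priority Number (RPN).
# RPN = severity x occurrence x detection (each rated 1-10) is one common
# convention; the failure modes and ratings below are hypothetical.

failure_modes = [
    # (failure mode, severity, occurrence, detection)
    ("Hydraulic seal leak",      7, 5, 4),
    ("Connector corrosion",      5, 6, 3),
    ("Circuit card overheating", 9, 3, 6),
    ("Bearing wear-out",         6, 4, 2),
]

def rpn(severity, occurrence, detection):
    return severity * occurrence * detection

# Highest-RPN modes are the first candidates for design change or
# focused preventive maintenance.
ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
for mode, s, o, d in ranked:
    print(f"{mode:26s} RPN = {rpn(s, o, d)}")
```

Ranking failure modes this way supports the trade-offs between performance and life-cycle cost noted above, since the top-ranked modes point to where design improvement buys the most risk reduction.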
Once FMECA, FTA, and MAP are completed and system design has been established,
RCM analysis develops a focused, cost-effective system preventive maintenance
program. RCM uses a system based methodical approach to determine causes of
failure and failure consequences, and applies a logic tree analysis to identify the most
applicable and effective maintenance task(s) to prevent failure, where possible. RCM
also provides decision rules for condition-based maintenance, so that maintenance is
performed only upon evidence of need. (DoDI 4151.22, Enclosure 3)
The activities shown in Figure 5.4.2.5.1.F1 are not necessarily carried out in a linear
progression. Design increments and the continuous assessment of test results and in-
service system performance will identify needs for system improvements to enhance
reliability and maintainability, or to overcome obsolescence, corrosion, or other
sustainment issues.
Once the preferred system, system support concepts and enabling technologies are
selected, case scenarios reflecting system support, maintenance, and logistics are
refined. These scenarios identify significant system support, maintenance, and logistic
requirements and objectives. These are compared to the Sustainment KPP/KSA
threshold and objective and expanded in depth as the hardware design matures and the
process is iterated until an affordable, operationally effective system solution is achieved.
             Technology Maturity    Technology Description
             -------------------    ------------------------------------------------
             Low Risk               Existing Mature Technologies
             Medium Risk            Maturing Technologies; New Applications of
                                    Mature Technologies
             High Risk              Immature Technologies; New Combinations of
                                    Maturing Technologies
M&S should be used to refine sustainment objectives (this includes the Sustainment
KPP and KSAs as well as any other LCC or readiness driver metrics) and identify any
constraints based on technology assessments. The technology demonstration results
should be modeled to project likely capabilities and the associated confidence levels
that enabling technologies will be achievable in the operational environment. M&S
should also be used to develop the initial/notional system-level product sustainment
strategy and maintenance concepts for major subsystems. All of these elements will be
used to project the mature Sustainment KPP/KSA values and the associated confidence
levels that they will be met within the CONOPS.
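A confidence-level projection of this kind is often done by Monte Carlo simulation. The sketch below is a minimal illustration; the materiel availability threshold and the MTBF/MDT uncertainty distributions are assumed values, not demonstrated program data:

```python
# Sketch: Monte Carlo projection of the confidence that a materiel
# availability (Am) threshold will be met. The MTBF/MDT distributions
# below are illustrative assumptions, not demonstrated program values.
import random

random.seed(1)
THRESHOLD = 0.90          # hypothetical Sustainment KPP threshold
TRIALS = 10_000

def simulate_availability():
    # Uncertainty in demonstrated reliability and mean downtime
    mtbf = random.gauss(500.0, 60.0)   # hours between failures
    mdt = random.gauss(40.0, 10.0)     # hours of downtime per failure
    mdt = max(mdt, 1.0)                # guard against nonphysical draws
    return mtbf / (mtbf + mdt)

successes = sum(simulate_availability() >= THRESHOLD for _ in range(TRIALS))
confidence = successes / TRIALS
print(f"Confidence of meeting Am >= {THRESHOLD}: {confidence:.1%}")
```

As technology demonstration results accumulate, the assumed distributions are replaced with empirically derived ones and the projected confidence level is refined.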
As the design evolves, modeling and simulation can be used to help keep the product
support elements in balance between and within system hardware elements. This is
done by allocating the sustainment, LCC, or readiness driver metrics to specific
subsystems and equipment. These requirements are then used to develop the specific
system-level support strategies and maintenance plans along with their design-to
requirements for both the system and its logistics support system. Modeling at this level
of detail provides more credibility, especially relative to the following efforts important
in this phase:
5.4.3.1. Overview
5.4.3.2. Activities/Processes
5.4.3.4.3.1. Maintenance
5.4.3.4.3.2. Supply
5.4.3.4.3.3. Transportation
5.4.3.1. Overview
The purpose of this phase is to develop a detailed integrated design and ensure
producibility and operational supportability. The focus is on producing detailed
manufacturing designs, not solving a myriad of technical issues. Prototyping and
analysis should have been applied prior to this phase to discover and resolve issues to
ensure the design is based on a mature technology and is achievable within cost,
schedule and sustainment constraints. From a sustainment perspective this means
paying particular attention to reducing the logistics footprint; implementing human
systems integration; designing for supportability; and ensuring affordability, integration
with the supply chain, interoperability, and safety. All of these factors are used to refine
the performance-based support concept and strategy, with the associated requirements,
and to identify potential support providers.
5.4.3.2. Activities/Processes
During this phase, the focus is on developing the requirements for the long-term
performance-based support concept and the initial product support package. In
accomplishing this, life-cycle management documents and analyses are refined as a
result of the detailed design process, iterative systems engineering analyses and
developmental test results. During this phase, the critical sustainment metrics are also
refined and incentives developed for eventual performance-based support contracts
and/or performance-based agreements. Stakeholders (including potential support
providers) are identified and included in Integrated Product/Process Team (IPT)
processes to build an early understanding of and buy-in for sustainment requirements
and objectives. Also during this phase, the support concept is refined and potential
support providers are identified. Incentives to design for support and to design a cost-
effective support concept can, and should, be linked to the support strategy.
Identification and involvement of the potential support providers and integrator early
during these efforts is essential for program success.
performance/cost/schedule/sustainment trade-offs; and to create the data required to
support and justify the support strategy. During this phase, data will be compiled,
refined, and analyzed consistent with acquisition policy and Defense Acquisition Board
(DAB) requirements to develop and document a best value long term support strategy.
The assessment process determines the right mix between organic and commercial
performance-based support and should consider all requirements (including statutory)
when determining the best value long term sustainment approach to optimize readiness
while minimizing cost. The programs should use accepted decision making tools and
processes, such as Business Case Analysis, Economic Analysis, DSOR Analysis,
Decision Tree Analysis, and/or other appropriate best value assessments. At this point,
no firm source of support decisions should be made until sufficient data is collected and
the risks are determined. (Determination of core capability workload requirements
should be made after the system passes Critical Design Review (CDR).) As a result, the
analysis results should be used and expanded throughout the program life cycle to:
Almost all of the values used during this phase should be based on engineering
estimates and actuals or test results. The level of detail performed by the government
team will vary by the extent to which industry is used to achieve the materiel availability
in a performance-based logistics contract product support package. (The government's
detailed supportability analysis requirements decrease in depth as a direct function of
the level the performance standards are set in the contract requirements, the portion of
the system covered, and the sustainment functions for which the contractor is
responsible.)
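The best-value assessment tools named above (Business Case Analysis, Decision Tree Analysis) reduce, at their simplest, to comparing alternatives by probability-weighted expected cost. The alternatives, probabilities, and costs below are hypothetical placeholders, not real program figures:

```python
# Hypothetical decision-tree sketch comparing support alternatives by
# expected life-cycle cost. Probabilities and costs ($M) are illustrative
# placeholders for a Business Case / Decision Tree Analysis, not real data.

alternatives = {
    # alternative: [(probability of outcome, 20-year support cost in $M)]
    "Organic support":        [(0.7, 820.0), (0.3, 950.0)],
    "Contractor PBL":         [(0.6, 780.0), (0.4, 1010.0)],
    "Mixed (public-private)": [(0.8, 800.0), (0.2, 940.0)],
}

def expected_cost(outcomes):
    return sum(p * cost for p, cost in outcomes)

best = min(alternatives, key=lambda alt: expected_cost(alternatives[alt]))
for alt, outcomes in alternatives.items():
    print(f"{alt:24s} expected LCC = ${expected_cost(outcomes):,.1f}M")
print("Best value (expected cost):", best)
```

A real assessment would layer readiness impacts, statutory constraints, and risk tolerance onto this cost comparison rather than selecting on expected cost alone.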
mitigating failure modes throughout the system design and development process, since
relying solely on testing as a means of improving reliability has been shown to be risky
and costly. Consideration should be given to using such practices as physics of failure
reviews, environmental stress screening, and highly accelerated life testing. A test
analysis and fix program should be implemented to increase reliability and it should be
expanded as more of the hardware (including prototypes) is tested and operated by the
users. Using this process, failure modes found through analysis and testing are
then eliminated or mitigated by design or process changes as appropriate.
Shortchanging this effort early in development, particularly at the subsystem and
component level, is a frequent cause of later program delays and cost increases as the
flaws inevitably show up in system level performance.
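Progress of a test-analyze-and-fix program is commonly tracked with a reliability growth model such as Crow-AMSAA, in which cumulative failures follow N(T) = lambda * T^beta (so log N is linear in log T, and beta < 1 indicates growth). The test log below is invented for illustration:

```python
# Sketch of tracking reliability growth during a test-analyze-and-fix
# program with the Crow-AMSAA (NHPP) model: cumulative failures
# N(T) = lam * T**beta, so log N is linear in log T. A growth slope
# beta < 1 indicates improving reliability. Test data are illustrative.
import math

# (cumulative test hours, cumulative failures) - hypothetical test log
data = [(100, 8), (300, 16), (700, 26), (1500, 38), (3000, 52)]

xs = [math.log(t) for t, _ in data]
ys = [math.log(n) for _, n in data]
n = len(data)
mean_x, mean_y = sum(xs) / n, sum(ys) / n

# Ordinary least-squares fit of log N = log(lam) + beta * log(T)
beta = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
       sum((x - mean_x) ** 2 for x in xs)
lam = math.exp(mean_y - beta * mean_x)

# Instantaneous MTBF at the end of test: 1 / failure intensity
T = data[-1][0]
instantaneous_mtbf = 1.0 / (lam * beta * T ** (beta - 1))
print(f"Growth parameter beta = {beta:.2f} (beta < 1 => reliability growing)")
print(f"Projected instantaneous MTBF at {T} h: {instantaneous_mtbf:.0f} h")
```

Plotting the fitted growth curve against the reliability requirement gives early warning that the subsystem- and component-level effort described above is being shortchanged.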
In this phase and in preparing for MS-C, the LCSP focuses on managing the
sustainment-related risk, with emphasis on implementing the product support
package. This includes the approach for developing and fielding the product support
package, describing who is doing what, where, and when, with the associated budgets.
In addition to refining and expanding on the sustainment management approach,
schedule, and cost aspects, specific attention should be paid to the:
5.4.3.2.2. Technical Reviews in Engineering and Manufacturing Development
Regardless of the acquisition strategy chosen relative to PDR timing and prototyping,
any remaining initial systems design activities and reviews not finished during the
Technology Development phase (i.e., System Requirements Review (SRR), System
Functional Review (SFR), or Preliminary Design Review (PDR)) are completed early in
the EMD phase. Section 5.4.2.2.3 provides a description of the sustainment aspects
that should be considered for each review and is not repeated in this section. If the PDR
was not conducted prior to Milestone B, the PM should include the results of the
sustainment assessment in the PDR report and Post-PDR Assessment.
The CDR helps to ensure the system can satisfy the CDD requirements within the
allocated budget, staffing and schedule. Details can be found in section 4.2.13 , but in
summary the CDR results in an initial product baseline for the system, hardware,
software, maintainability, supportability, and the product support elements, including
support equipment, training systems, and technical data. Subsystem detailed designs
and product support elements are evaluated during the review to determine whether
they correctly implement system requirements and if the system is mature enough to
proceed into fabrication, demonstration, and test.
         sustainment enablers or drivers requirements are complete. If the test results to
         date do not indicate the operational test success is likely or risk has increased,
         new developmental and operational testing criteria and plans should be
         considered, along with fallback plans.
    •    Program sustainment development efforts with corresponding schedules,
         including system fabrication, test, and software critical path drivers, are included
         in LCSP updates.
    •    Updated Cost Analysis Requirements Description (CARD) (or a CARD-like
         document) based on the initial product baseline, captures the key program
         sustainment cost drivers, development costs, production costs, operation and
         support costs for all aspects of sustainment and HSI.
The TRR helps to ensure the subsystem or system is ready to proceed into a formal
test. Details can be found in Chapter 9, but in summary it assesses test objectives, test
methods and procedures, test scope, and safety, confirming that test resources have been
properly identified and coordinated. Consequently, there are two primary logistics roles
in the TRR. One is to help ensure the test is properly planned and resourced (e.g.,
people, facilities, data systems, support equipment, and any other product support
elements) to achieve the test objectives. The second role is to ensure the tests will
identify and help control risk by verifying and validating key sustainment drivers are in
place to achieve the Sustainment KPP and KSAs. This can be accomplished by building
off the system performance tests as well as structuring specific tests and
demonstrations focused on sustainment drivers including maintainability. Regardless of
stage of development or the level of testing (component, subsystem, or system), the
basic tenets contained in section 4.3.3.4.3 apply. This includes, but is not limited to,
identifying the:
The SVR is a product and process assessment to ensure the system can proceed into
production within cost, staffing, schedule, and other system constraints with an
acceptable risk level. Details can be found in section 4.2.14 but in summary the SVR
assesses the system functionality, determining if it meets the functional requirements,
and verifies final product performance. The SVR is often conducted concurrently with
the Production Readiness Review ( section 4.2.15 ) and Functional Configuration Audit (
section 4.2.14 ). Product support IPT members as well as independent sustainment
subject matter experts should participate to:
    •    Address system supportability and, based on developmental testing or
         analysis, whether the sustainment features will satisfy the Capability
         Development Document/draft Capability Production Document and Sustainment
         KPP/KSAs.
    •    Verify adequate processes are in place so the sustainment performance
         metrics can be used to help the program succeed in meeting user needs.
    •    Ascertain if the system is supportable within the procurement, operations, and
         support budgets.
The PRR determines whether the design is ready for production and if the producer has
accomplished adequate production and product support planning. Details can be found
in section 4.2.15 , but in summary it determines if production or production preparations
incur unacceptable risks that might breach schedule, performance, cost, or other
established criteria thresholds. The review evaluates the full, production-configured
system to determine if it correctly implements all system requirements, including
embedded sustainment enablers. Product support IPT members, as well as
independent sustainment subject matter experts, should participate to ascertain that the
product support baseline has been established, documented and the:
5.4.3.3. Engineering & Manufacturing Development Phase Results/Exit Criteria
The focus of this phase is to ensure the system design incorporates the critical
supportability/logistics requirements, to develop the product support element
capabilities, and to demonstrate that the key support and sustainment capabilities are
mature.
Implementing the process contained in figure 5.4.3.2.F1 produces the detailed
supportability/logistics requirements and the initial designs. The conclusion of this phase
results in the contractual documents required to continue into the Production and
Deployment Phase as well as the system prototype logistics equipment and processes.
The program should be able to demonstrate acceptable performance in the
development, test & evaluation, and operational assessments, to include:
Table 5.4.3.3.T1 identifies the most critical documents that should incorporate or
address supportability/ logistics considerations. The logistics related data in program
deliverables should be updated prior to milestone decisions and to support the various
major design reviews (e.g., CDR, and FCA). The key sustainment elements required for
low rate initial production systems and initial operational test and evaluation (IOT&E)
should be addressed in the LCSP which is summarized in the Acquisition Strategy.
Materiel availability enabler driver initiatives should be included in the RFP as well as
the Source Selection Plan.
From a logistics perspective, the exit documents should focus on the results of the
maintenance planning process, the materiel availability driver initiatives, and their
associated metrics. In addition to updating the support strategy, sustainment funding
requirements, key logistics parameters, and logistics testing criteria, the annual
determination of the distribution of maintenance workloads required by statute should
be made, and an auditable depot-level maintenance core capability and workload
assessment should be completed biennially.
             Entry Documents:
             Initial Capabilities Document and Capability Development Document
             Acquisition Strategy
             Acquisition Program Baseline
             Preliminary Design Review Results
             Developmental Test and Evaluation Report
             Operational Test Plan and Test & Evaluation Master Plan (TEMP)
             Life-Cycle Sustainment Plan
             Exit Documents:
             Update documents from MS B
             Capability Production Document
             Technical Data Rights Strategy
             Approved Maintenance Plans
             Life-Cycle Sustainment Plan
During this phase, the focus should be on achieving the objective range value estimate
for each of the Sustainment KPP/KSAs, along with their supporting driver metrics, and
on further analysis (including analysis of the results of any demonstrations that have
been performed). The analysis should be performed to:
    •    Ensure the various metric performance values are consistent with each other as
         each is refined
    •    Ensure the design/production process does not degrade the system’s ability to
         meet the sustainment metrics
    •    Identify the operational impacts the sustainment metric enablers will have on
         mission success and materiel availability
The models for establishing and projecting expected values should be refined, and
the requirements for the metrics should be further allocated to the equipment level.
Key metrics data should be collected and used to validate the models, evaluate
technical performance, evaluate system maturity, and determine the logistics footprint.
The key enabling requirements to achieve the sustainment metrics should be included
in the system specification and PBAs. Detailed test criteria should be developed for
each metric (including any key dependent enabling technologies) to provide information
about risk and risk mitigation as the development and testing continue. The sustainment
test strategy/requirements should be documented in the TEMP.
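The allocation of metrics down to the equipment level mentioned above can follow several conventions. A minimal sketch of one of the simplest, equal apportionment of a system availability requirement across series elements, is below; the subsystem names and threshold are hypothetical, and weighted schemes based on complexity or duty cycle are common refinements:

```python
# Sketch: allocating a system-level availability requirement down to the
# equipment level. Equal apportionment across N series elements
# (each gets Ao_sys ** (1/N)) is one simple convention; the threshold
# and subsystem names are hypothetical.

SYSTEM_AO_REQUIREMENT = 0.95  # assumed system-level threshold

subsystems = ["propulsion", "avionics", "mission equipment", "airframe"]

def equal_apportionment(system_req, n):
    return system_req ** (1.0 / n)

allocation = equal_apportionment(SYSTEM_AO_REQUIREMENT, len(subsystems))
for name in subsystems:
    print(f"{name:18s} allocated Ao >= {allocation:.4f}")

# Sanity check: the allocations multiply back to the system requirement
product = allocation ** len(subsystems)
print(f"Composite Ao from allocations: {product:.4f}")
```

Collected field and test data are then compared against each equipment-level allocation to validate the models and flag where the system-level metric is at risk.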
The extensive life of our systems and rapid technology change has heightened the
importance of technology refreshment and obsolescence management. Consequently,
successful parts management requires addressing diminishing manufacturing sources
and material shortages in the proposal, design, and sustainment phases of a product
(to include the system and support elements). The PM should
develop a proactive approach to effectively resolve obsolescence problems before they
have an adverse impact on the LCC and system availability. The following are potential
approaches the PM should consider:
    •    Determining the appropriate set of metrics to align the various supply chain
         segments to achieve materiel availability. The specific metrics and their values
         should be determined regardless of who is executing the action to meet the user
         needs in the operational environment and be based on the system
         characteristics.
    •    Selecting the sources of support to sustain the system. Working with the
         maintenance community, the PM should use the most effective sources of
         support that optimize the balance of performance and life-cycle cost, consistent
         with statutory requirements and required military capability. The sources may be
         organic or commercial, but the focus should be on optimizing customer support
         and achieving maximum system availability at the lowest LCC. In making the
         determination, the PM shall ( DoD Instruction 5000.02, Enclosure 8, paragraph
         2.d. ) work with the manpower community to determine the most efficient and
         cost effective mix of DoD manpower and contract support.
    •    Providing the mechanisms and product support elements (including technical
         data) to implement the source of support decisions. In doing so, the strategy and
         resources required to implement the strategy should foster and ensure
         competition throughout the life of the system.
    •    Monitoring execution against the metrics to ensure the respective stakeholders
         are engaged in providing the system support to the user. Effective supply chain
         management requires data collection and data sharing within and between all
         elements of the supply chain (public and private). There should be a process to
         collect data throughout the manufacturing process and operations period so the
         data may be mined for product and process improvements using trend analysis
         to effectively communicate/collaborate with a shared understanding of the
         environment.
5.4.3.4.3.1. Maintenance
Program managers should determine the most effective levels of maintenance and
sources based on materiel availability and cost factors. In early deployments the best
value may be to use existing contractor capabilities for interim support. However, core
sustaining workload must be accomplished in Government-owned facilities with
Government-owned equipment and personnel. If it has not already been completed, the
PM should perform the analysis discussed in sections 5.2.1.2 and 5.4.2.2.2 to
determine the maintenance source that complies with statutory requirements,
operational readiness and best value for non-core workloads.
Figure 5.4.3.4.3.1.F1 depicts some of the key benefits of well-designed public private
partnerships. However, care is needed when third parties are involved. For example, for
information technology/software support, some level of organic support needs to be
resident, and none of the support can be sent to non-approved third-party countries
(e.g., India, China) without thorough analysis and State Department approval. Further
examples and discussion of public private partnerships can be found in DoDI 4151.21
and on the Acquisition Community Connection web site .
                Figure 5.4.3.4.3.1.F1. Public Private Partnership Opportunities
5.4.3.4.3.2. Supply
Competitive Process. Supply support may be included as part of the overall system
procurement or as a separate competition. The competitive selection process should
result in a contract with a commercial source and/or an agreement with an organic
source that prescribes a level of performance in terms of materiel availability and cost.
The PM may use a competitive process to select the best value supply support provider
or include supply support in an overarching performance-based logistics support
arrangement. While access to multiple sources of supply may be encouraged to reduce
the risks associated with a single source, it is imperative that a single entity be
established as a focal point of responsibility. Particular attention should be given to
prime vendor contracts for specific commodities and virtual prime vendor contracts for a
wide range of parts support for specific subsystems. Additional guidance appears in
DoD Directive 4140.1 and DoD 4140.1-R .
Organic Supply Source of Support. The PM should select organic supply sources of
support when they offer the best value. ( DoD Directive 5000.01, E1.1.17 ) When
changing the support strategy for fielded equipment from organic to contractor support
or from contractor to organic support, DoD owned inventory that is unique to that
system should be addressed in the source of support decision.
5.4.3.4.3.3. Transportation
The PM is encouraged to determine the best overall support strategy for the customer,
considering all available transportation alternatives, including those provided by
original equipment manufacturers (OEMs), third party logistics providers, or commercial
transportation providers. These alternatives may include the use of commercial
transportation services and facilities to the maximum extent practicable; the use of
organic transportation consistent with military needs; or the combination of both
commercial and organic transportation to support customer requirements. As in supply
support, the PM should strive to structure a support arrangement, such as a performance-based
logistics contract, that will consolidate the responsibility for transportation in a
single entity. Regardless of the approach taken, when making the transportation source
decision the PM needs to ensure the entire end-to-end chain is considered including the
"last mile" aspects along with any required implementing technology (e.g., IUID).
In considering transportation options, the PM should also plan for transition of the
supply and distribution chain from normal operations to expeditionary operations in
austere locations that are not served, at least initially, by commercial transportation
services and facilities. Transportation alternatives in contractual arrangements must
require the contractor to comply with established business rules, when the DoD organic
distribution system is used in lieu of or with the commercial transportation service. All
contractual arrangements requiring that deliveries be made using door-to-door
commercial transportation must include a provision that requires vendors to notify the
contracting officer or the contracting officer's designee when they are unable to use
door-to-door commercial transportation and to request alternate shipping instructions.
The contracting officer or contracting officer's designee must expeditiously provide
alternate shipping instructions and make the appropriate contract price adjustments. For
additional information, see the on-line Defense Transportation Policy Library .
Arms, Ammunition, and Explosives . PMs should refer to DoD 4500.9-R, Defense
Transportation Regulation, Part 2 , and DoD Manual 5100.76-M , Physical Security of
Sensitive Conventional Arms, Ammunition and Explosives (AA&E), for transportation
and security criteria regarding the movement of arms, ammunition, and explosives.
Contract provisions should apply to the prime contractor and all subcontractors.
5.4.3.4.4. Other Considerations
Modeling and simulation combined with supportability analysis are important best
practices to design and develop the individual product support elements required to
implement the support strategy. During this phase they are applied to lower and lower
levels of detail as the design matures. The supportability analysis should continue to be
used to determine the relative cost vs. benefits of different support strategies (including
the source of support decisions). The data should be refined and the results included in
the LCSP and used to support contract negotiations. Use of Open Systems Architecture
(OSA) practices is another effective methodology to increase affordability and
supportability. Code reuse is a force enabler that provides for more efficient software
development programs that can cut across multiple program areas.
Once product support elements are developed and prototyped, modeling and simulation
can also be used to provide confidence the sustainment metrics will mature to sufficient
levels when the system and supply chain are deployed. This is accomplished with the
use of models that take test results and predict likely capabilities. The same concepts
are applied to provide confidence levels of what the enabling technologies will be able
to achieve in the operational environment and identify any anticipated constraints. All of
these factors are then used to project the mature sustainment metric values and their
associated confidence levels for the projected Concept of Operations.
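The projection step described above is arithmetic at its core. As an illustration only (the Guidebook prescribes no particular model, and the functions, formulas, and figures below are hypothetical simplifications), a program might combine a test-derived lower bound on MTBF with expected mean downtime to project operational availability:

```python
import math

def operational_availability(mtbf_hours: float, mdt_hours: float) -> float:
    """Ao = uptime / (uptime + downtime), from mean time between
    failures (MTBF) and mean downtime (MDT)."""
    return mtbf_hours / (mtbf_hours + mdt_hours)

def mtbf_lower_bound(total_test_hours: float, failures: int, z: float = 1.645) -> float:
    """Approximate one-sided lower confidence bound on MTBF, using a
    normal approximation to the observed failure count (a hypothetical
    simplification; programs would use their own statistical models)."""
    point_estimate = total_test_hours / failures
    # the bound widens when few failures have been observed
    return point_estimate / (1 + z / math.sqrt(failures))

# Illustrative test results only -- not from the Guidebook
mtbf_lo = mtbf_lower_bound(total_test_hours=12_000, failures=25)
projected_ao = operational_availability(mtbf_lo, mdt_hours=24.0)
```

A conservative bound of this kind gives the projected metric value together with a sense of the confidence that can be placed in it before the system and supply chain are deployed.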
5.4.4.1. Overview
5.4.4.2. Activities/Processes
5.4.4.1. Overview
The logistics purpose in this phase is to achieve a materiel availability capability that
satisfies mission needs. Milestone C authorizes entry into Low Rate Initial Production, at
which time the design should be mature. The supportability design feature requirements
should have been verified and validated as operationally suitable and effective at an
affordable cost. At this point, the support requirements should be fully defined and
performance-based product support agreements and funding expectations documented
and signed. Funding should also be identified and available for testing and
implementation of the performance-based strategy. Once operational test and
evaluation has determined the effectiveness, suitability, and supportability of the
system, the full rate production and deployment decision is made.
5.4.4.2. Activities/Processes
During this phase, the emphasis is on finalizing equipment product support packages/
maintenance plans, managing and deploying the initial sustainment capabilities, and
demonstrating the product support capabilities and effectiveness. Once they have been
demonstrated, the emphasis is on fully fielding and implementing the sustainment
capabilities to provide the users the capabilities identified in their requirements
documents. Measuring the product sustainment package's effectiveness (including the
associated supply chain) is an important aspect of the management responsibilities in
this phase. Figure 5.4.4.2.F1 highlights the key phase activities.
Figure 5.4.4.2.F1. System Support Implications in the Production and Deployment
                                     Phase
    •    Ensuring actions are taken to provide the user support required to sustain the
         system within the budget provided, including highlighting to senior management
         the consequences and impacts on the Sustainment KPP/KSAs of budget
         constraints.
    •    Coordinating with the contractors, supply chain and operators to ensure each
         understands and is implementing responsibilities in accordance with the LCSP in
         an integrated fashion.
    •    Monitoring any changes to the design, operational environment and supply chain
         and adjusting the product support elements within the product support package
         accordingly.
    •    Looking for improvements to reduce the product support package cost.
During this phase, the reliability of contractor cost and performance data should be
verified by monitoring contracts. Increased use of Defense Contract Management
Agency and Defense Contract Audit Agency in overseeing contracts should be
considered.
10 USC 2464 requires the establishment of the capabilities necessary to maintain and
repair systems and other military equipment required to support military contingencies
(i.e., core capabilities) at Government-owned, Government-operated facilities not later
than four years after achieving initial operating capability. During the production and
deployment phase, it is imperative for the PMs and Program Executive Officers to
ensure the prior planning for maintenance support is executed to meet the supportability
requirements of the system and/or subsystems. If organic depot maintenance is a
portion of the selected supportability strategy, it will require the activation of the requisite
organic depot maintenance capabilities.
The LCSP should be used to help manage the program's fielding efforts. It should focus
on the product support implementation plan and schedule, with emphasis on putting into
place the continuous process improvement management structure to review processes
and remove bottlenecks or constraints encountered by the user. The following aspects
should be emphasized along with the projected sustainment metric values by fiscal year
over the FYDP:
    •    The fielding plan details including any actions to adjust the product support
         package, ensure competition, and control costs.
    •    The analytical and management processes for how the sustainment performance
         will be measured, managed, assessed, and reported, as well as how the
         sustainment requirements will be achieved and maintained.
    •    The stakeholder roles in executing the sustainment strategy describing the
         relationships and responsibilities with key players, especially relative to the
         product support arrangements.
Under the total life-cycle systems management concept, the PM is responsible for the
timely fielding of an effective product support package, measuring its effectiveness, and
taking corrective actions when shortfalls are uncovered. The most effective time to
catch problems is before the system is deployed, so including reliability, maintainability
and supportability test requirements in the TEMP should be as important as other
performance measures. Sustainment KPP/KSA driver metrics should be monitored
throughout the test and deployment process to help provide confidence the system will
achieve the sustainment objectives in an operational environment.
5.4.4.2.5. Pre-Initial Operational Capability Supportability Review
This review and its associated analysis should be performed at the DoD Component
level in conjunction with the OTRR to:
Many of the actions and subsequent results in this phase are reviewed during technical
reviews and should be accomplished even if the specific referenced reviews do not
occur. The actions and results are tied to the reviews to reflect the relative timeframe in
which they should be accomplished.
The OTRR is a product and process assessment to ensure the system can proceed into
Initial Operational Test and Evaluation with a high probability of successfully completing
operational testing. (See chapter 9 for additional information.) Many of the same actions
used to prepare for the Test Readiness Review (TRR) should be used in preparation for
this review. This test is critical because it provides the users the first real hands-on
indication as to whether the system is operationally effective and suitable.
Consequently, it is important the product support IPT members as well as independent
sustainment subject matter experts participate in the review to ensure the test:
    •    Is properly planned and resourced (e.g., people, facilities, data systems, support
         equipment, and any other product support elements) to achieve the test
         objectives. The Pre-Initial Operational Capability Supportability Review should be
         used to support this process.
    •    Will verify and validate the key sustainment drivers to achieve the Sustainment
         KPP and KSAs are included. This should include ensuring system reliability,
         maintainability, and support performance features are included and
         demonstrated.
    •    Is structured to include as much as possible of the product support package that
         will be used in the operational environment. Where this is not possible,
         prototypes should be used to gain early user feedback on the product support
         package.
The PCA examines the end-item actual configuration as defined by the Technical Data
Package and sets the final production baseline under government control. Details can
be found in section 4.2.16 , but in summary the audit verifies that design documentation
matches the item specified in the contract. In addition to the standard practice of
assuring product verification, the PCA confirms that manufacturing processes, quality
control system, measurement and test equipment, product support, and training are
adequately planned, tracked, and controlled. As such, this review should be used to
ensure the "as-produced" system is compliant with sustainment requirements and
objectives. To the extent lead times will allow, ordering the product support package
elements should be delayed until this review to ensure they are being bought for the
right configuration.
The focus of this phase is to deploy the initial sustainment capabilities and, once the
system and its product support package are demonstrated to be operationally suitable
and effective, to fully deploy the system. This should be demonstrated by:
Implementing the process depicted in figure 5.4.4.2.F1 provides the materiel required to
gain full rate production/deployment approval and produce the product support
elements to sustain the system. The conclusion of this phase results in a fully fielded
and supported system. Table 5.4.4.3.T1 identifies the most critical documents that
should address sustainment considerations. Key logistics information compiled during
this phase should be used to update the acquisition documents, along with the latest
sustainment strategy based on the actual technology development progress and/or
follow-on increments if an incremental acquisition strategy is used. Also, the
sustainment related data and performance-based requirements should continue to be
included in product and sustainment contracts and agreements to ensure the system is
effectively supported.
             Entry Documents:
             Test and Evaluation Reports
             Acquisition Program Baseline
             Operational Test Plan and Test & Evaluation Master Plan (TEMP)
             Life-Cycle Sustainment Plan
             Exit Documents:
             Update documents from MS C as appropriate
             Physical Configuration Audit Report
             Life-Cycle Sustainment Plan
             Information Supportability Certification
All the product support elements should be considered and focus should be on refining
and fielding them based on their demonstrated success and on confidence that the
requirements will be achieved.
The results and experience demonstrated in all the tests (including follow-on operational
test & evaluation (FOT&E)) and early operations should be considered in refining the
metric estimates. This, along with key supply chain performance and effectiveness
measures for similar fielded systems, should be used to increase the confidence levels
for the PM's estimates. Supply chain performance, Sustainment KPP/KSAs, and key
driver metrics should also be considered in the analysis. Special emphasis should be
placed on tracking the metrics for the drivers of key enabler technologies that have
been developed for the system or are critical for achieving the required materiel
availability. Consideration should be given to revising the product support package and
its agreements if major performance shortfalls are found.
When multiple production baselines are deployed or if the full product support package
is not deployed to support test or operations, the program manager should consider the
most effective support method. The alternatives considered can include employing
mixes of contractor and organic support over varied performance periods for each
configuration. This may result in the consideration of multiple performance agreements
and/or support strategies. In determining the best mix, the results from the Production
Readiness Review (PRR) and System Verification Review (SVR) should be considered
to ensure the product support elements are developed for all configuration / block
increments.
5.4.4.4.3. Contractor Logistics Support/Contractors on the Battlefield (CLS/COTB)
Integration, In-Theater
Contractors can provide logistics support over a wide range of options, from interim
contractor support covering the initial fielding while the product support package is
being deployed, to supporting specific limited operations, to full contractor support.
When support strategies employ contractors in a battlefield environment, PMs should, in
accordance with Joint Publication 4-0 Chapter 5 and DoD Component implementing
guidance, coordinate with affected Combatant Commanders. This coordination must be
carried out through the lead DoD Component and must ensure that functions performed
by contractors, together with functions performed by military personnel and government
civilians, are integrated into operations plans (OPLANs) and orders (OPORDs). During
this process the Combatant Commanders will:
The intent of the coordinated planning is to ensure the continuation of essential services
in the event the contractor provider is unable (or unwilling) to provide services during a
contingency operation. Contingency plans are required for those tasks that have been
identified as essential contractor services to provide reasonable assurance of
continuation during crisis conditions. PMs should also coordinate with the DoD
Component manpower authority in advance of contracting for support services to
ensure tasks and duties that are designated as inherently governmental or exempt are
not contracted.
Configuration control over the analysis and resulting data becomes important as the
design changes. The program should take steps to ensure that as the system changes,
the product support package is adjusted to take into account the various configurations
the user will encounter and that the product support elements stay in sync.
Even well into operations, programs should evaluate opportunities for transitioning, in
whole or part, to performance-based logistics contracts by examining opportunities to
leverage public-private partnerships. Experience has shown that, even with existing
capitalized infrastructure in place, legacy programs can transition to outcome based
contracts across the spectrum of subsystem or functional process support segments.
M&S continues to support the program improvement efforts by analyzing the impact of
proposed design refinement, maintenance processes, and budget alternatives on the
sustainment metrics/mission effectiveness. M&S should be used in assessing the
alternatives of both the system and its support system (especially the enabling
technologies), ensuring all critical metrics are considered in parallel and not at the
expense of others. In addition, taking early operational results and predicting likely
trends (with confidence levels) can be used to proactively anticipate problems so
corrective actions can be taken as the system is fielded to minimize adverse impacts on
the users. This also helps to provide confidence the critical sustainment metrics will
mature to sufficient levels when the system and supply chain are fully deployed and to
identify any anticipated constraints or limitations.
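As a minimal sketch of the trend-projection idea described above (the observations are hypothetical, and a real program would use its own M&S tools and attach confidence levels), a least-squares trend fitted to early operational availability data can be extrapolated forward to anticipate whether the metric will mature as expected:

```python
from statistics import mean

def linear_trend(values):
    """Least-squares slope and intercept over equally spaced periods."""
    xs = list(range(len(values)))
    xbar, ybar = mean(xs), mean(values)
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, values))
             / sum((x - xbar) ** 2 for x in xs))
    return slope, ybar - slope * xbar

# Hypothetical monthly availability observations from early fielding
observed = [0.78, 0.80, 0.79, 0.82, 0.83, 0.85]
slope, intercept = linear_trend(observed)
# Naive extrapolation to month 12 (index 11); real analyses would
# attach confidence bounds and check model fit before projecting
projected_month_12 = intercept + slope * 11
```

A positive slope with a projection short of the required value would flag the need for corrective action before full deployment.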
5.4.5.1. Overview
5.4.5.2. Activities/Processes
5.4.5.5.2. Supportability Analysis
5.4.5.1. Overview
In the total life-cycle systems management concept, providing user support and
managing the demilitarization/disposal of old systems are the PM's responsibilities.
During this phase, the PM is the system focal point to the user and should continually
assess the sustainability effectiveness of the fielded systems, adjusting the program as
required to support the user.
Users require readiness and operational effectiveness (i.e., systems accomplishing their
missions) in accordance with their design parameters in an operational environment.
Systems, regardless of the application of design for supportability, suffer varying
stresses during actual deployment and use. Consequently, the PM should apply the
systems engineering processes used in acquisition throughout the entire life cycle. The
difference is that during this phase actual use data (including user feedback, failure
reports, and discrepancy reports), rather than engineering estimates, are used. The PM should:
    •    Monitor system usage and supply chain against design baseline criteria and
         assumptions.
    •    Review and triage all use data and supplier data to determine operational
         hazards/safety risks, as well as readiness degraders.
    •    Develop alternatives to resolve critical safety and readiness degrading issues.
    •    Identify sub-optimal performers in the fielded product support system, and correct
         them through rebalanced product support elements or changes to the
         maintenance program.
    •    Enhance the performance and cost-effectiveness of the end-to-end supply chain
         to ensure materiel readiness continues to meet user needs.
    •    Identify redesign opportunities to enhance system effectiveness.
5.4.5.2. Activities/Processes
During this phase, the focus is on supporting the user by executing the sustainment
program and on making adjustments based on effectiveness and operating conditions
using systems engineering principles. However, the PM should not undertake depot
maintenance source of support decisions without consultation with accountable military
department logistics officials to ensure the DoD Component depot maintenance 50
percent limitation statutory requirement is being met. Figure 5.4.5.2.F1 highlights the
key sustainability and product support activities.
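The 50 percent limitation check itself reduces to a simple ratio at the DoD Component level: contract-performed depot maintenance funding may not exceed half of the total. A minimal sketch (the dollar figures below are hypothetical, not from the Guidebook):

```python
def contractor_share(organic_dollars: float, contract_dollars: float) -> float:
    """Fraction of depot-level maintenance funding performed by
    non-Federal (contract) sources."""
    return contract_dollars / (organic_dollars + contract_dollars)

# Hypothetical Component-level totals, in millions of dollars
share = contractor_share(organic_dollars=620.0, contract_dollars=540.0)
within_limit = share <= 0.50  # the "50/50" ceiling
```

Because the limitation applies to the Component's aggregate workload rather than to any single program, the PM's consultation with the military department logistics officials is what places a program-level decision in that aggregate context.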
Under the total life-cycle systems management concept, the program manager
continually assesses the system performance from the user's perspective. The PM
should use existing reporting systems and user feedback to evaluate the fielded system,
focusing on performance outcomes meaningful to the user. (If existing reporting
systems do not provide sufficient information, the PM should augment existing reporting
systems by collecting critical data required to assess performance and, where
necessary, work with the DoD Components to add the capabilities to the existing
reporting systems.) The data should be analyzed, comparing performance expectations
against actual performance, root causes of problems identified, and corrective actions
developed.
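A minimal sketch of that expectation-versus-actual comparison (the metric names and values below are hypothetical, not from the Guidebook) shows how shortfalls can be surfaced for root-cause analysis:

```python
# Hypothetical sustainment metrics: name -> (expected, actual)
metrics = {
    "materiel_availability": (0.85, 0.81),
    "materiel_reliability_mtbf_hrs": (400.0, 415.0),
    "mean_down_time_hrs": (24.0, 31.0),
}
# For these metrics a lower actual value is the better outcome
lower_is_better = {"mean_down_time_hrs"}

shortfalls = {}
for name, (expected, actual) in metrics.items():
    missed = (actual > expected) if name in lower_is_better else (actual < expected)
    if missed:
        shortfalls[name] = actual - expected  # signed delta for root-cause analysis
```

Each flagged delta becomes the starting point for identifying the root cause and developing a corrective action.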
The PM should conduct regularly scheduled In-Service Reviews (also known as Post
IOC Reviews) with the users, assessing the current status, operational health and
corrective actions to satisfy user operational needs based on user feedback and
performance metrics. (See section 4.2.17 for additional information.) The ISR is a multi-
disciplined product and process assessment to ensure the system is employed with
well-understood and managed risk, so timely corrective actions can be taken. Leading
into and during the reviews, engineering and sustainment stakeholders (e.g., suppliers,
representatives from primary supply chain providers, and the comptroller communities)
and product support IPT members, as well as independent sustainment subject matter
experts, should apply sound programmatic, systems engineering, and logistics
management processes to:
         resolution rate, and trends.
The reviews should be conducted at defined intervals to identify needed revisions and
corrections, and to allow for timely improvements in the strategies to meet performance
requirements for materiel readiness. At least initially, the In-Service Reviews will focus
on the product support package fielding, including the product support providers'
performance against the PBAs and other requirements. Consequently, the reviews with
the users and product support service providers should be on a semi-annual basis as
the support plans are executed (including transition from organic to contract support and
vice versa, if applicable). After the system has been fully deployed, the frequency of
these reviews should then be based on system performance (including trends), the
pace of technology, obsolescence issues, and safety. The program's In-Service
Reviews should be used to prepare for the DoD Component level assessments or
reviews.
Following the Full Rate Production Decision, the LCSP is the principal program
document governing the system's management and execution. It describes the actions
to be taken to meet the total system availability requirements based on measured
performance in the operational environment. The plan documents the results of the
stakeholder actions and projects the outcomes expected based on the budget and real-world
conditions, emphasizing the:
The conclusion of this phase results in the disposal of the system following statutory
regulations and policy. The PM should coordinate with DoD Component logistics
activities and DLA, as appropriate, to identify and apply applicable demilitarization
requirements necessary to eliminate the functional or military capabilities of assets (
DoD 4140.1-R and DoD 4160.21-M-1 ). The PM should coordinate with DLA to
determine property disposal requirements for system equipment, support assets, and
by-products (DoD 4160.21-M).
system design changes.
During this phase, the PM should measure, track and report the supply chain
performance and its effectiveness, along with the sustainment metric drivers and the
root cause of any performance shortfalls. Special emphasis should be placed on
tracking the metrics for the drivers of the key enabler technologies that were developed
for the system or are critical for achieving the required materiel availability.
The following are important, but not the only, best practices for this phase, since the
concepts previously described still apply. In each case, the best practices involve
Sustaining Engineering, in which the PM continually compares performance against
expectations using actual equipment and support performance data to revise, correct,
and improve product support strategies to meet the users' requirements. Adjusting the
maintenance requirements using RCM and CBM+ principles can be very effective in
optimizing the sustainment KPP and KSAs during the Operating and
Support Phase. Additional approaches useful to the PM in balancing logistics resources,
decreasing repair cycle times, and/or improving readiness/availability include:
During this phase, the supportability analysis continues to focus on design changes
regardless of the need for the change (e.g., reliability shortfall, obsolescence issue,
safety concern) and adjusting the support package to accommodate the changes. In
this process, care should be given to ensure the analysis encompasses all previous
configuration/block increments across the entire platform and range of deployed
configurations. In doing this, the entire support strategy should be addressed to look for
opportunities to reduce the costs and logistics footprint.
Supportability analysis should also be used to adjust the support package based on
how it is performing. A wide range of changes (including moving between overhaul and
repair, improving off equipment diagnostic capabilities, transitioning to a commercial
supply chain management system, etc.) should be considered in determining the best
solution. The ability to continually compare performance against expectations, using
actual equipment and support performance data to drive data analyses and an RCM
decision analysis, is more efficient and reduces risk.
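As a minimal illustration of this kind of comparison, the sketch below contrasts two support alternatives by their effect on operational availability, using the common approximation Ao = MTBM / (MTBM + MDT). The reliability and downtime figures are hypothetical, not drawn from any program.

```python
# A minimal sketch of comparing support alternatives by operational
# availability (Ao). All numbers below are hypothetical.
def operational_availability(mtbm_hours: float, mdt_hours: float) -> float:
    """Ao = uptime / (uptime + downtime), per maintenance cycle.

    mtbm_hours: mean time between maintenance
    mdt_hours:  mean downtime per maintenance event
    """
    return mtbm_hours / (mtbm_hours + mdt_hours)

baseline = operational_availability(mtbm_hours=400.0, mdt_hours=50.0)
# Proposed change: improved off-equipment diagnostics cut mean downtime.
with_change = operational_availability(mtbm_hours=400.0, mdt_hours=30.0)

print(f"baseline Ao    = {baseline:.3f}")     # 0.889
print(f"with change Ao = {with_change:.3f}")  # 0.930
```

The same function can be rerun against measured field data as it accumulates, which is the "compare performance against expectations" loop the text describes.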
In both cases, usage data are monitored, collected, and analyzed using FMECA. Any
failure, maintenance, or operational issues are verified, and root causes, risk, and
severity are determined. An analysis should then be performed to determine if the most
cost-effective solution is a:
In any proposed solution, the PM should work with the users to determine if the change
and the timeframe are acceptable. Once the agreements have been reached,
supportability analysis is used to adjust the appropriate product support package
elements.
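One common way to rank the failure modes that such a FMECA surfaces is a Risk Priority Number (RPN), the product of severity, occurrence, and detection ratings. The sketch below is illustrative only; the 1-10 scales, field names, and failure modes are assumptions, not DAG prescriptions.

```python
# Illustrative FMECA-style ranking of fielded failure modes.
# The 1-10 scales and the example failure modes are assumptions.
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (rare) .. 10 (frequent), from collected use data
    detection: int   # 1 (certain detection) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        # Risk Priority Number: higher values get corrective action first.
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("hydraulic seal leak", severity=6, occurrence=7, detection=3),
    FailureMode("connector corrosion", severity=4, occurrence=5, detection=6),
    FailureMode("software lockup", severity=8, occurrence=2, detection=2),
]

# Rank failure modes so root-cause analysis targets the worst risks first.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.name}: RPN={m.rpn}")
```

The ranking feeds the trade discussion with the users described above: high-RPN modes are candidates for design change, while lower-RPN modes may only warrant a support-package adjustment.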
During this phase, M&S supports the program improvement efforts by analyzing the
impact of proposed continuous process improvements, ECPs, and budget alternatives
on the sustainment metrics as well as mission effectiveness. M&S can be used in
assessing the alternatives affecting the design and deployment of both the end item and
its support system. In addition, it can be used in a proactive mode to anticipate
problems by using usage data and user feedback to:
5.5. References
Product Support Manager Guidebook. This guide provides the PSM an easy reference
for addressing key requirements for managing product support across the entire life
cycle of the weapon system. It serves as an operating guide to assist the PSM and the
Acquisition Community with the implementation of product support strategies in aligning
the acquisition and life cycle product support processes.
DoD Product Support Business Case Analysis (BCA) Guidebook. This guide
provides overall guidance for conducting a Product Support BCA. It provides a
standardized process and methodology for writing a Product Support BCA, aiding
decision making, and providing analytical decision support. The guide should be
used in conjunction with other analytical tools and guidance in making product support
decisions across the life-cycle.
Commercial and Organic PBAs. It includes sample Performance-Based Agreements,
templates, contractual incentives, a Performance-Based Agreement Toolkit and other
resources. It also includes an End-to-End Customer Support PBA template that
provides a common framework and a checklist to consider when undertaking a
performance-based agreement that may involve one or more supply chain support
services, as well as PBA terms and definitions. (Note: This guide is in the update
process.)
Operating and Support Cost-Estimating Guide. This guide and DoD Manual 5000.4,
DoD Cost Analysis Guidance and Procedures, provide procedures for life-cycle cost
estimates. They explain the policies and procedures, focusing on the preparation,
documentation, and presentation of cost estimates, and include an Operating and
Support Cost element structure.
residing within the ACC framework. LOG CoP provides a number of resources for
implementing life-cycle logistics. The community space also allows members to share
(post to the website) their knowledge, lessons learned, and business case related
material, so that the entire logistics community has access and can benefit.
Environment, Safety, and Occupational Health (ESOH). DoD ESOH guidance for
systems acquisition programs can be found in Chapter 4, Systems Engineering, and in
the ESOH Special Interest Area on the Acquisition Community Connection (ACC).
The DoD Guide for Achieving Reliability, Availability, and Maintainability (RAM). This
document helps project managers and engineers to plan for and design RAM into
systems. The guide focuses on what can be done in the systems engineering process
to achieve effective levels of RAM, successfully demonstrate them during operational
test and evaluation, and sustain them through the system's life cycle. It can be used to
help capability document requirements writers and engineering organizations think
through the top-level sustainment requirements for RAM early in the life cycle to ensure
the system is sustainable and affordable throughout its life cycle.
The DoD Reliability, Availability, Maintainability & Cost (RAM-C) Rationale Report
Manual. This manual describes the development of the RAM-C Rationale Report. It
provides guidance in how to develop and document realistic sustainment Key
Performance Parameter (KPP)/Key System Attribute (KSA) requirements and related
supporting rationale. It addresses how the requirements must be measured and tested
throughout the system life cycle as well as the processes that should be followed when
developing the sustainment requirements.
DEFENSE ACQUISITION GUIDEBOOK
Chapter 6 - Human Systems Integration (HSI)
6.0. Overview
6.4. Human Systems Integration (HSI) throughout the System Life Cycle
6.0. Overview
6.0.1. Purpose
6.0.2. Contents
6.0. Overview
DoD acquisition policy requires optimizing total system performance and minimizing the
cost of ownership through a "total system approach" to acquisition management by
applying Human Systems Integration elements to acquisition systems (DoD Directive
5000.01).
6.0.1. Purpose
6.0.2. Contents
• Section 6.1 briefly reviews the total systems approach directed by DoD Directive
         5000.01.
    •    Section 6.2 describes the importance of integration with respect to Human
         Systems Integration (HSI) implementation and its value in systems integration
         and risk management.
    •    Section 6.3 describes each of the domains of HSI: Manpower, Personnel,
         Training, Human Factors Engineering, Safety and Occupational Health,
         Survivability (Personnel), and Habitability. Each of these sub-sections contains
         an overview of the domain, addresses domain requirements, and discusses
         planning considerations.
    •    Section 6.4 follows with the implementation of HSI, to include formulation of the
         HSI strategy and the sequencing of expected HSI activities along the timeline of
         the Defense Acquisition Framework.
    •    Section 6.5 describes the human considerations associated with resource
         estimating and planning; it is the HSI complement to Chapter 3.
    •    Section 6.6 provides two reference lists for additional information.
6.2.1. Integrated Product and Process Development (IPPD) and Integrated Product
Teams (IPTs)
The total system includes not only the prime mission equipment and software, but also
the people who operate, maintain, and support the system; the training and training
devices; and the operational and support infrastructure. Human Systems Integration
(HSI) practitioners assist program managers by focusing attention on the human part of
the system and by integrating and inserting manpower, personnel, training, human
factors engineering, environment, safety, occupational health hazards, and personnel
survivability considerations into the Defense acquisition process. Consistent with DoD
Instruction 5000.02, Enclosure 8, when addressing HSI, the program manager must
focus on each of the "domains" of HSI. These domains are outlined and explained
beginning in Section 6.3. The focus on the domains, however, should also include
comprehensive integration within and across these domains, as outlined in Section 6.2.
The key to a successful HSI strategy is comprehensive integration across the HSI
domains and also across other core acquisition and engineering processes. This
integration is dependent on an accurate HSI plan and includes the comprehensive
integration of requirements. The optimization of total system performance and
determination of the most effective, efficient, and affordable design requires upfront
requirements analyses. The HSI domains (manpower, personnel, training, environment,
safety and occupational health, human factors engineering, survivability, and
habitability) can and should be used to help determine and address the science and
technology gaps in all aspects of the system (hardware, software, and human).
The program manager should integrate system requirements for the HSI domains with
each other, and also with the total system. As work is done to satisfy these
requirements, it is vital that each HSI domain anticipate and respond to changes made
by other domains or which may be made within other processes or imposed by other
program constraints. These integration efforts should be reflected in updates to the
requirements, objectives, and thresholds in the Capability Development Documents.
Values for objectives and thresholds, and definitions for parameters contained in the
Capabilities Documents, Manpower Estimate, Test and Evaluation Master Plan,
Acquisition Plan, and Acquisition Program Baseline, should be consistent. This ensures
consistency and thorough integration of program interests throughout the acquisition
process.
6.2.1. Integrated Product and Process Development (IPPD) and Integrated Product
Teams (IPTs)
by ensuring the following:
The development of an HSI strategy should be initiated early in the acquisition process,
when the need for a new capability or improvements to an existing capability is first
established. To satisfy DoD Instruction 5000.02, the program manager should have a
plan for HSI in place prior to entering Engineering and Manufacturing Development. The
program manager should describe the technical and management approach for meeting
HSI parameters in the capabilities documents, and identify and provide ways to manage
any HSI-related cost, schedule, or performance issues that could adversely affect
program execution.
The HSI plan should address potential readiness or performance risks and how these
risks should be identified and mitigated. For example, skill degradation can impact
combat capability and readiness. The HSI plan should call for studies to identify
operations that pose the highest risk of skill decay. When analysis indicates that the
combat capability of the system is tied to the operator's ability to perform discrete tasks
that are easily degraded (such as those contained in a set of procedures), solutions
such as system design, procedural changes or embedded training should be considered
to address the problem. Information overload and requirements for the warfighter to
dynamically integrate data from multiple sources can result in degradation of situational
awareness and overall readiness. Careful consideration of common user interfaces,
composable information sources, and system workload management will mitigate this
risk. An on-board "performance measurements capability" can also be developed to
support immediate feedback to the operators/maintainers and possibly serve as a
readiness measure to the unit commander. The lack of available ranges and other
training facilities when deployed is an issue that should be addressed. The increased
use of mission rehearsal as part of mission planning and preparation, and the
alternatives supporting mission rehearsal, should be addressed in the HSI plan. Team
skills training and joint battlespace integration training should also be considered in the
HSI plan and tied to readiness. Additionally, HSI issues should be addressed at system
technical reviews and milestone decision reviews.
program manager should be prepared with design alternatives to mitigate the impact of
technology or software that is not available when expected.
6.3.1. Manpower
6.3.2. Personnel
6.3.3. Training
6.3.1. Manpower
For example, material-handling equipment can be used to reduce labor-intensive
material-handling operations, and embedded training can be used to reduce the number
of instructors.
DoD Directive 5000.01 directs the DoD Components to plan programs based on
realistic projections of the dollars and manpower likely to be available in future years.
Manpower goals and parameters should be based on manpower studies and analysis.
These studies and analyses should ensure that design options that reduce workload
and ensure program affordability are pursued, and that lower-priority design features do
not take precedence. Throughout the system life cycle, program managers should strive
to keep manpower and the associated ownership costs at desired/targeted levels.
Program managers should also preserve future-year resources rather than attempting
to compete for additional funding later to address Manpower, Personnel or associated
Training issues.
When there are Congressional or Administrative caps placed on military end strengths,
the introduction of a new system or capability will require compensating reductions
(trade-offs) elsewhere in the force structure or in the Individuals Account. Manpower
officials should identify areas for offsets, or "bill-payers," for the new system and
establish constraints based on available resources. If the new system replaces a
system in the inventory, manpower officials should determine whether the constraints
placed on the predecessor system also apply to the new system. They should consider
the priority of the new system and determine if either additional resources will be
provided, or if more stringent constraints will apply. Manpower authorities should
consider the availability of resources over the life of the program and weigh competing
priorities when establishing manpower constraints for acquisition programs. Reviews
should account for all military and civilian manpower and contract support needed to
operate, maintain, support, and provide training for the system over the entire life of the
program.
manpower intensive, it may be prudent to establish a manpower Key Performance
Parameter (KPP) early in the acquisition process. Setting a KPP will ensure the system
fits within manpower parameters established by the Department, that agreed-upon
resource thresholds are not exceeded, and that the system will not require additional
resources from higher priority programs later in the acquisition process. A KPP should
only be established if the adverse manpower effect of exceeding the KPP outweighs the
overall benefits of the new capability. In all cases, manpower constraints and KPPs
must be defendable and commensurate with the priority and utility of the new capability.
Program Managers and HSI practitioners should work closely with the users and the
sponsoring organization to ensure agreement on the appropriate parameters.
The capability documents should also address specific, scenario-based, factors that
affect manpower, such as surge requirements, environmental conditions (e.g., arctic or
desert conditions), and expected duration of the conflict. These factors are capability-
related and directly affect the ability of the commander to sustain operations in a
protracted conflict.
Manpower analysts determine the number of people required, authorized, and available
to operate, maintain, support, and provide training for the system. Manpower
requirements are based on the range of operations during peacetime, low intensity
conflict, and wartime. They should consider continuous, sustained operations and
required surge capability. The resulting Manpower Estimate accounts for all military
(Active, Reserve, and Guard), DoD civilian (U.S. and foreign national), and contract
support manpower.
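A Manpower Estimate of this kind can be sketched as a simple roll-up across the categories the text lists. The categories below follow the paragraph above, while the headcounts and the end-strength cap are hypothetical illustrations only.

```python
# Hypothetical roll-up of a Manpower Estimate across the categories the
# DAG lists: military (Active, Reserve, Guard), DoD civilian, and
# contract support. All headcounts are invented for illustration.
manpower_estimate = {
    "military_active": 420,
    "military_reserve": 80,
    "military_guard": 60,
    "dod_civilian": 150,
    "contract_support": 90,
}

total = sum(manpower_estimate.values())
military_total = sum(v for k, v in manpower_estimate.items()
                     if k.startswith("military_"))

# Hypothetical end-strength constraint on the military share; if exceeded,
# a request for additional authorizations would be needed (see below).
END_STRENGTH_CAP = 600
assert military_total <= END_STRENGTH_CAP, "request additional authorizations"

print(total)           # 800
print(military_total)  # 560
```

Keeping the military share explicitly separated mirrors the end-strength check the next paragraphs describe: the civilian and contract categories do not count against military end strength.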
DoD Instruction 5000.02 requires the program manager to work with the manpower
community to determine the most efficient and cost-effective mix of DoD manpower and
contract support, and identify any issues (e.g., resource shortfalls) that could impact the
program manager's ability to execute the program. This collaboration must be
conducted within the Human Systems Integration (HSI) framework to ensure integration
with the other HSI domains. The HSI lead for a program / project should be able to draw
expertise from the manpower community to provide program assistance. Generally, the
decision to use DoD civilians and contract labor in theater during a conflict where there
is a high likelihood of hostile fire or collateral damage is made on an exception basis. In
all cases, risk reduction should take precedence over cost savings. Additionally, the
program manager should consult with the manpower community in advance of
contracting for operational support services to ensure that sufficient workload is retained
in-house to adequately provide for career progression, sea-to-shore and overseas
rotation, and combat augmentation. The program manager should also ensure that
inherently governmental and exempted commercial functions are not contracted. These
determinations should be based on current Workforce Mix Guidance ( DoD Instruction
1100.22 ).
Consistent with sections E1.1.4 and E1.1.29 of DoD Directive 5000.01, the program
manager must evaluate the manpower required and/or available to support a new
system and consider manpower constraints when establishing contract specifications to
ensure that the human resource demands of the system do not exceed the projected
supply. The assessment must determine whether the new system will require a higher,
lower, or equal number of personnel than the predecessor system, and whether the
distribution of ranks/grade will change. Critical manpower constraints must be identified
in the capability documents to ensure that manpower requirements remain within DoD
Component end-strength constraints. If sufficient end-strength is not available, a
request for an increase in authorizations should be submitted and approved as part of
the trade-off process.
When assessing manpower, the system designers should look at labor-intensive (high-
driver) tasks. These tasks might result from accessibility or hardware/ software interface
design problems. These high-driver tasks can sometimes be eliminated during
engineering design by increasing equipment or software performance. Based on a top-
down functional analysis, an assessment should be conducted to determine which
functions should be automated, eliminated, consolidated, or simplified to keep the
manpower numbers within constraints.
Manpower requirements should be based on task analyses that are conducted during
the functional allocation process and consider all factors, including fatigue; cognitive,
physical, and sensory overload; environmental conditions (e.g., heat/cold); and reduced
visibility. Additionally, manpower must be considered in conjunction with personnel
capabilities, training, and human factors engineering trade-offs.
When reviewing support activities, the program manager should work with manpower
and functional representatives to identify process improvements, design options, or
other initiatives to reduce manpower requirements, improve the efficiency or
effectiveness of support services, or enhance the cross-functional integration of support
activities.
The support strategy should document the approach used to provide for the most
efficient and cost-effective mix of manpower and contract support and identify any cost,
schedule, or performance issues, or uncompleted studies that could impact the program
manager's ability to execute the program.
6.3.2. Personnel
Personnel factors are those human aptitudes (i.e., cognitive, physical, and sensory
capabilities), knowledge, skills, abilities, and experience levels that are needed to
properly perform job tasks. Personnel factors are used to develop the military
occupational specialties (or equivalent DoD Component personnel system
classifications) and civilian job series of system operators, maintainers, trainers, and
support personnel. Personnel officials contribute to the Defense acquisition process by
ensuring that the program manager pursues engineering designs that minimize
personnel requirements, and keep the human aptitudes necessary for operation and
maintenance of the equipment at levels consistent with what will be available in the user
population at the time the system is fielded.
DoD Instruction 5000.02 requires the program manager to work with the personnel
community to define the performance characteristics of the user population, or "target
audience," early in the acquisition process. The program manager should work with the
personnel community to establish a Target Audience Description (TAD) that identifies
the cognitive, physical, and sensory abilities (i.e., capabilities and limitations) of the
operators, maintainers, and support personnel expected to be in place at the time the
system is fielded. When establishing the TAD, Human Systems Integration (HSI)
practitioners should verify whether there are any recruitment or retention trends that
could significantly alter the characteristics of the user population over the life of the
system. Additionally, HSI analysts should consult with the personnel community and
verify whether there are new personnel policies that could significantly alter the scope of
the user population (e.g., policy changes governing women in combat significantly
changed the anthropometric requirements for occupational specialties).
Per DoD Instruction 5000.02, to the extent possible, systems should not be designed to
require cognitive, physical, or sensory skills beyond those found in the specified user
population. During functional analysis and allocation, tasks should be allocated to the
human component consistent with the human attributes (i.e., capabilities and limitations)
of the user population to ensure compatibility, interoperability, and integration of all
functional and physical interfaces. Personnel requirements should be established
consistent with the knowledge, skills, and abilities (KSAs) of the user population
expected to be in place at the time the system is fielded and over the life of the
program. Personnel requirements are usually stated as a percentage of the population.
For example, capability documents might require "physically accommodating the central
90% of the target audience." Setting specific, quantifiable personnel requirements in
the Capability Documents assists the establishment of test criteria in the Test and
Evaluation Master Plan.
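A requirement such as "physically accommodating the central 90% of the target audience" translates into a percentile check against target-audience data (the 5th through 95th percentile band). The sketch below is illustrative; the stature samples and design limits are invented, not drawn from any target audience description.

```python
# Hypothetical check that a design range accommodates the "central 90%"
# (5th-95th percentile) of a target-audience measurement.
import statistics

# Invented stature samples (cm) standing in for TAD anthropometric data.
stature_cm = [152, 155, 158, 160, 162, 165, 167, 170, 172, 175,
              177, 180, 182, 185, 188, 190, 163, 168, 173, 178]

# statistics.quantiles with n=20 yields the 5th, 10th, ..., 95th percentiles.
q = statistics.quantiles(stature_cm, n=20)
p5, p95 = q[0], q[-1]

def accommodates_central_90(design_min: float, design_max: float) -> bool:
    # The design passes if its range covers the central 90% band.
    return design_min <= p5 and design_max >= p95

print(accommodates_central_90(150.0, 195.0))  # True for this sample
```

The same check, run against the real TAD data, gives the quantifiable pass/fail criterion that the Test and Evaluation Master Plan needs.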
Personnel capabilities are normally reflected as knowledge, skills, abilities (KSAs), and
other characteristics. The availability of personnel and their KSAs should be identified
early in the acquisition process. The DoD Components have a limited inventory of
personnel available, each with a finite set of cognitive, physical and psychomotor
abilities. This could affect specific system thresholds.
The program manager should use the target audience description (TAD) as a baseline
for personnel requirements assessment. The TAD should include information such as
inventory; force structure; standards of grade authorizations; personnel classification
(e.g., Military Occupational Code / Navy Enlisted Classification) description;
biographical information; anthropometric data; physical qualifications; aptitude
descriptions as measured by the Armed Services Vocational Aptitude Battery (ASVAB);
task performance information; skill grade authorization; Military Physical Profile Serial
System (PULHES); security clearance; and reading grade level.
The program manager should assess and compare the cognitive and physical demands
of the projected system against the projected personnel supply. The program manager
should also determine the physical limitations of the target audience (e.g., color vision,
acuity, and hearing). The program manager should identify any shortfalls highlighted by
these studies.
The program manager should determine if the new system contains any aptitude-
sensitive critical tasks. If so, the program manager should determine if it is likely that
personnel in the target audience can perform the critical tasks of the job.
The program manager should use a truly representative sample of the target population
during Test and Evaluation (T&E) to get an accurate measure of system performance. A
representative sample during T&E will help identify aptitude constraints that affect
system use.
Consistent with DoD Instruction 5000.02, Enclosure 8, the program manager should
summarize major personnel initiatives that are necessary to achieve readiness or
rotation objectives or to reduce manpower or training costs, when developing the
acquisition strategy. The Life-Cycle Sustainment Plan should address modifications to
the knowledge, skills, and abilities of military occupational specialties for system
operators, maintainers, or support personnel if the modifications have cost or schedule
issues that could adversely impact program execution. The program manager should
also address actions to combine, modify, or establish new military occupational
specialties or additional skill indicators, or issues relating to hard-to-fill occupations if
they impact the program manager's ability to execute the program.
6.3.3. Training
Training is any activity that enables users, operators, maintainers, leaders, and
support personnel to acquire, gain, or enhance knowledge and skills, and that
concurrently develops their cognitive, physical, sensory, team-dynamic, and adaptive
abilities to conduct joint operations and to achieve effective, fiscally sustainable
system life cycles. The training of people, as a component of materiel solutions,
delivers the intended capability to improve or fill capability gaps.
Cost- and mission-effective training supports DoD acquisition policy, which requires
optimizing total system performance and minimizing the cost of ownership through a
"total system approach" to acquisition management ( DoD Directive 5000.01 ).
The systems engineering concept of a purposely designed total system includes not
only the mission equipment but, more critically, the people who operate, maintain,
lead, and support the acquired system, as well as the training, the training systems,
and the operational and support infrastructure.
The Human Systems Integration (HSI) Training Domain assists program managers
throughout the acquired system's life cycle by focusing attention on the human interface
with the acquired system, and by integrating and inserting manpower, personnel,
training, human factors engineering, environment, safety, occupational health,
habitability, and survivability as systems-engineered elements into the Defense
acquisition process, consistent with DoD Instruction 5000.02, Enclosure 8.
In all cases, the paramount goal of training for new systems is to develop and sustain a
ready, well-trained individual and unit, while giving strong consideration to options that
can reduce life-cycle costs, contribute positively to the joint context of a system, and
produce a positive readiness outcome.
For weapon systems, training of user, operator, maintainer, and leader personnel will
be performed. (ref: DoDI 5000.02, Enclosures 2 and 6; Title 10 USC Sections 2433 & 2535)
To facilitate timely, cost-effective, and appropriate training content, the development
and planning of training should be performed during the earliest phases of the
acquisition process (e.g., the Materiel Solution Analysis and Technology Development
phases), as outlined within the AoA, System Training Plans (e.g., STRAPs, NTSPs, or
STPs), Acquisition Strategies (AS), and Acquisition Program Baselines (APB). (ref:
DoDI 5000.02, Enclosures 4, 7, and 8; Title 10 Sections 2433 & 2435)
To ensure appropriate training for new system acquisitions and traceability to life-cycle
sustainment cost estimates, systems engineering processes should assess the training
impacts of materiel trade decisions and document them appropriately. New Equipment
Training (NET) plans (e.g., STRAPs, NTSPs, and STPs) should identify Service and joint
warfighting training requirements. Training planning and training cost estimates should
be incorporated within the Cost Analysis Requirements Description (CARD) and Life-
Cycle Sustainment Plans (LCSPs). (ref: DoDI 5000.02, Enclosure 7; DoDD 5000.04-M
and 5141.01; Title 10 Sections 2433 & 2435)
Initially, the JCIDS process should address joint training requirements for military
(Active, Reserve, and Guard) and civilian support personnel who will operate, maintain,
lead and support the acquired system.
Training programs should employ integrated, cost-effective solutions and may consist of
a blend of capabilities that use insights from existing training programs and introduce
new performance-based training innovations. This may include requirements for school
and unit training, as well as new equipment training or sustainment training. It also may
include requirements for instructor and key personnel training and for new equipment
training teams.
Training planning should be initiated early by the PM, in coordination with the training
community, within the capabilities development process. It begins with the Capabilities-
Based Assessment and the Analysis of Alternatives that support development of the
Initial Capabilities Document, informs the Materiel Development Decision to support
the Materiel Solution Analysis phase, and continues with development of the Capability
Development Document.
Training should also be considered in collaboration with the other Human Systems
Integration (HSI) domains in order to capture the full range of human integration issues
to be considered within the Systems Engineering process.
Early training planning will inform the Capability Development Document and should
characterize the specific system training requirements and identify the training Key
Performance Parameter:
    •    The training community should consider whether the system should be designed
         with a mode of operation that allows operators to train interactively on a
         continuous basis, even when deployed in remote or austere locations.
    •    The training community should consider whether the system should be capable of
         exhibiting fault conditions for a specified set of failures, to allow rehearsal of
         repair procedures for isolating faults, or should be required to interconnect with
         other (specific) embedded trainers in both static and employed conditions.
    •    The training community should consider whether embedded training capabilities
         should allow enhancements to live maneuvers such that a realistic spectrum of
         threats is encountered (e.g., synthetic radar warnings generated during flight).
    •    The training community should consider whether the integrated training system
         should be fully tested, validated, verified, and ready for training at the training
         base as criteria for declaring Initial Operational Capability.
From the earliest stages of development and as the system matures, the program
manager should emphasize training requirements that enhance the users' capabilities
and interoperability, improve readiness, and reduce individual and collective training
costs over the life of the system. This may include requirements for expert systems,
intelligent tutors, embedded diagnostics, virtual environments, and embedded training
capabilities.
Training devices and simulators are systems that, in some cases, may warrant their
own set of HSI requirements; for instance, the training community may specify required
attributes for a training simulator.
The acquisition program will be specific in translating new system capabilities into the
system's inherent training requirements.
6.3.3.4. Development of Training Requirements
When developing the training system, the program manager shall employ
transformational training concepts, strategies, and tools such as computer-based and
interactive courseware, simulators, and embedded training, consistent with the
program's acquisition strategy, goals, and objectives, and shall reflect the tenets
outlined in the next-generation training strategy.
In addition, the program should address the requirement for a system training Key
Performance Parameter, as described in the JCIDS Manual.
The USD(P&R), as a member of the Defense Acquisition Board (DAB), assesses the
ability of the acquisition process to support the acquisition programs of the Military
Departments, COCOMs, and other DoD Components from a manpower, personnel, and
training readiness perspective.
The acquisition program will characterize training planning, development, and execution
within the Cost Analysis Requirements Description, Life-Cycle Sustainment Plan, and
Manpower Estimate Report, tailored to each document type. These training summaries
will support traceability of planned training across acquisition and capability
documents, and will include logistics support planning for training, training equipment,
and training device acquisitions and installations.
A Special Note on Embedded Training. Both the sponsor and the program manager
will provide analysis that demonstrates careful consideration of the use of embedded
training as defined in DoD Directive 1322.18. The sponsor's decision to use
embedded training will be made very early in the capabilities assessment process.
Analysis will be conducted to compare embedded training with more traditional
training media (e.g., simulator-based training, traditional classroom instruction, and/or
maneuver training) in terms of the system's total operating cost. The analysis
will compare the costs and the impact of embedded training (e.g., training operators and
maintenance personnel on site compared to off-station travel to a temporary duty
location for training).
The program manager employs human factors engineering to design systems that
require minimal manpower; provide effective training; can be operated, maintained, and
supported by users; and are suitable (habitable and safe, with minimal environmental
and occupational health hazards) and survivable (for both the crew and equipment), in
accordance with DoD Instruction 5000.02.
The human factors that need to be considered in the integration are discussed below:
6.3.4.2. Overview
Human factors are the end-user cognitive, physical, sensory, and team dynamic abilities
required to perform system operational, maintenance, and support job tasks. Human
factors engineers contribute to the acquisition process by ensuring that the program
manager provides for the effective utilization of personnel by designing systems that
capitalize on and do not exceed the abilities (cognitive, physical, sensory, and team
dynamic) of the user population. The human factors engineering community works to
integrate the human characteristics of the user population into the system definition,
design, development, and evaluation processes to optimize human-machine
performance for operation, maintenance, and sustainment of the system.
    •    Cognitive interfaces (decision rules, decision support systems, provision for
         maintaining situational awareness, mental models of the tactical environment,
         provisions for knowledge generation, cognitive skills and attitudes, memory aids);
         and,
    •    Physical interfaces (hardware and software elements designed to enable and
         facilitate effective and safe human performance such as controls, displays,
         workstations, worksites, accesses, labels and markings, structures, steps and
         ladders, handholds, maintenance provisions, etc.).
6.3.4.3. Parameters/Requirements
Human factors requirements, objectives, and thresholds should be derived from each of
the Human Systems Integration (HSI) domains and should provide for the effective
utilization of personnel through the accommodation of the cognitive, physical, and
sensory characteristics that directly enhance or constrain system performance. In many
cases, interface design limitations may require tradeoffs in several of the other
domains, and vice versa.
For example, personnel may be required to operate in noisy environments where
weapons are being fired, or on an overcast, moonless night with no auxiliary
illumination. Visual acuity or other sensory requirements may limit the target audience
for certain specialties.
HFE plays an important role in each phase of the acquisition cycle, to include
requirements development, system definition, design, development, evaluation, and
system support for reliability and maintainability in the field. To realize the potential of
HFE contributions, HFE must be incorporated into the design process at the earliest
stages of the acquisition process (i.e., during the Materiel Solution Analysis and
Technology Development phases). It should be supported by inputs from the other
Human Systems Integration (HSI) domains as well as the other Systems Engineering
processes. The right decisions about the human-machine interfaces early in the design
process will optimize human and, hence, total system performance. HFE participation
continues through each succeeding acquisition phase, working tradeoffs based on
inputs from the other HSI domains and on the hardware and software designs and
adaptations. HFE practitioners provide expertise that includes design criteria, analysis
and modeling tools, and measurement methods that will help the program office design
systems that are operationally suitable, safe, survivable, effective, usable, and
cost-effective.
In any system acquisition process, it is important to recognize the differences between
the competencies (skills and knowledge) required for the various warfighters.
Application of HFE processes will lead to an understanding of the competencies needed
for the job, and will help identify whether requirements for knowledge, skills, and
abilities (KSAs) exceed what the user can provide and whether the deficiency will lead
to a training or operational problem. HFE tools and techniques can be used to identify
the KSAs of the target audience and to account for different classes and levels of users
and the need for various types of information products, training, training systems, and
other aids.
While it is critical to understand the information-processing and net-centric
requirements of the system, it is equally important to understand the factors affecting
the format and display of the data presented to the user, to avoid cognitive overload.
This applies equally to the system being designed and to the systems that will interface
with it. The system should not place undue workload or other stress on systems with
which it must interface.
Human Factors Engineering (HFE) principles, guidelines, and criteria should be applied
during development and acquisition of military systems, equipment, and facilities to
integrate personnel effectively into the design of the system. An HFE effort should be
provided to: (a) develop or improve all human interfaces of the system; (b) achieve
required effectiveness of human performance during system operation, maintenance,
support, control, and transport; and (c) make economical demands upon personnel
resources, skills, training, and costs. The HFE effort should be well integrated with the
other Human Systems Integration domain participation, and should include, but not
necessarily be limited to, active participation in the following three major interrelated
areas of system development.
6.3.4.5.1. Analysis
Identify the functions that must be performed by the system in achieving its mission
objectives and analyze them to determine the best allocation to personnel, equipment,
software, or combinations thereof. Allocated functions should be further dissected to
define the specific tasks that must be performed to accomplish the functions. Each task
should be analyzed to determine the human performance parameters; the system,
equipment, and software capabilities; and the operational / environmental conditions
under which the tasks will be conducted. Task parameters should be quantified where
possible, and should be expressed in a form that permits effectiveness studies of the
human-system interfaces in relation to the total system operation. Human Factors
Engineering high-risk areas should be identified as part of the analysis. Task analysis
should include maintenance and sustainment functions performed by crew and support
facilities. Analyses should be updated as required to remain current with the design
effort.
Human Factors Engineering (HFE) should be applied to the design and development of
the system equipment, software, procedures, work environments, and facilities
associated with all functions requiring personnel interaction. This HFE effort should
convert the mission, system, and task analysis data into a detailed design and
development plan to create human-system interfaces that operate within human
performance capabilities, facilitate and optimize human performance in meeting system
functional requirements, and accomplish the mission objectives.
Human Factors Engineering (HFE) and the evaluation of all human interfaces should be
integrated into engineering design and development tests, contractor demonstrations,
flight tests, acceptance tests, other development tests and operational testing.
Compliance with human interface requirements should be tested as early as possible.
T&E should include evaluation of maintenance and sustainment activities and
evaluation of the dimensions and configuration of the environment relative to criteria for
HFE and each of the other Human Systems Integration domains . Findings, analyses,
evaluations, design reviews, modeling, simulations, demonstrations, and other early
engineering tests should be used in planning and conducting later tests. Test planning
should be directed toward verifying that the system can be operated, maintained,
supported, and controlled by user personnel in its intended operational environment
with the intended training. Test planning should also consider data needed or provided
by operational test and evaluation (see section 9.5.2).
6.3.4.6. Life-Cycle Sustainment Plan
The program manager should summarize the steps planned to be taken (e.g.,
government and contract deliverables) to ensure human factors engineering (HFE) is
employed during systems engineering over the life of the program to provide for
effective human-system interfaces and meet HFE and other Human Systems
Integration requirements.
The military departments/services each treat the three Human Systems Integration
(HSI) domains of Environment, Safety, and Occupational Health differently, based on
oversight and reporting responsibility within each of the services. DoD ESOH guidance
for systems acquisition programs can be found in Chapter 4, Systems Engineering,
section 4.3.18.9, and in the ESOH Special Interest Area on the Acquisition Community
Connection. What is important to the HSI practitioner and the systems engineer is that
these three domains are of vital importance to the HSI effort and must be integrated
within it. While the ESOH communities have unique reporting requirements that trace to
national-level mandates, the importance of integrating these domains in the HSI
construct cannot be overemphasized. The human aspect brings a host of issues to a
system that must be accommodated in each of these three areas, and each must be
considered in consonance with the other HSI domains. How they are considered in an
integrated manner is left to the Program Manager and Systems Engineering.
Environment includes the natural and manmade conditions in and around the system
and the operational context within which the system will be operated and supported.
This "environment" affects the human's ability to function as a part of the system.
Safety factors consist of those system design characteristics that serve to minimize the
potential for mishaps causing death or injury to operators, maintainers and supporters
or threaten the survival and/or operation of the system. Prevalent issues include factors
that threaten the safe operation and/or survival of the platform; walking and working
surfaces including work at heights; pressure extremes; and control of hazardous energy
releases such as mechanical, electrical, fluids under pressure, ionizing or non-ionizing
radiation (often referred to as "lock-out/tag-out"), fire, and explosions.
Occupational health factors are those system design features that serve to minimize the
risk of injury, acute or chronic illness, or disability; and/or reduce job performance of
personnel who operate, maintain, or support the system. Prevalent issues include noise,
chemical safety, atmospheric hazards (including those associated with confined space
entry and oxygen deficiency), vibration, ionizing and non-ionizing radiation, and human
factors issues that can create chronic disease and discomfort such as repetitive motion
diseases. Many occupational health problems, particularly noise and chemical
management, overlap with environmental impacts. Human factors stresses that create
risk of chronic disease and discomfort overlap with occupational health considerations.
Environment, safety and health hazard parameters should address all activities inherent
to the life cycle of the system, including test activity, operations, support, maintenance,
and final demilitarization and disposal. Environment, safety and health hazard
requirements should be stated in measurable terms, whenever possible. For example, it
may be appropriate to establish thresholds for the maximum level of acoustic noise,
vibration, acceleration shock, blast, temperature or humidity, or impact forces, etc., or
"safeguards against uncontrolled variability beyond specified safe limits," where the
Capability Documents specify the "safe limits." Safety and health hazard requirements
often stem from human factor issues and are typically based on lessons learned from
comparable or predecessor systems. For example, both physical dimensions and
weight are critical safety requirements for the accommodation of pilots in ejection seat
designs. Environment, safety and health hazard thresholds are often justified in terms of
human performance requirements, because, for example, extreme temperature and
humidity can degrade job performance and lead to frequent or critical errors. Another
methodology for specifying safety and health requirements is to specify the allowable
level of residual risk as defined in MIL-STD-882D, "DoD Standard Practice for System
Safety," for example, "There shall be no high or serious residual risks present in the
system."
The Human Systems Integration Plan should recognize the appropriate timing for the
PESHE and define how the program intends to ensure the effective and efficient flow of
information to and from the ESOH domain experts to work the integration of
environment, safety and health considerations into the systems engineering process
and all its required products.
figures to support trade-off analysis. There are nine health hazard issues typically
addressed in a health hazard analysis (HHA):
    •    Acoustical Energy. The potential energy that transmits through the air and
         interacts with the body to cause hearing loss or damage to internal organs.
    •    Biological Substances. An infectious substance generally capable of causing
         permanent disability or life-threatening or fatal disease in otherwise healthy
         humans.
    •    Chemical Substances. The hazards from excessive airborne concentrations of
         toxic materials contracted through inhalation, ingestion, and skin or eye contact.
    •    Oxygen Deficiency. The displacement of atmospheric oxygen from enclosed
         spaces or at high altitudes.
    •    Radiation Energy. Ionizing: the radiation causing ionization when interacting
         with living or inanimate matter. Non-ionizing: the emissions from the
         electromagnetic spectrum with insufficient energy to produce ionization of
         molecules.
    •    Shock. The mechanical impulse or impact on an individual from the acceleration
         or deceleration of a medium.
    •    Temperature Extremes and Humidity. The human health effects associated
         with high or low temperatures, sometimes exacerbated by the use of a materiel
         system.
    •    Trauma. Physical: the impact to the eyes or body surface by a sharp or blunt
         object. Musculoskeletal: the effects on the musculoskeletal system while lifting
         heavy objects.
    •    Vibration. The contact of a mechanically oscillating surface with the human
         body.
6.3.6. Survivability
Survivability factors consist of those system design features that reduce the risk of
fratricide, detection, and the probability of being attacked; and that enable the crew to
withstand natural and man-made hostile environments without aborting the mission or
suffering acute or chronic illness, disability, or death. Survivability attributes, as described
in the Joint Military Dictionary (JP 1-02) , are those that contribute to the survivability of
manned systems. In the HSI construct, the human is considered integral to the system
and personnel survivability should be considered in the encompassing "system" context.
NBC survivability, by definition, includes the instantaneous, cumulative, and
residual effects of NBC weapons upon the system, including its personnel. It may be
appropriate to require that the system "permit performance of mission-essential
operations, communications, maintenance, re-supply and decontamination tasks by
suitably clothed, trained, and acclimatized personnel for the survival periods and NBC
environments required by the system."
The consideration of survivability should also include system requirements to ensure the
integrity of the crew compartment and rapid egress when the system is damaged or
destroyed. It may be appropriate to require that the system provide for adequate
emergency systems for contingency management, escape, survival, and rescue.
The Joint Capabilities Integration and Development System capability documents define
the program's combat performance and survivability needs. Consistent with those
needs, the program manager should establish a survivability program. This program,
overseen by the program manager, should seek to minimize (1) the probability of
encountering combat threats, (2) the severity of potential wounds and injury incurred by
personnel operating or maintaining the system, and (3) the risk of potential fratricidal
incidents. To maximize effectiveness, the program manager should assess survivability
in close coordination with systems engineering and test and evaluation activities.
During early stages of the acquisition process, sufficient information may not always be
available to develop a complete list of survivability issues. An initial report is prepared
listing those identified issues and any findings and conclusions. Classified data and
findings are to be appropriately handled according to each DoD Component's
guidelines. Survivability issues typically are divided into the following components:
    •    Reduce Detectability. Analysts should consider the system's detectable
         signatures, including radar signatures, and the methods that may be utilized
         by enemy equipment and personnel to exploit them. Methods of reducing
         detectability could include camouflage, low-observable technology, smoke,
         countermeasures, signature distortion, training, and/or doctrine.
    •    Reduce Probability of Attack. Analysts should seek to reduce the probability of
         attack by avoiding appearing as a high-value target and by actively preventing or
         deterring attack through warning sensors and use of active countermeasures.
    •    Minimize Damage if Attacked. Analysts should seek to minimize damage, if
         attacked, by: 1) designing the system to protect the operators and crewmembers
         from enemy attacks; 2) improving tactics in the field so survivability is increased;
         3) designing the system to protect the crew from on-board hazards in the event
         of an attack (e.g., fuel, munitions, etc.); and, 4) designing the system to minimize
         the risk to supporting personnel if the system is attacked. Subject matter experts
         in areas such as nuclear, biological and chemical warfare, ballistics, electronic
         warfare, directed energy, laser hardening, medical treatment, physiology, human
         factors, and Information Operations can identify additional issues.
    •    Minimize Injury. Analysts should seek to: 1) minimize combat injuries
         caused by enemy weapons; 2) minimize the combat-damaged system's potential
         sources and types of injury to both its crew and supported troops as it is
         used and maintained in the field; 3) maximize the system's ability to prevent
         further injury to the warfighter after an attack; and 4) maximize the
         system's ability to support treatment and evacuation of injured personnel.
         Combat-caused injuries and other possible injuries are addressed in this
         portion of personnel survivability, along with the different perspectives on
         potential mechanisms for reducing damage. Evacuation capability and personal
         equipment needs (e.g., uniform straps to pull a crew member through a small
         evacuation port) are addressed here.
    •    Minimize Physical and Mental Fatigue. Analysts should seek to minimize
         injuries that can be directly traced to physical or mental fatigue. These types of
         injuries can be traced to complex or repetitive tasks, physically taxing operations,
         sleep deprivation, or high stress environments.
    •    Survive Extreme Environments. This component addresses issues that will
         arise once the warfighter evacuates or is forced from a combat-affected system
         such as an aircraft or watercraft and must immediately survive extreme
         conditions encountered in the sea or air until rescued or an improved situation on
         land is reached. Dependent upon requirements, this may also include some
         extreme environmental conditions found on land, but generally this component is
         for sea and air where the need is immediate for special consideration to maintain
         an individual's life. Survival issues for downed pilots behind enemy lines should
         be considered here.
The program manager should summarize plans for survivability in the Life-Cycle
Sustainment Plan and address survivability risks and plans for risk mitigation. If the
system or program has been designated by Director, Operational Test & Evaluation, for
live fire test and evaluation (LFT&E) oversight, the program manager should integrate
T&E to address crew survivability issues into the LFT&E program to support the
Secretary of Defense LFT&E Report to Congress (10 USC 2366). The program
manager should address special equipment or gear needed to sustain crew operations
in the operational environment.
6.3.7. Habitability
Habitability factors are those living and working conditions that are necessary to sustain
the morale, safety, health, and comfort of the user population. They directly contribute to
personnel effectiveness and mission accomplishment, and help preclude recruitment
and retention problems. Examples include: lighting, space, ventilation, and sanitation;
noise and temperature control (i.e., heating and air conditioning); religious, medical, and
food services availability; and berthing, bathing, and personal hygiene.
While a system, facility, and/or service should not be designed solely around optimum
habitability factors, habitability cannot be systematically traded off in support of
other readiness elements without eventually degrading mission performance.
6.4. Human Systems Integration (HSI) throughout the System Life Cycle
6.4.3.2.2. Allocations
6.4. Human Systems Integration (HSI) throughout the System Life Cycle
demonstrations.
It is equally important that this research work be rolled into the front-end analyses that
lead to capability requirements. HSI considerations should be carefully examined during
the capabilities-based assessment, and the planning for and execution of the Analyses
of Alternatives. Failure to examine the human-centric issues up front may unduly
complicate integration in a defined materiel solution.
The Initial Capabilities Document may seek to establish a new capability, improve an
existing capability, or exploit an opportunity to reduce costs or enhance performance.
The Initial Capabilities Document describes the key boundary conditions and
operational environments that impact how the system is employed to satisfy the mission
need. Key boundary conditions include critical manpower, personnel, training,
environment, safety, occupational health, human factors, habitability, and survivability
factors that have a major impact on system performance and life-cycle costs. The
Doctrine, Organization, Training, Materiel, Leadership and Education, Personnel, or
Facilities considerations and implications section of the Initial Capabilities Document
should discuss all relevant domains of HSI.
As plans for the system mature, the capabilities documents should become more
specific and reflect the integration of program objectives. The program manager should
work with Human Systems Integration (HSI) practitioners and user representatives to
translate HSI thresholds and objectives in the capabilities documents into quantifiable
and measurable system requirements. The program manager should refine and
integrate operational and design requirements so they result in the proper balance
between performance and cost, and keep programs affordable. Additionally, system
requirements should serve as the basis for developing engineering specifications, and
should be reflected in the statement of work, contracts, Test and Evaluation Master
Plan, and other program documentation. Over the course of the acquisition process, as
trade-offs are made and plans for the system design mature, the capabilities documents
should be updated to reflect a more refined and integrated set of parameters.
system or an increment of capability; reduce integration and manufacturing risk
(technology risk reduction occurs during Technology Development); ensure operational
supportability with particular attention to reducing the logistic footprint; implement
Human Systems Integration; design for producibility; ensure affordability and protection
of critical program information by implementing appropriate techniques such as anti-
tamper; and demonstrate system integration, interoperability, safety and utility.
Human Systems Integration considerations should be clearly defined and given proper
weight in solicitations and proposal evaluation guidelines provided to the government
evaluation team. The record of contractors in Human Systems Integration should be an
element of bid selection and contract performance criteria.
Once parameters are established in the Initial Capabilities Document and Capability
Development Document , Requirements Definition Package or Capability Drop, it is the
program manager's responsibility to ensure that they are addressed during the systems
engineering process, included in the Human Systems Integration (HSI) Plan and the
Systems Engineering Plan (SEP), and properly considered during cost/performance
trade-off analyses. Consistent with paragraph E1.1.29 of DoD Directive 5000.01, the
program manager applies HSI to optimize total system performance, operational
effectiveness, suitability, survivability, safety, and affordability. Program managers
should consider supportability, life-cycle costs, performance, and schedule as
comparable factors in making program decisions. Each program is required to have a comprehensive plan
for HSI. It is important that this plan be included in the SEP or as a stand-alone HSI
Plan as the program(s) may require. As required by DoD Instruction 5000.02, the
program manager should take steps (e.g., contract deliverables and
Government/contractor Integrated Product Teams) to ensure human factors engineering
/cognitive engineering is employed during systems engineering. These steps should
occur from the Materiel Solution Analysis phase through the life of the program to
provide for effective human-machine interfaces, meet HSI requirements, and (as
appropriate) support a system-of-systems acquisition approach. The program manager
should also ensure that HSI requirements are included in performance specifications
and test criteria. Manpower, Personnel, and Training functional representatives, as user
representatives, participate in the systems engineering process to help produce the
proper balance between system performance and cost and to ensure that requirements
remain at affordable levels. Manpower, personnel, training, and supportability analyses
should be conducted as an integral part of the systems engineering process throughout
the acquisition life cycle, beginning with Materiel Solution Analysis and continuing
throughout program development.
Human Systems Integration (HSI) plays a major role in the design process. Front-end
analysis methods, such as those described in MIL-HDBK-46855A , should be pursued
to maximize the effectiveness of the new system. Initial emphasis should be placed on
"lessons learned" from legacy, predecessor or comparable systems to help identify and
eliminate characteristics in the new system that require excessive cognitive, physical, or
sensory skills or high aptitudes; involve complex fault location or workload intensive
tasks; necessitate excessive training; require proficiency training; or result in frequent or
critical errors or safety/health hazards. Placing an emphasis on the "human-in-the-loop"
ensures that systems are designed to operate consistent with human performance
capabilities and limitations, meet system functional requirements, and fulfill mission
goals with the least possible demands on manpower, personnel, and training. Moreover,
sound HSI applications can minimize added costs that result when systems have to be
modified after they are fielded in order to correct performance and safety issues.
6.4.3.2.2. Allocations
It is primarily the responsibility of the program manager, with the assistance of the
Integrated Product Teams, to establish performance specifications, design criteria
standards, interface standards, and data specifications in the solicitation and resulting
contract. Strong consideration should be given to establishing standards when uniform
configuration is necessary for ease of operation, safety, or training purposes. For
instance, a control panel or avionics suite may need to be standardized to enhance the
ability of the user to access information and to respond quickly in an emergency
situation. Standard features preclude the need to teach multiple (or conflicting)
responses to similar tasks. Standardization is particularly important when a standard
performance is required for safety reasons. For instance, rapid ejection from the cockpit
should require standard procedures and tasks. If there are unique health hazard or
survivability requirements, such as vibration or shock tolerances, extended temperature
range, or noise levels, standardization may be the most efficient way to ensure that the
system meets those special requirements. Preference should be given to specifications
and standards developed under the Defense Standardization Program. Regulatory
occupational exposure standards create performance thresholds. However, use of
guidance exposure criteria and ergonomic/Human Systems Integration guidelines
should be considered to ensure personnel protection, promote efficiency, and anticipate
more stringent standards that are likely to be required during the life cycle of the
system.
Performance standards for operators and maintainers, both as individuals and as teams,
are derived from the performance requirements of the total system. For example, human
performance requirements (e.g., completion times or success rates) presume that in
order for the total system to achieve specified performance levels, the human will have
to complete tasks or achieve performance objectives within specified confidence levels
(usually expressed in terms of percent of actions completed within a specified
timeframe and/or error limit). The training/instructional system should be developed to
ensure that operators can meet or exceed the personnel performance levels required to
operate/maintain the systems. Additionally, manpower should be determined based on
these same performance requirements. Operational tests should also be based on the
same criteria.
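A requirement of the kind described above, such as a given percent of actions completed within a specified timeframe, can be checked against observed data with a short script. The task times, limit, and threshold below are hypothetical assumptions for illustration.

```python
# Hypothetical check of a human-performance requirement of the kind described
# above, e.g. "95% of fault-isolation actions completed within 120 seconds."
# Task times, the limit, and the required fraction are illustrative assumptions.

def meets_requirement(completion_times_s, limit_s, required_fraction):
    """True if the observed fraction of actions completed within the time
    limit meets or exceeds the required confidence level."""
    within = sum(1 for t in completion_times_s if t <= limit_s)
    return within / len(completion_times_s) >= required_fraction

observed = [95, 110, 80, 130, 100, 90, 115, 105, 70, 98]  # seconds per action
print(meets_requirement(observed, limit_s=120, required_fraction=0.95))  # False: 9 of 10
```

The same criterion can then drive training objectives, manpower determinations, and operational test pass/fail thresholds, as the paragraph above indicates.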
The objective of the Operations and Support (O&S) phase is the execution of a support
program that meets operational support performance requirements and sustains the
system in the most cost-effective manner over its life cycle. As required by DoD Directive 5000.01, planning for O&S
should begin as early as possible in the acquisition process. Efforts during the O&S
phase should be directed towards ensuring that the program meets and has the
resources to sustain the threshold values of all support performance requirements.
Once the system is fielded or deployed, a follow-on operational testing program, to
assess performance, quality, compatibility, and interoperability, and identify deficiencies,
should be conducted, as appropriate. Post-fielding verification of the Manpower
Estimate, and information resulting from training exercises, readiness reports, and
audits, can also be used to assess the operational capability of the system. During fielding, deployment,
and throughout operational support, the need for modifications to the system should be
assessed.
Components should plan programs based on realistic projections of the dollars and
manpower likely to be available in future years. When major manpower increases are
required to support the program, or major manpower shortfalls exist, they will be
identified as risks in the Manpower Estimate, and addressed in the risk assessment
section of the Acquisition Strategy. Program risks that result from manpower shortfalls
should be addressed in terms of their impact on readiness, operational availability, or
reduced combat capability.
The following DoD Directives and Instructions provide policy and direction:
The following military standards (MIL-STD), DoD Handbooks (DOD-HDBK), and Military
handbooks (MIL-HDBK) can be used to support Human Systems Integration analysis:
    •    MIL-PRF-29612, "Performance Specification, Training Data Products"
    •    "A Guide for Early Embedded Training Decisions," U.S. Army Research Institute
         for the Behavioral and Social Sciences Research Product 96-06
DEFENSE ACQUISITION GUIDEBOOK
Chapter 7 - Acquiring Information Technology
7.0. Overview
7.1. Introduction
7.8. The Clinger-Cohen Act (CCA) -- Subtitle III of Title 40 United States Code
(U.S.C.)
7.0. Overview
7.0.1. Purpose
7.0.2. Contents
7.0.1. Purpose
The goal of this chapter is to help program managers (PMs) and Sponsors/Domain
Owners implement Department of Defense (DoD) policies intended to achieve
fundamentally joint, net-centric, distributed forces capable of rapid decision superiority
and massed effects across the battle space. This chapter explains how the DoD is using
a net-centric strategy to transform DoD warfighting, business, and intelligence
capabilities. The chapter provides descriptions and explanations of many of the
associated topics and concepts. This chapter also discusses many of the activities that
enable the development of net-centric systems; however, not all activities are the direct
responsibility of the PM. Many activities reflect Department-level effort that occurs prior
to, or outside of, the acquisition process. The detailed discussions of such a broad set
of activities are presented here to help the PM understand the context of the capabilities
described in the Joint Capabilities Integration and Development System (JCIDS)
documents and required of the system under development.
7.0.2. Contents
This chapter contains ten sections that present the Program Manager with a
comprehensive review of topics, concepts, and activities associated with the acquisition
of Information Technology (IT), including National Security Systems (NSS).
Section 7.1, "Introduction," explains net-centric information sharing in the context of the
discussions and requirements outlined in the various other sections of this chapter.
Section 7.2, "DoD Information Enterprise (DoD IE)," explains several important
concepts that provide a foundation for acquiring net-centric Information Technology
(including NSS). The overarching concept is that the DoD Enterprise Architecture (DoD
EA) is used to describe and document current and desired relationships among
warfighting operations, business, and management processes, the entities involved,
and the information used. The IT architectures (i.e., IT solutions) are then aligned with
the DoD EA.
The section also addresses the DoD Architecture Framework (DoDAF) views that
comprise the architectures making up the DoD EA, and the DoD EA as a whole:
The section discusses the DoD IEA and its role in helping PMs and Sponsors/Domain
Owners describe their transition from the current environment to the future net-centric
environment. Sections 7.3 through 7.10 elaborate on specific areas on which the
Sponsors/Domain Owners and PMs should focus as they work to deliver and improve
the reach, richness, agility, and assurance of net-centric capabilities.
Section 7.4, "Sharing Data, Information, and Information Technology (IT) Service,"
provides guidance on implementing DoD Net-centric Data Strategy and Goals, and
outlines Data, Information, and IT Services Sharing tasks as they relate to the
acquisition process.
Section 7.8, "Clinger-Cohen Act," helps PMs and Sponsors/Domain Owners understand
how to implement Subtitle III of title 40 United States Code (formerly known as division
E of the Clinger-Cohen Act (CCA) and hereinafter referred to as "Title 40/CCA") and
associated regulatory requirements.
Section 7.9, "Post Deployment Reviews," discusses how the Department of Defense
(DoD) uses the Post Implementation Review to inform Sponsors of the degree to which
their IT/NSS investments closed the needed capability gaps.
Section 7.10, "Commercial, Off-The-Shelf (COTS) Solutions," provides insight into DoD
guidance regarding acquisition of COTS software products.
In summary, this chapter should help PMs and Sponsors/Domain Owners understand
and apply the tools of the DoD EA so that they can more effectively:
    •    Describe and measure the degree to which their programs are interoperable and
         supportable with the DoD IE.
    •    Ensure their programs employ and institutionalize approaches that make data
         visible, accessible, understandable, trusted, interoperable and responsive.
    •    Achieve the Department's objectives for information assurance (IA).
    •    Ensure their programs will have assured interoperable access to electromagnetic
         spectrum.
    •    Achieve these goals within the constraints of the law and where possible, through
         the use of commercially available solutions.
7.1. Introduction
7.1. Introduction
The DoD Transformation Planning Guidance (April 2003) defines the desired outcome
of transformation as "fundamentally joint, network-centric, distributed forces capable of
rapid decision superiority and massed effects across the battle space." The goal of this
chapter is to help PMs and Sponsors/Domain Owners implement the DoD policies that
are intended to achieve this outcome. This introduction briefly explains net-centricity in
context of the requirements outlined in the various other sections of this chapter.
The Department's approach for transitioning to net-centric operations and warfare and
achieving the net-centric information sharing vision focuses on five key areas where
increased attention and investment will bring the most immediate progress towards
realizing net-centric goals:
This approach uses the Information Enterprise (IE) as "the organizing and transforming
construct for managing information technology throughout the Department." It envisions
moving to trusted network-centric operations through the acquisition of services and
systems that are secure, reliable, interoperable, and able to communicate across a
universal Information Technology infrastructure, including NSS. This Information
Technology infrastructure includes data, information, processes, organizational
interactions, skills, and analytical expertise, as well as systems, networks, and
information exchange capabilities. The rest of this chapter describes the concepts,
topics, and activities to achieve this transformation.
7.2. DoD Information Enterprise
7.2.1. Introduction
7.2.1. Introduction
To provide a conceptual framework for this change, the Department has defined a
Department of Defense Information Enterprise (DoD IE) as an organizing construct. The
DoD IE consists of the Department of Defense information assets, processes, activities,
and resources required to achieve an information advantage and share information
across the Department and with mission partners. The DoD IE includes:
    •    The information itself, which is a key asset to the Department, and the
         Department's management of the information life cycle.
    •    The processes, including risk management, associated with managing
         information to accomplish the DoD mission and functions.
    •    Activities related to designing, building, populating, acquiring, managing,
         operating, protecting and defending the information enterprise.
    •    Related information resources such as personnel, funds, equipment, and
         information technology, including national security systems.
The DoD IE vision is to transform the Department into an agile enterprise empowered
by access to and sharing of timely and trusted information. The net-centric vision of the
DoD IE is to function as one unified DoD Enterprise, creating an information advantage
for our people and mission partners by providing:
    •    A rich information sharing environment in which data and services are visible,
         accessible, understandable, and trusted across the enterprise.
    •    An available and protected network infrastructure (the Global Information Grid
         (GIG)) that enables responsive information-centric operations using dynamic and
         interoperable communications and computing capabilities.
PMs and Sponsors/Domain Owners should use this vision to help guide their acquisition
programs. This vision requires a comprehensive information capability that is global,
robust, survivable, maintainable, interoperable, secure, reliable, and user-driven to be
operationally suitable, safe, effective, usable and affordable across the life cycle of the
systems.
The IT infrastructure of the Department is the GIG. The GIG is the Department's globally
interconnected end-to-end set of information capabilities for collecting, processing,
storing, disseminating, and managing information on demand to warfighters, policy
makers, and support personnel. The GIG includes owned and leased communications
and computing systems and services, software (including applications), data, security
services, other associated services, and National Security Systems. Non-GIG IT
includes stand-alone, self-contained, or embedded IT that is not and will not be
connected to the enterprise network.
priorities to address critical barriers to its realization.
The published DoD IEA describes the integrated Defense Information Enterprise and
the rules for the information assets and resources that enable it. The DoD IEA unifies
the concepts embedded in the Department's net-centric strategies into a common
vision, providing relevance and context to existing policy. The DoD IEA highlights the
key principles, rules, constraints and best practices drawn from collective policy to
which applicable DoD programs, regardless of Component or portfolio, must adhere in
order to enable agile, collaborative net-centric operations. The DoD IEA provides
information for applying it in architecture development and complying with it.
Extracts:
         characterize these interrelationships.
Extract:
The DoD Components, under DoD CIO leadership, are required to develop an Enterprise
Architecture that aligns with the DoD EA, and to use their architecture and the DoD EA
to guide the acquisition of IT.
Each IT acquisition program (or set of programs) is also required to develop a solution
architecture comprised of DoDAF viewpoints determined to meet the needs of the PM
(Fit-for-Purpose) and then use these products over the program life cycle to guide,
monitor, and implement solutions in alignment with the DoD Enterprise Architecture as
described in the DoD IEA.
Using these architectures and plans, the DoD CIO, in collaboration with the Under
Secretary of Defense for Acquisition, Technology and Logistics (USD(AT&L)) and
portfolio managers will conduct capability assessments, guide systems development,
and define the associated investment plans as the basis for aligning resources
throughout the Planning, Programming, Budgeting, and Execution process.
It is DoD policy that all Information Technology (IT), including NSS, and major
modifications to existing IT will be compliant with the Title 40/CCA, DoD interoperability
regulations and policies, and the most current version of the DoD Information
Technology Standards Registry (DISR). Establishing interoperability and supportability
in a DoD system is a continuous process that must be managed throughout the life
cycle of the system. The following elements comprise the Net-Ready Key Performance
Parameter (NR-KPP): 1) compliant architecture; 2) compliance with DoD Net-centric
Data and Services strategies; 3) compliance with applicable GIG Technical Guidance;
4) verification of compliance with DoD information assurance requirements; and 5)
compliance with supportability elements to include spectrum utilization and information
bandwidth requirements, Selective Availability Anti-Spoofing Module (SAASM) and the
Joint Tactical Radio System, as applicable. (See CJCSI 6212.01, Enclosure A,
paragraph 1.e.)
Extract:
This document reissues and renames DoD Directive 8000.01 and cancels 8100.01. The
new DoD Directive 8000.01 requires the following:
    •    The capital planning and investment control process interface with the DoD key
         decision support systems for capability identification; planning, programming,
         budgeting, and execution; and acquisition.
    •    Review of all Information Technology (IT) investments for compliance with
         architectures, IT standards, and related policy requirements.
    •    Acquisition strategies appropriately allocate risk between the Government and
         contractor; effectively use competition; tie contract payments to performance;
         and, where practicable, take maximum advantage of commercial off-the-shelf
         and non-developmental item technology.
    •    Information solutions structured in useful segments, narrow in scope and brief in
         duration; each segment solves a specific part of the overall mission problem and
         delivers a measurable net benefit independent of future segments.
       the Federal Enterprise Architecture (FEA).
    5. An architecture is considered a strategic information asset and should be
       appropriately secured, shared and made available to any DoD user or mission
       partner to the maximum extent allowed by law and DoD policy.
Detailed compliance requirements for the DoD EA are contained in the DoD IEA. To
comply with the DoD EA, an information technology (IT)-based initiative or an
acquisition program, throughout its life cycle should:
         Architectures developed in the DoD are more easily leveraged when they are
         widely visible and accessible across DoD. Widely visible and accessible
         architectures result in increased information sharing, reuse, and a more common
         understanding of the bigger picture. A fully federated EA can only be realized if
         all architectures in DoD are properly registered in DARS with appropriate links
         and relationships. DARS is located at https://dars1.army.mil/IER2/ and includes a
         tutorial for the registration process.
    •    Mandatory Core Designated DoD Enterprise Services are common, globally-
         accessible services designated by the DoD CIO and mandated for use by all
         programs and initiatives. No capability comparable to the Mandatory Core
         Designated DoD ES is to be developed unless there is a waiver granted by the
         DoD CIO.
7.2.3.3. DoD Chief Information Officer (CIO) Use of the DoD Information Enterprise
Architecture (IEA)
The DoD CIO uses the DoD Information Enterprise Architecture (IEA) in all three of the
major decision processes of the Department.
The DoD CIO uses the DoD IEA throughout the processes included in operating the
Joint Capabilities Integration and Development System (JCIDS) to:
The DoD CIO uses the DoD IEA throughout the Planning, Programming, Budgeting and
Execution (PPBE) process to:
    •    Review and provide recommendations for development of the Guidance for the
         Development of the Force and the Joint Programming Guidance.
    •    Provide recommendations to the Senior Level Review Group relating to
         Information Technology (IT) (including National Security Systems (NSS)),
         interoperability, and Information Assurance (IA).
    •    Review and evaluate Program Change Proposals and Budget Change Proposals
         relating to IT (including NSS), interoperability, and IA.
    •    Provide recommendations for Program Objective Memorandum planning and
         programming advice.
Finally, the DoD CIO uses the DoD IEA throughout the Defense Acquisition Process to:
The following sections outline steps that the DoD Components, Combat Developers,
Sponsors, Domain Owners, DoD Agencies, Program Managers, and/or other assigned
managers should take to facilitate DoD Information Enterprise Architecture (IEA)
compliance and net-centric information sharing when acquiring Information Technology-
enabled capabilities that will interoperate within the Global Information Grid.
At Milestones A, B, and C, architects should ensure that any new architectural models
they develop conform to the current version of the DoD Architecture Framework
(DoDAF). The latest version of the DoDAF is always available on the DoD Architecture
Registry System (DARS) website, URL https://dars1.army.mil/. Existing architecture
models that require an update for reasons other than a DoDAF version change should
include the updates necessary to conform with the most current DoDAF. Stable
architecture models that do not otherwise require an update do not need to be updated
solely because the DoDAF has changed. Also, in accordance with DoD policy, all AV-1s must be
registered in DARS. Instructions on how to do this are on the DARS portal.
7.2.4.1. Before Milestone A
Ensure that appropriate steps are taken to prepare or update a concept of operations
and an operational view (High-level Operational Concept Description, OV-1) of the
integrated (solutions) architecture for key mission areas and business processes using
the DoD Architecture Framework (DoDAF) and the guidance in CJCS Instruction
6212.01. The Initial Capabilities Document (ICD) should reflect this architecture work,
as prescribed by CJCS Instruction 3170.01 and in the format provided in the JCIDS
Manual. It also supports analysis of alternatives, business process reengineering
efforts, development of the acquisition strategy and acquisition information assurance
(IA) strategy, and provides key artifacts that support development of the Information
Support Plan. Ensure that architectures adhere to the DoD net-centric strategies.
Ensure that the mandatory architecture views align with the DoD Information Enterprise
Architecture (IEA) and show linkage to parent enterprise architectures, where available,
and DoD Component and DoD-level Capability Portfolio Management architecture
descriptions as they emerge.
Compliance with the DoD IEA is mandatory for any platform, program of record, system,
subsystem, component, or application that conducts communications. Business
systems should align with the Business Enterprise Architecture.
Build or update the architecture and supporting views (All Views, Capability Views, Data
and Information Views, Operational Views, Project Views, Services Views, Systems
Views, and Standards Views).
Begin development of the Information Support Plan for review. Use the criteria in CJCS
Instruction 6212.01 to guide the acquisition of net-centric capabilities.
Update the architecture and supporting views (All Viewpoint, Capability Views, Data and
Information Views, Operational Views, Project Views, Services Views, Systems Views,
and Standards Views) and ensure changes are reflected in the Capability Production
Document, as prescribed by CJCS Instruction 3170.01 in the format provided in the
JCIDS Manual, and in the Net-Ready Key Performance Parameter (NR-KPP). If the
program is entering the acquisition process at Milestone C, develop an NR-KPP using
guidance in CJCS Instruction 6212.01.
Use the criteria in CJCS Instruction 6212.01 to ensure services and data products
delivered by the acquisition align with the Department's objectives for net-centricity.
Prepare and submit the Information Support Plan for final review.
Address all information exchange requirements as part of the Information Support Plan
and the Information Technology and National Security Systems Interoperability
Certification processes.
The following paragraphs describe the major sources of guidance and tools related to
the DoD Enterprise Architecture and supporting DoD strategies for implementing the
architecture in information technology (including National Security Systems) programs.
Program Managers and sponsors/domain owners should use the guidance, tools, and
strategies outlined below throughout a program's life cycle to meet a variety of statutory
and regulatory requirements.
DoDAF has been designed to meet the specific business and operational needs of the
DoD. It defines a way of representing an enterprise architecture that enables
stakeholders to focus on specific areas of interest in the enterprise, while retaining
sight of the big picture. To assist decision-makers, DoDAF provides the means of
abstracting essential information from the underlying complexity and presenting it in a
way that maintains coherence and consistency. One of the principal objectives is to
present this information in a way that is understandable to the many stakeholder
communities involved in developing, delivering, and sustaining capabilities in support of
the stakeholder's mission. It does so by dividing the problem space into manageable
pieces, according to the stakeholder's viewpoint, further defined as DoDAF-described
Models.
Each viewpoint has a particular purpose, and usually presents one or a combination of
the following:
    •    The All Viewpoint describes the overarching aspects of architecture context that
         relate to all viewpoints.
    •    The Capability Viewpoint articulates the capability requirements, the delivery
         timing, and the deployed capability.
    •    The Data and Information Viewpoint articulates the data relationships and
         alignment structures in the architecture content for the capability and operational
         requirements, system engineering processes, and systems and services.
    •    The Operational Viewpoint includes the operational scenarios, activities, and
         requirements that support capabilities.
    •    The Project Viewpoint describes the relationships between operational and
         capability requirements and the various projects being implemented. The Project
         Viewpoint also details dependencies among capability and operational
         requirements, system engineering processes, systems design, and services
         design within the Defense Acquisition System process. An example is the
         V-charts in Chapter 4 of the Defense Acquisition Guidebook.
    •    The Services Viewpoint is the design for solutions articulating the Performers,
         Activities, Services, and their Exchanges, providing for or supporting operational
         and capability functions.
    •    The Standards Viewpoint articulates the applicable operational, business,
         technical, and industry policies, standards, guidance, constraints, and forecasts
         that apply to capability and operational requirements, system engineering
         processes, and systems and services.
    •    The Systems Viewpoint, for legacy support, is the design for solutions
         articulating the systems, their composition, interconnectivity, and context
         providing for or supporting operational and capability functions.
                              Figure 7.2.5.1.F1 DoDAF Models Viewpoints
Typically, the Combat Developer (or Domain Owner/Sponsor) is responsible for the
architecture description prior to Milestone B, with the Program Manager assuming
responsibility after Milestone B approval.
7.2.5.2. DoD Information Technology (IT) Standards Registry (DISR)
The DoD IT Standards Registry is an online repository for a minimal set of IT standards
to support interoperability. These standards are used as the "building codes" for all
systems being procured in the DoD. Use of these building codes facilitates
interoperability among systems and integration of new systems into the Information
Enterprise. In addition, the DISR provides the capability to build profiles of standards
that programs will use to deliver net-centric capabilities.
When building systems, requests for proposals (RFPs) and contract statements of work
(SOWs) should be reviewed as part of approved acquisition processes to ensure IT
standards established in Initial Capabilities Documents, Capability Development
Documents, and Capability Production Documents (Intelink account required) are
translated into clear contractual requirements. In addition, RFPs and contract SOWs
should contain additional requirements for contractors to identify instances where cost,
schedule, or performance impacts may preclude the use of IT standards mandated in
DISR. Key net-centric elements that program architectures should focus on include:
    •    Internet Protocol: Ensure data packets are routed across the network, not
         switched via dedicated circuits. Focus on establishing IP as the convergence
         layer.
    •    Secure and Available Communications: Encrypted initially for the core network;
         the goal is edge-to-edge encryption, hardened against denial of service. Focus is
         on a Black (encrypted) Transport Layer to be established through the
         Transformational Communications Architecture implementation.
    •    Assured Sharing: Trusted accessibility to net resources (data, services,
         applications, people, devices, collaborative environment, etc.). Focus on assured
         access for authorized users and denied access for unauthorized users.
    •    Quality of Service: Data timeliness, accuracy, completeness, integrity,
         availability, and ease of use. This is envisioned as being measured through the
         Net-Ready Key Performance Parameter. Focus on Service Level Agreements
         and service protocols with quality and performance metrics.
The DoD Net-Centric Data Strategy provides the basis for implementing and sharing
data in a net-centric environment. It describes the requirements for inputting and
sharing data and metadata, and for forming dynamic communities to share data. Program
Managers (PMs) and Sponsors/Domain Owners should comply with the explicit
requirements and the intent of this strategy, which is to share data as widely and as
rapidly as possible, consistent with security requirements. Additional requirements and
details on implementing the DoD Data Strategy are found in section 7.4. (Refer to DoD
Net-Centric Data Strategy, May 2003, issued by the Assistant Secretary of Defense for
Networks and Information Integration/DoD Chief Information Officer (DoD CIO).)
The DoD Net-Centric Services Strategy (NCSS) reflects the DoD's recognition that a
service-oriented approach can result in an explosion of capabilities for our warfighters
and decision makers, thereby increasing operational effectiveness. A service-oriented
approach can accelerate the DoD's ongoing effort to achieve net-centric operations by
ensuring that our warfighters receive the right information, from trusted and accurate
sources, when and where it is needed.
The DoD NCSS builds upon the DoD Net-Centric Data Strategy's goals of making data
assets visible, accessible, and understandable. This strategy establishes services as
the preferred means by which data producers and capability providers can make their
data assets and capabilities available across the DoD and beyond. It also establishes
services as the preferred means by which consumers can access and use these data
assets and capabilities.
The NCSS envisions a net-centric environment that is:
    •    Supported by the required use of a single set of common standards, rules, and
         shared secure infrastructure provided by the Enterprise Information Environment
         Mission Area (EIEMA);
    •    Populated with appropriately secure mission and business services provided and
         used by each Mission Area;
    •    Governed by a cross-Mission Area board, which is chaired by the DoD CIO;
    •    Managed by Global Information Grid (GIG) Network Operations (NetOps).
When this vision is achieved, all members of the DoD will realize significant benefits. A
common infrastructure enables force capabilities to be readily networked in support of
joint warfighting and operations. Interoperability of capabilities is improved when Military
Departments, Agencies, and mission partners create reusable "building blocks" through
the use of services. The coordinated management of this environment under GIG
NetOps provides the necessary situational awareness for joint forces to use the
capabilities that are available. The DoD's commitment to govern this evolution will
greatly improve the ability to respond to evolving operations and missions. (Refer to:
DoD Net-Centric Services Strategy, Strategy for a Net-Centric, Service Oriented DoD
Enterprise, March 2007, issued by the DoD CIO.)
    •    Information sharing problems exist within communities; the solutions must come
         from within those communities.
    •    Data, services and applications must be visible, accessible, understandable, and
         trusted, to include consideration of "the unanticipated user." Not all needs can
         be fully anticipated. There will inevitably be unanticipated situations,
         unanticipated processes, and unanticipated partners. By building capabilities
         designed to support users outside of the expected set, the Department can
         achieve a measure of agility as a competitive advantage over our adversaries.
    •    Enterprise Services providing data or information shall be authoritative and, thus,
         trusted as being accurate, complete and having assured integrity. Authoritative
         information has a pedigree that can be traced to a trusted source.
    •    Enterprise Services must be hosted in environments that meet minimum GIG
         computing node standards in terms of availability, support and backup. A small
         set of Enterprise Services, designated as Core Enterprise Services, is
         mandated for DoD-wide use by the DoD CIO in order to provide enterprise-wide
         awareness, access and delivery of information via the GIG.
Refer to: DoD Information Enterprise Architecture (IEA) issued by DoD CIO.
The DoD IA Strategic Plan defines an enterprise-wide strategic direction for assuring
information and guides planners, programmers, strategists and organizational leaders.
The Net-Centric Enterprise IA Strategy serves as an annex to the DoD IA Strategic
Plan, and focuses specifically on amplifying the goals and approaches for transforming
to the IA essential to safeguarding a net-centric information environment.
The Net-Centric Enterprise IA Strategy is a driver for the IA Component of the Global
Information Grid (GIG) Architecture. The Net-Centric IA Strategy describes the DoD
strategy for integration of IA into the global, net-centric information environment. The
end-to-end IA component of the GIG is comprised of a set of informational documents
and DoD Architecture Framework (DoDAF) products (tools) that define IA constructs as
conceptualized and specified for integration of IA into the net-centric information
environment in support of a secure, globally interconnected, end-to-end set of
information capabilities, associated processes, and personnel for collecting, processing,
storing, disseminating, and managing information on demand to warfighters, defense
policymakers, and support personnel. The intent of the Net-Centric IA Strategy is to
reflect an approach to IA concepts and definitions from a "services" point-of-view
instead of a "system" point-of-view, without specifying requirements related to specific
implementations or architectures.
7.2.5.5. Global Information Grid (GIG) Enterprise Services (GIG ES) Capability
Development Document
The GIG ES Capability Development Document is currently focused on nine core
enterprise services to be provided by the Net Centric Enterprise Services (NCES)
Program. These services are the foundation for the initial net-centric capabilities to be
provided by the Defense Information Systems Agency. The Capability Development
Document describes the overall set of services in detail.
The NCES program will develop the core enterprise services incrementally. The NCES
Program Plan describes the increments and their anticipated schedule. Each program
that is dependent upon the core services being developed by the NCES program should
address the impact of the incremental NCES schedule on its program.
Supportability for IT systems and NSS is the ability of systems and infrastructure
components, external to a specific IT or NSS, to aid, protect, complement, or sustain the
design, development, testing, training, or operations of the IT or NSS to achieve its
required operational and functional capabilities.
7.3.2.5. CJCS Instruction 6212.01, "Interoperability and Supportability of
Information Technology and National Security Systems"
    •    Section 4.1 of this Directive requires IT and NSS employed by U.S. Forces to
         interoperate with existing and planned systems and equipment of joint, combined
         and coalition forces and with other U.S. Government Departments and Agencies,
         as appropriate (based on capability context).
    •    Section 4.3 requires that IT and NSS interoperability and supportability needs, for
         a given capability, be identified through:
             o The Defense Acquisition System (as defined in the DoD 5000 series
                 issuances);
             o The Joint Capabilities Integration and Development System process;
             o The Doctrine, Organization, Training, Materiel, Leadership and Education,
                 Personnel and Facilities (DOTMLPF) change recommendation process.
    •    Section 4.5 provides that IT and NSS interoperability be verified early, and with
         sufficient frequency throughout a system's life, or upon changes affecting
         interoperability or supportability, to assess, evaluate, and certify its overall
         interoperability and supportability within a given capability. Joint interoperability
         certification testing shall be as comprehensive as possible, while still being cost
         effective, and shall be completed prior to fielding of a new IT and NSS capability
         or upgrade to existing IT and NSS.
    •    Section 4.8 requires that interoperability and supportability needs be balanced
         with requirements for Information Assurance (IA).
         supportability requirements specified in the NR-KPP. . . .
    •    6.2.3.6.1. All IT and NSS, regardless of ACAT, must be tested for interoperability
         before fielding and the test results evaluated and systems certified by the DISA
         (JITC). IT and NSS interoperability test and evaluation shall be conducted
         throughout a system's life, and should be achieved as early as is practical to
         support scheduled acquisition or procurement decisions. Interoperability testing
         may be performed in conjunction with other testing (e.g., DT&E, OT&E, early-user
         test) whenever possible to conserve resources.
    •    6.2.3.6.2. IT and NSS interoperability testing can occur in multiple stages.
         Evolutionary acquisitions or procurements, and normal life-cycle modifications,
         result in a progressively more complete capability. Therefore, there may be
         instances when it is important to characterize a system's interoperability before
         all critical interface requirements have been tested and certified. However, all
         critical interfaces, identified in the NR-KPP, which have been tested, must be
         successfully certified for interoperability prior to fielding. When appropriate (e.g.,
         between successful completion of OT and the fielding decision), the DISA (JITC)
         shall issue interim interoperability certification letters specifying which of the
         system's interoperability needs have been successfully met and which have not.
         The DISA (JITC) shall issue an overall system certification once the system
         successfully meets all requirements of the NR-KPP validated by the Chairman of
         the Joint Chiefs of Staff. The DISA (JITC) shall provide interoperability
         certification letters to the USD(AT&L), the USD(C)/CFO, the DoD CIO, the
         DPA&E (now DCAPE), the DOT&E, the Chairman of the Joint Chiefs of Staff,
         and the Commander, USJFCOM, as well as to the OTA and program manager,
         as applicable.
         (DUSD(A&T)), the DoD CIO, and the Director, Joint Staff J-6, throughout the
         system life-cycle.
    •    Enclosure 6, paragraph 3 states: During DT&E, the materiel developer shall:
            o d. Assess technical progress and maturity against critical technical
               parameters, to include interoperability, documented in the TEMP; and
            o h. In the case of IT systems, including NSS, support the DoD Information
               Assurance Certification and Accreditation Process and Joint
               Interoperability Certification process; . . .
7.3.2.5. CJCS Instruction 6212.01, "Net Ready Key Performance Parameter (NR
KPP)"
1.a. Defines responsibilities and establishes policy and procedures to develop the NR
KPP and NR KPP certification requirement for all information technology (IT) and
national security systems (NSS) that contain joint interfaces or joint information
exchanges.
DoDD 4630.05 and DoDI 4630.8, as modified by the Interim Guidance for
Interoperability of IT and NSS, along with CJCS Instruction 6212.01, provide insight
into the relationship between key interoperability and supportability activities and the
JCIDS and DAS processes.
The Net-Ready Key Performance Parameter (NR-KPP) has been developed to assess
net-ready attributes required for both the technical exchange of information and the end-
to-end operational effectiveness of that exchange. The NR-KPP replaces the
Interoperability KPP, and incorporates net-centric concepts for achieving IT (including
NSS) interoperability and supportability. The NR-KPP assists Program Managers
(PMs), the test community, and Milestone Decision Authorities in assessing and
evaluating IT (including NSS) interoperability.
In accordance with the DoD 4630 Series, architecture products or views defined in the
DoD Architecture Framework (and related discussion in DoD Instruction 4630.8 ) shall
be used to assess information exchange and use for a given capability. The functional
proponent, domain owner, Principal Staff Assistant, and PM use the supporting
architecture products or views in developing the NR-KPP and preparing the ISP.
7.3.4.2. DoD Net-Centric Data Strategy
Compliance with the DoD Directive 8320.02, "Data Sharing in a Net-Centric Department
of Defense" and the DoD Net-Centric Data Strategy is an essential prerequisite of net-
centric operations. For a program to gain Interoperability and Supportability
Certification, program data and services must be "exposed" by making data elements
and provided services visible, accessible, and understandable to any potential user with
access to the GIG, both anticipated and unanticipated.
Verification of compliance with the DoD Net-Centric Data Strategy and DoD Net-Centric
Services Strategy will be accomplished through the analysis of the sponsor-provided
architecture and verification products with accompanying text detailing the program's
compliance strategy. Documentation (via architecture products or other forms) must
clearly identify all net-centric services and data as adopted from Universal Core,
Domain Cores, and COIs.
The GTG is an evolving web-enabled capability providing the technical guidance
necessary for an interoperable and supportable GIG built on net-centric principles. The
GTG provides a one-stop, authoritative, configuration-managed source of technical
compliance guidance that synchronizes previously separate efforts. The GTG is
designed to enable users to decide which guidance is applicable and to find detailed
information and artifacts needed to meet functional requirements (GIG features and
capabilities), DISR-mandatory GIG net-centric IT standards, supporting GIG IT
standards, and GIG Technical Profiles (GTPs).
The GTG is the source for all technology guidance and standards implementation
information used in describing the GTPs necessary to meet the net-centric operational
requirements specified in the system/service views of an architecture. The GTG
contains a program characterization questionnaire and a compliance declaration matrix
that point to applicable GTPs. The GTPs are built from DISR-mandated IT Standards
reflected in a standards profile and include the associated implementation guidance,
reference architecture, and testing criteria necessary to meet all GIG-related
requirements characterized in the architecture system/service views. GTG Content
includes:
    •    The GTG is designed to enable users to decide which guidance is applicable and
         to find detailed information and artifacts on:
             o Associated technical functional requirements (GIG features and
                 capabilities);
             o DISR-mandated GIG net-centric IT standards;
             o Supporting GIG IT standards;
             o Associated profiles;
             o Reference implementations; and
             o Test criteria.
    •    The GTPs are aligned with the DoD IEA and are determined based on whether
         the capability:
             o Spans organizational boundaries;
             o Is mandatory or mission critical across the GIG Enterprise;
             o Can be characterized in a GIG Technical Profile;
             o Is essential for resolving GIG end-to-end interoperability issues;
             o Enables net-centric information sharing for multiple acquisition programs;
                 and
             o Is important from a security perspective.
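Read as a screening test, the criteria above reduce to a simple conjunction. The sketch below is an illustrative aid only (the class and field names are assumptions, not official GTG terms); the actual determination is made through the GTG governance process:

```python
from dataclasses import dataclass

@dataclass
class Capability:
    """Illustrative record of the GTP screening criteria for one
    capability. Field names are assumptions for this sketch."""
    spans_org_boundaries: bool
    mandatory_or_mission_critical: bool
    characterizable_in_gtp: bool
    essential_for_e2e_interop: bool
    enables_net_centric_sharing: bool
    security_significant: bool

def warrants_gtp(c: Capability) -> bool:
    # The criteria are joined with "and" in the guidance text, so in
    # this sketch all of them must hold for a GTP to be determined.
    return all([
        c.spans_org_boundaries,
        c.mandatory_or_mission_critical,
        c.characterizable_in_gtp,
        c.essential_for_e2e_interop,
        c.enables_net_centric_sharing,
        c.security_significant,
    ])
```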
7.3.5.3. Global Information Grid (GIG) Technical Guidance (GTG) Compliance
The following checklist summarizes the requirements for demonstrating compliance with
the NR-KPP and should be useful in preparing for milestone approvals:
    •    Applicable Architecture Products: AV-1, AV-2, OV-1, OV-2, OV-3, OV-4, OV-5,
         OV-6c, OV-7 (for Final ISP of Record review), SV-2, SV-4, SV-5, SV-6, SV-11
         (for Final ISP of Record review), and DISR Standards Compliance with TV-1
         and TV-2.
    •    Compliant with Net-Centric Data Strategy and Net-Centric Services Strategy,
         Data Exposure Verification Tracking Sheets.
    •    Applicable GTG citations, GTG statements, and the corresponding DISR-
         Mandated GTP IT Standards included in the PM's TV-1 as necessary to meet the
         net-centric operational requirements characterized in the architecture system views.
    •    IA requirements including availability, integrity, authentication, confidentiality, and
         non-repudiation, and issuance of an accreditation decision by the Designated
         Approval Authority.
    •    Applicable Supportability requirements to include SAASM, Spectrum and Joint
         Tactical Radio System requirements (see section 7.6).
    •    Have all architecture products been developed in accordance with the DoD
         Architecture Framework (DoDAF)?
    •    Does the AV-1 describe a net centric environment? (Note: If this is a non-net-
         centric environment, i.e., a legacy network, make sure that is noted in the
         architecture.)
    •    Has the TV-1 been prepared using applicable information technology standards
         profiles contained in the DISR?
    •    Have all the interfaces listed in the OV-2, OV-3, and SV-6 been appropriately
         labeled with the GIG core enterprise services needed to meet the requirements
         of the applicable capability architecture?
    •    Have specific capability architecture OV-6c time event parameters been
         correlated with GIG architecture OV-6c?
    •    Have verifiable performance measures and associated metrics been developed
         using the architectures, in particular, the SV-6?
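A program office might track these checklist items with a simple status map while preparing for a milestone review. This is a bookkeeping sketch only; the abbreviated item names are assumptions, not official designations:

```python
# Status map for the NR-KPP compliance checklist; item names are
# abbreviations of the checklist above, assumed for this sketch.
checklist = {
    "architecture products (AV/OV/SV/TV views)": False,
    "net-centric data and services strategy compliance": False,
    "GTG citations and DISR-mandated GTP IT standards in TV-1": False,
    "IA requirements and DAA accreditation decision": False,
    "supportability (SAASM, spectrum, JTRS)": False,
    "verifiable performance measures from the SV-6": False,
}

def open_items(status: dict) -> list:
    """Return the checklist items not yet satisfied."""
    return [item for item, done in status.items() if not done]

# Marking one item complete leaves five open in this example.
checklist["IA requirements and DAA accreditation decision"] = True
print(len(open_items(checklist)))  # 5
```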
The GTG has a compliance regime with granularity appropriate to the Milestone phase
or maturity of a program.
The spectrum supportability process includes national, international, and DoD policies
and procedures for the management and use of the electromagnetic spectrum. The
CDD/PD must document the following:
    •    Permission has been (or can be) obtained from designated authorities of
         sovereign ("host") nations (including the United States) to use that equipment
         within their respective borders; and the newly acquired equipment can operate
         compatibly with other spectrum-dependent equipment already in the intended
         operational environment (electromagnetic compatibility).
    •    All IT, including NSS, must comply with DoD Instruction 4650.01 (see also
         section 7.6).
7.3.6. Information Support Plan (ISP), Enhanced Information Support Plan (EISP),
and Tailored Information Support Plan (TISP)
The ISP comes in two forms: as a document (ISP or TISP) or as data entered through
the EISP tool (for both ISPs and TISPs). The preferred format is the data-centric
EISP tool. The EISP is evolving to become the only acceptable form for ISP content.
The ISP provides the PM a mechanism to identify information-related dependencies, to
manage these dependencies and to influence the evolution of supporting systems to
meet the demands of the system as it evolves to meet the warfighter's needs and
capabilities. In the case where the supporting system will not be available, the ISP
should provide the PM with awareness of this problem in sufficient time to adjust the
program in the most cost effective and operationally efficient manner.
The end product of the ISP/EISP/TISP is the identification of issues and risks
associated with the information needs and dependencies of the program. Information
issues and risks should be treated as any other program issue or risk, as defined in
AT&L's "Risk Management Guide for DoD Acquisition, Sixth Edition, Version 1, August 2006."
Information issues and risks should be managed as defined in this guide and presented
in acquisition decision meetings (such as Overarching Integrated Product Team
meetings) by the PM as any other area of issue and risk is presented (e.g., reliability
risks).
         Effectiveness, "Regulatory Requirements Applicable to All Acquisition Programs,"
         requires that all acquisition programs, regardless of acquisition category level,
         submit an ISP at the Pre-EMD Review (Initial ISP), CDR (Revised ISP), Milestone C
         (ISP of Record, unless waived), at major system or software updates (Updated
         ISP), and at Program Initiation for ships (Initial ISP).
    •    DoD Instruction 4630.8, Enclosure 4 provides ISP content requirements and
         guidelines.
    •    CJCS Instruction 6212.01 also provides detailed implementing guidance
         regarding the ISP and specifically the TISP.
    •    Intelligence Community Policy Guidance 801.1 provides the requirement for
         Intelligence Community programs to develop an ISP.
7.3.6.2. Information Support Plan (ISP) Integration into the Acquisition Life Cycle
An ISP provides the methodology for meeting a program's information needs and
managing the issues and risks associated with those needs. It ensures compliance with
DoD CIO policy and is used by various other activities to monitor compliance and
sufficiency. The Joint Staff utilizes the ISP in the Interoperability and Supportability
Certification process; J2 utilizes the ISP for intelligence supportability (CJCS Instruction
3312.01); and the ISP is used as part of Title 40/CCA statutory oversight, oversight of
Information Assurance (IA), spectrum supportability, and the National Signature
Program.
The ISP is a living document (or, for the EISP, living data) that is developed over the life
cycle of a program. At each review point, the ISP builds upon and tracks the information
needs required for a program to meet its intended capability(ies). A completed ISP
answers the following seven questions for information needed to support the
operational/functional capability(ies) of a system.
    •    What information is needed?
    •    How good must the information be?
    •    How much information is needed (or provided)?
    •    How will the information be obtained (or provided)?
    •    How quickly must it be received in order to be useful?
    •    Is the information implementation net-centric?
    •    Does it comply with DoD information policies?
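One way to see how these seven questions characterize a single information need is as fields of one record. The sketch below is purely illustrative; the class, field names, and all example values are assumptions, not prescribed ISP content:

```python
from dataclasses import dataclass

@dataclass
class InformationNeed:
    """Illustrative record answering the seven ISP questions for one
    information need; field names are assumptions for this sketch."""
    what: str               # What information is needed?
    quality: str            # How good must the information be?
    volume: str             # How much is needed (or provided)?
    source: str             # How will it be obtained (or provided)?
    timeliness: str         # How quickly must it be received to be useful?
    net_centric: bool       # Is the information implementation net-centric?
    policy_compliant: bool  # Does it comply with DoD information policies?

# Hypothetical example values for a single need.
need = InformationNeed(
    what="blue-force track data",
    quality="positional accuracy within 100 m",
    volume="~500 track updates per minute",
    source="GCCS feed over SIPRNET",
    timeliness="within 5 seconds of observation",
    net_centric=True,
    policy_compliant=True,
)
```

A program would maintain one such record per information need and revisit the answers at each review point.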
There are three ISP development approaches during the life cycle:
    1. A traditional ISP document for Acquisition Category (ACAT) I, IA, and
       designated ISP Special Interest Programs (see Office of the DoD CIO
       Memorandum, Subject: "Information Support Plan (ISP) Special Interest List,"
       located on the JCPAT-E Policy and Guidance website). The Information Needs
       and Discovery Process will continue to follow the 13 steps in DoD Instruction
       4630.8, Enclosure 4, plus the addition of a Net-Ready Key Performance
       Parameter analysis.
    2. A data-centric ISP (EISP) for ACAT I, IA, and designated ISP Special Interest
       Programs, using an Extensible Markup Language (XML)-based ISP data
       collection tool provided by DoD CIO and an associated data converter that
       generates a properly formatted ISP document from the data. This relieves the
       ISP developer from needing to produce and format a written ISP document and
       sets the stage for other options of analysis and presentation of ISP data.
An EISP installer, along with the EISP documentation, is available from the JCPAT-E
website. In addition to the installer, an EISP Guidebook and a technical user's guide for
the EISP can be downloaded from the JCPAT-E website. If a user has a problem
downloading or installing the EISP, they can contact eisp_help@bah.com.
    3. A Tailored ISP (TISP) document for ACAT II and below programs (including ISP
       Special Interest) and non-ACAT programs that receive Joint Staff (JS) approval.
       These programs may tailor the content of their ISP per the procedures in
       section 7.3.6.9. Authorized programs can obtain a final decision from the JS for
       their tailored plan, to include any special needs identified by the JS for the
       intelligence and supportability/interoperability certification processes required
       by CJCS Instruction 3312.01 and CJCS Instruction 6212.01. The final DoD
       Component-approved plan (TISP) will be submitted to the DoD CIO ISP
       document repository via the DISA-managed JCPAT-E tool (site requires
       certificate and/or login).
and Updated ISP). Programs under the various other DoD acquisition life cycles may
follow different milestone events, as shown in Figure 7.3.6.2.F1, but also build toward a
final ISP of Record.
ISPs for ACAT I and IA programs, and ISPs or TISPs for Special Interest programs, will
undergo a complete OSD-level review. ISPs or TISPs for all other ACAT II and below
programs and non-ACAT programs will be reviewed using the JS review process as
described in CJCSI 6212.01.
Figure 7.3.6.2.F1, "ISP Submission Timeline," illustrates when ISPs must be submitted
to the JCPAT-E tool. It depicts when ISP reviews occur and lists the activities
associated with each review. All review timeframes describe the period of time in which
the ISP is open for stakeholder comments. All OSD-level and JS-level ISP
reviews will be completed within thirty calendar days of posting, with the exception of a
final acceptance review by the Joint Staff, which will last 15 calendar days. Additional
administrative days at the end of the review allow the Joint Staff and DoD CIO to
consolidate the comment matrix or provide appropriate responses. See the outline below
for review timeframes and potential responses:
OSD or JS Responses:
    •    Initial ISP/TISP: Acceptance Memorandum from the DoD CIO or JS for TISP
    •    Revised ISP/TISP: Acceptance Memorandum from the DoD CIO or JS for TISP
    •    ISP/TISP of Record: Acceptance of a Component Approved ISP/TISP
         Memorandum from the DoD CIO and/or JS Interoperability Certification
    •    Updated ISP/TISP: Acceptance of a Component Approved ISP/TISP
         Memorandum from the DoD CIO and/or JS Interoperability Certification
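The review windows described above (30 calendar days for OSD- and JS-level reviews, 15 for the final JS acceptance review) reduce to simple date arithmetic. The function name and the posting date below are hypothetical, used only to illustrate the computation:

```python
from datetime import date, timedelta

def review_close(posted: date, final_acceptance: bool = False) -> date:
    """Compute the last day of the stakeholder comment period:
    30 calendar days for OSD- and JS-level ISP reviews, 15 calendar
    days for the final acceptance review by the Joint Staff."""
    days = 15 if final_acceptance else 30
    return posted + timedelta(days=days)

posted = date(2013, 9, 16)  # hypothetical posting date
print(review_close(posted))                         # 2013-10-16
print(review_close(posted, final_acceptance=True))  # 2013-10-01
```

Note that administrative days for comment-matrix consolidation fall after this close date and are not included here.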
                               Figure 7.3.6.2.F1. ISP Submission Timeline
For a detailed description of the ISP staffing process see the DoD CIO, 26 August 2005
ISP Acquisition Streamlining Pilot Program memo, located within the Policy and
Guidance section of the JCPAT-E website.
While the ISP is not required until MS B, early development of the ISP will assist in
development of the program's architecture and Concept of Operations discussed in the
CJCS Instruction 3170.01. Beginning development of the EISP early will help define
information needs and dependencies for the program.
7.3.6.2.2. Before Pre-EMD Review prior to Milestone B (or program initiation for
ships)
4630.8, CJCS Instruction 6212.01, CJCS Instruction 3170.01, and the JCIDS Manual to
ensure information supportability is addressed in the ISP and Capability Development
Document.
    •    Submit the ISP for formal, coordinated, Initial ISP Review according to DoD
         Instruction 4630.8 and Pilot Memorandum, DoD CIO, "Information Support Plan
         (ISP) Acquisition Streamlining Pilot Program," August 26, 2005 as amended by
         the 23 June 2011 PDUSD(AT&L) Memorandum Improving Milestone Process
         Effectiveness (Available to users in the JCPAT-E site).
    •    Submit an updated ISP for each major upgrade (e.g., block or increment).
    •    Submit the Updated ISP for formal, coordinated, Initial ISP Review according to
         DoD Instruction 4630.8 and Pilot Memorandum, DoD CIO, "Information Support
         Plan (ISP) Acquisition Streamlining Pilot Program," August 26, 2005. (Available
         to users in the JCPAT-E site)
Interoperability Test Certification by Joint Interoperability Test Command will not occur
without an Information Support Plan. Exceptions must be approved by both the DoD
CIO and Joint Staff.
A Portfolio ISP should be developed that, at a minimum, identifies requirements for
support from common GIG infrastructure (e.g., communications). A Portfolio Systems
Architecture that conforms to the DoD Architecture Framework is required to guide the
integration of the portfolio, as well as a Portfolio Technical Architecture that complies
with the Net-Centric criteria.

ISPs for families-of-systems or systems-of-systems (i.e., portfolios, enterprises,
capability areas, and similar groupings) are encouraged as a way to save time and
resources. A platform aggregation of systems, as conceptually developed by the Navy,
is a logical method for implementation. However, this ISP approach requires permission
from the office of the DoD CIO and the Joint Staff. The request should define the scope,
details, and expected process with DoD CIO before the family-of-systems or
system-of-systems ISP is initiated.

Often the systems within a particular set of systems are out of sync with programmatic
acquisition events, particularly in time sequence. Frequently, this situation can be
accommodated by creating a "parent" overarching, capstone, portfolio, or enterprise
ISP and adding annexes to cover the additional systems. Each time an annex or
individual element of the family-of-systems or system-of-systems is addressed,
particular care should be taken to include the interactions between the elements making
up the overall family- or system-of-systems and the parent operational architecture.
The ISP should address any information sharing and/or collaboration.
Based on past experience, a small program with few interfaces takes about 6 months to
get an ISP ready for review. For most programs, however, ISP preparation for initial
review takes about 1 year. For very complex programs, such as a major combatant
ship, it can take 18 to 24 months. The length of the process primarily depends on
whether a solution architecture exists or requires development.
The DoD CIO reviews all ISP documents for Acquisition Category I and IA programs,
and for other programs in which DoD CIO has indicated a special interest. This review is
performed on the JCPAT-E suite. JCPAT-E provides paperless, web-based support for
ISP document submission, assessor review and comment submission, collaborative
workspace, and consolidated review comment rollup. The DISA JCPAT-E functional
analyst is available to assist users with JCPAT-E functionality and to establish user
accounts. As a best practice, the JCPAT-E includes an ISP repository available for
viewing archived and current ISPs.
Program Managers and other stakeholders will find the links in Table 7.3.6.5.T1 useful
for Information Support Plan preparation, program analysis, and oversight.
    •    Defense Information Systems Agency's Joint C4I Program Assessment Tool
              NIPRNET: https://jcpat.csd.disa.mil/JCPAT
              SIPRNET: jcpat.csd.disa.smil.mil
    •    Defense Architecture Repository
              NIPRNET: https://dars1.army.mil/IER/index.jsp
              SIPRNET: Not applicable
    •    DoD Information Technology Standards Registry
              NIPRNET: https://gtg.csd.disa.mil/
              SIPRNET: disronline.disa.smil.mil
    •    Global Information Grid (GIG) Technical Direction
    •    Business Systems, Information Tech Systems, Intelligence: 571-372-4471
    •    JCPAT-E Functional Analyst: 301-225-7400
    •    Provide the following program data to help the reviewer understand the level of
         detail to be expected in the ISP:
            o Program contact information (PM, address, telephone, email address, and
                 ISP point of contact).
             o Program acquisition category (ACAT).
             o Milestone Decision Authority: Defense Acquisition Board, Information
                  Technology Acquisition Board (or component Milestone Decision
                  Authority), or other.
            o Milestone covered by the specific ISP.
            o Projected milestone date.
            o Universal Identifier/DoD IT Portfolio Repository number.
            o Document Type.
              o Life-cycle stages (Concept Refinement, Technology Development, System
                Development and Demonstration, etc.).
    •    The following steps (and notes) are based on using the Architecture developed in
         accordance with the DoDAF, during the JCIDS process.
Step 1: Identify the warfighting missions and/or business functions within the enterprise
business domains that will be accomplished/enabled by the system being procured.
The Mission Threads are based on the latest version of the Joint Capability Areas and
allow a developer to bin a program's capabilities. A developer selects a Tier 1 Mission
Thread in the Enhanced ISP and is then able to select the Tier 2 and Tier 3 mission
threads that are children of the chosen Tier 1.
Note: If a Command and Control capability is the top-level driver of the function
breakdown, then the OV-4 (Command Relationships) will be a necessary product to
help define the functional capabilities needed. The OV-4 will likely require several OV-5
(Activity Model) functional breakdowns to enable each of the command elements
identified.
                        Figure 7.3.6.7.2.F1. Example Capability Breakdown
Note: The specific form of this information should capture key information from an OV-5
(Operational Activity Model) and/or other information source (e.g., an outline or
hierarchical graph). The important point is that the capability relationships are
understood and attributes are identified so that assessments can be made.
Step 3: Determine the operational users and notional suppliers of the information
needed.
Step 3.a: Provide an OV-2 to identify the operational nodes and elements that drive the
communications needed to enable the functional capabilities. For large
platforms/systems, this effort should identify the major operational nodes (information
drivers) within the platform, as well as nodes that are external to the platform/system
with which information will be shared.
Step 3.b: Map these nodes (internal and external systems and people) and their
activities to the functions identified in OV-5.
Step 4: Establish the quality of the data needed to enable the functions identified in OV-
5 and performed by the operational nodes in OV-2 (Operational Node Connectivity).
Note: When radio and other information transport systems are identified as providing
support, establish transmission quality parameters and then assess whether the
programs/systems intended to be used can meet these criteria.
Note: A factor in determining quality is the user (person or sub-system); specifically,
how the user intends to use the information.
Step 5: Determine if timeliness criteria exist for the information.
Note: To help establish timeliness, use the OV-6C (Operational Event-Trace
Description) to establish the event sequence. Considerations include:
Note: Ultimately this analysis should help estimate the bandwidth needs and should
provide an assessment as to whether adequate bandwidth is available. If bandwidth is
limited, what actions can be taken to reduce demand or use the bandwidth more
efficiently?
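The bandwidth assessment above reduces to simple arithmetic: sum each information exchange's message size times its rate, and compare the total against the available transport capacity. A minimal sketch, with entirely hypothetical exchanges, sizes, rates, and link capacity (none of these figures come from the guidebook):

```python
# Hypothetical back-of-the-envelope bandwidth-demand estimate.
exchanges = [
    ("annotated imagery", 2_000_000, 0.5),  # (name, bits per message, messages/sec)
    ("track updates", 5_000, 10.0),
    ("status reports", 50_000, 1.0),
]

# Aggregate demand in bits per second.
demand_bps = sum(size * rate for _, size, rate in exchanges)
available_bps = 1_500_000  # assumed transport capacity

utilization = demand_bps / available_bps
print(f"demand = {demand_bps:.0f} bps, utilization = {utilization:.0%}")
```

If utilization approaches capacity, demand-reduction options (compression, lower update rates, precedence schemes) can be evaluated against the same totals.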
If data links are involved, identify them and also the message sets that will be
implemented.
Note: In many cases, this discussion will involve multiple levels of enabling systems.
For example, the enabling system may be a Global Command and Control System
(GCCS) application. GCCS rides on the Secret Internet Protocol Router Network
(SIPRNET), so both levels of this support should be discussed.
Step 8: Assess the ability of supporting systems to supply the necessary information.
Identify the external connections to the system using the system views and identify any
synchronization issues associated with schedule and/or availability of external systems.
Note: Supporting systems include collection platforms, databases, real time reports,
messages, networked data repositories, annotated imagery, etc.
    •    Assess the ability to collect, store, and tag the information (to enable discovery
         and retrieval).
    •    Assess the ability of networks to provide a means to find and retrieve the
         necessary data.
    •    Assess the ability of the information transport systems to move the volume of
         data needed.
    •    Assess synchronization in time (i.e., years relative to other system milestones)
         with supporting programs.
    •    Assess whether the information will cross security domains.
Note: If systems will connect to the intelligence Top Secret (TS)/Sensitive
Compartmented Information (SCI) network (the Joint Worldwide Intelligence
Communications System), or will utilize TS/SCI information, they will have to comply
with Intelligence Community Directive (ICD) 503, "Intelligence Community
Information Technology Systems Security Risk Management, Certification and
Accreditation," and DCID 6/9, "Physical Security Standards for Sensitive
Compartmented Information Facilities," 18 November 2002.
Note: The number of levels of analysis will depend on the detail required to identify the
critical characteristics of the information needed to support the program. This should be
accomplished for all phases of the acquisition life cycle.
Note: It is anticipated that other communities such as the intelligence community may
have to assist in the determination and analysis of these information needs.
Note: DoD Instruction 4650.01 establishes spectrum management policy within the
Department of Defense. DoD Instruction 4630.8 and CJCS Instruction 6212.01 require
Spectrum Supportability (e.g., spectrum certification, reasonable assurance of the
availability of operational frequencies, and consideration of Electromagnetic
Environmental Effects) to be addressed in the ISP. The Services have additional
spectrum management policies and procedures.
To support the Spectrum Supportability process, the ISP should document the following:
    •    Impact of the loss of a planned spectrum-dependent command, control, or
         communication link as a result of an unresolved spectrum supportability issue.
         (To be identified in the issue section of the ISP.)
Note: Consider individual Services net-centric policies and procedures that supplement
DoD Net-centric policy.
Note: This is an emerging requirement in the analysis required for ISPs. When Net-
Centric Enterprise Services (NCES) /Core Enterprise Services (CES) are available,
programs will be expected to conduct a detailed analysis of compliance. Programs
should be aware of this developing requirement, as it will become an essential part of
determining net-centricity and compliance with the DoD Information Enterprise (IE).
Step 10a: Using the information provided as a result of Step 7, the PM should evaluate
the program against measurement criteria from the DoD Information Enterprise
Architecture (IEA).
Step 10b: Provide an analysis of compliance with the emerging Net-Centric Enterprise
Services (NCES)/Core Enterprise Services (CES).
As the DoD IE CES develops, its specifications should be cross-walked with the ISP
system's planned network service specifications. Identify any issues between the CES
service specifications and those of the system that is the subject of the ISP.
Compliance would mean that the system would connect seamlessly with the defined
DoD-level enterprise services.
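Such a crosswalk amounts to a set comparison between the CES service specifications and the system's planned service specifications. A hypothetical sketch (the specification names below are invented placeholders, not an actual CES list):

```python
# Hypothetical crosswalk between assumed CES service specifications and a
# system's planned network service specifications (names are placeholders).
ces_specs = {"messaging", "discovery", "security", "mediation"}
system_specs = {"messaging", "security", "custom-transport"}

gaps = sorted(ces_specs - system_specs)    # CES services the system does not plan
extras = sorted(system_specs - ces_specs)  # planned services outside the CES set
print("gaps:", gaps)
print("extras:", extras)
```

Each entry in `gaps` or `extras` is a candidate issue to document in the ISP, since compliance would mean the system connects seamlessly with the defined enterprise services.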
Step 11: Discuss the program's inconsistencies with the DoD Enterprise Architecture
and the program's strategy for achieving alignment.
Identify areas where the latest versions of the DoDAF and the DoD IEA do not support
information needs. (See also DoD Directive 8000.01.)
Step 12: Discuss the program's IA strategy. Also provide a reference to the Program
Protection Plan, if applicable.
Step 13: Identify information support needs to enable development, testing, and
training.
For development: Weapon systems may require information about potential targets to
support system development. (Example: target signature data)
For testing: Include information support needs critical to testing (Example: Joint
Distributed Engineering Plant (JDEP)). Do not duplicate Test and Evaluation Master
Plan information except as needed to clarify the analysis. In addition, for information on
software safety testing please refer to software test & evaluation.
For training: Include trainers and simulators that are not a part of the program being
developed.
    •     Identify risks and issues (as defined in DoD Instruction 4630.8) in a table similar
          to Table 7.3.6.7.3.T1 or in an outline containing the same data.
              o Group operational risks and issues under the mission impacted, then
                  under the impacted functional capability (for that mission).
              o When risks or issues involve more than one mission, entries for
                  subsequent missions should reference the previous issue number, and
                  fields that remain the same should be so marked.
    •     Include the following column (or outline) headings:
              o Issue Number
              o Supporting System
              o Source Architectures (e.g., Command and Control (C2), Focused
                  Logistics, Force Protection, Force Application, Battlespace Awareness,
                  Space, etc.)
              o Issue Description
              o Risk/Issue Impact (Use the AT&L "Risk Management Guide for DoD
                  Acquisition" for this assessment)
              o Mitigation Strategy or Resolution Path
Table 7.3.6.7.3.T1:

  Issue    Supporting   Source          Issue         Issue     Mitigation Strategy/
  Number   System       Architecture    Description   Impact    Resolution Path
                                                                (and Time-Frame)

Operational Issues
  Mission
    Functional Capabilities Impacted
Development Issues
Testing Issues
Training Issues
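One way to keep such a table machine-checkable is to capture each row as a record carrying the column headings above. This is only an illustrative sketch; the field names and the sample entry are hypothetical, not from any actual ISP:

```python
from dataclasses import dataclass

@dataclass
class IspIssue:
    issue_number: str
    supporting_system: str
    source_architecture: str  # e.g., C2, Battlespace Awareness
    description: str
    impact: str               # assessed per the AT&L Risk Management Guide
    mitigation: str           # mitigation strategy or resolution path (and time-frame)
    category: str             # Operational, Development, Testing, or Training
    mission: str = ""         # operational issues are grouped by impacted mission

issues = [
    IspIssue("OP-1", "Notional Datalink", "C2",
             "Message set not yet implemented by supporting system",
             "Moderate", "Field message set by FY+1", "Operational",
             mission="Strike"),
]

# Group operational issues under the mission impacted, as the outline requires.
by_mission = {}
for issue in issues:
    if issue.category == "Operational":
        by_mission.setdefault(issue.mission, []).append(issue)
print(sorted(by_mission))  # → ['Strike']
```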
Risks and issues considered critical to the program's success will be briefed by the PM
at OIPT meetings. At a minimum, information risks and issues will be incorporated into
the PM's risk management program and treated as any other type of program risk and
issue.
Appendix A. References. Include all references used in developing the ISP. Include
Architectures; other relevant program documentation; relevant DoD, Joint Staff, and
Service Directives, Instructions, and Memos; ISPs from other programs; any
applicable JCIDS documentation; and others as deemed necessary.
Appendix D. Acronym List: Provide an Integrated Dictionary (AV-2).
7.3.6.8. Tailored Information Support Plan (TISP)
TISP instructions are available from the Joint Staff (CJCS Instruction 6212.01).
7.3.6.8.1. Applicability
The Tailored Information Support Plan (TISP) is designed to improve the Information
Support Plan (ISP) process by reducing the number of OSD-level reviews, streamlining
the ISP waiver process, and providing a tailored ISP option for ACAT II, III and non-
ACAT programs only.
7.3.6.8.2. Introduction
The Enhanced Information Support Plan (EISP) tool is designed to accommodate the
TISP and is encouraged as the tool for TISP development. ACAT II and below, as well
as non-ACAT programs, may tailor the
content of their ISP upon Joint Staff approval. At a minimum, the tailored plan will
provide explanation of the programs' Concept of Operations (CONOPS) and will provide
IT supportability analysis of the CONOPS. Additionally, the following set of architecture
products is required: AV-1, OV-1 (optional), OV-5, OV-6C (optional), SV-1 (optional),
SV-5, SV-6, and TV-1.
TISP requests shall be submitted via e-mail to the Joint Staff through the applicable
Service/Agency/Joint Forces Command Interoperability Test Panel (ITP) representative.
The TISP request form is on the CJCSI 6212 Resources Page. The Joint Staff will
respond to the TISP request with a "Concur" or "Non-concur" via e-mail.
    •    If the mandatory sections of the form are not completed, the request will be
         returned to the submitter for completion.
    •    Joint Staff will review submitted TISP applications and will approve or deny entry
         into the TISP process.
    •    Applicants, and respective ISP representatives, will be notified via e-mail that a
         program can proceed with development of a TISP.
7.3.6.9. Information Support Plan (ISP) Waiver Process
The requirement for an ISP may be waived when the requirement for JCIDS
documentation has been waived, the Joint Staff has determined that the Net-Ready Key
Performance Parameter or Interoperability Key Performance Parameter is not needed,
or the program does not meet any of the criteria identified in paragraphs 2.2.2, 2.2.3,
and 2.2.4 of DoD Instruction 4630.8. Additionally, programs accepted under the Legacy
System Interoperability Validation and Certificate Request Process are waived from
producing an ISP.
Waiver requirements apply to all Acquisition Category (ACAT) and non-ACAT ISPs.
Each DoD Component has an ISP waiver review process. Waiver requests shall be sent
via email to DoD CIO by the appropriate DoD Component action officer for coordination
prior to approval. The waiver information will include the program's name, the next
milestone, the capability(ies) the program provides, a list of any external information
and related connectivity, and the rationale for the waiver. The ISP waiver application
form is available on the CJCSI 6212 Resources Page. DoD CIO will respond to the waiver
request via memo indicating approval or disapproval. Waiver authority for non-ACAT
ISPs resides with the cognizant fielding authority. Upon final approval by DoD CIO, the
DoD Component will be provided a copy of the approved waiver. A test process is now
in effect (currently Navy only) that allows the Component some waiver authority; this
process may be expanded. Legacy waivers are defined in a DoD CIO memorandum that
is being revised as reflected in the text below. As of this guidebook, the following
applies to legacy systems and will be reflected in future policy.
Fielded Legacy systems, ACAT II and below and non-ACAT programs, that meet all of
the conditions outlined below may request a waiver from the DoD Instruction 4630.8,
"Procedures for Interoperability and Supportability of Information Technology (IT) and
National Security Systems (NSS)," dated May 5, 2004, requirement to produce an ISP.
In addition to the standard ISP waiver, there are now two categories of legacy waivers
with the qualifying conditions shown below by category:
              o    Have no major updates, incremental changes, or spiral development
                   changes planned.
              o    Have no pre-existing interoperability deficiencies identified by the JITC.
              o    Be funded beyond the FYDP, with no established retirement date, and
              o    Be currently connected to the Global Information Grid.
Waiver requests will follow the email-based waiver process described in CJCS
Instruction 6212.01. When the DoD CIO has electronically approved the request, the
fielded legacy system will follow the procedures established by the Joint Staff, for
interoperability certification and certificates to operate. Upon granting the waiver, the
Joint Staff will inform DoD CIO of the approval.
The DoD policy instruction for sharing data and IT services is DoD Instruction 8320.02,
"Sharing Data, Information, and Information Technology (IT) Services in the Department
of Defense," March 2013.
The instruction provides overarching policy, procedures, and responsibilities for sharing
data, information, and IT services in the DoD. It is built upon the goals for these
respective areas as defined in the DoD Net-Centric Data Strategy (NCDS) (May 9,
2003) and the DoD Net-Centric Services Strategy (NCSS) (May 4, 2007).
The NCDS outlines the vision for managing data in a net-centric information sharing
environment. The strategy compels a shift to a "many-to-many" exchange of data,
enabling many users and applications to leverage the same data, extending beyond the
previous focus on standardized, predefined, point-to-point interfaces. Hence, the
objectives are to ensure that all data are visible, available, and usable, when and where
needed, to accelerate decision cycles. Specifically, the data strategy describes seven
major net-centric data goals, as presented in Table 7.4.1.T1, below.
         Goal               Description

Goals to increase Enterprise and community data over private user and system data

  Visible            Users and applications can discover the existence of data assets
                     through catalogs, registries, and other search services. All data
                     assets (intelligence, non-intelligence, raw, and processed) are
                     advertised or "made visible" by providing metadata, which
                     describes the asset.

  Accessible         Users and applications post data to a "shared space." Posting data
                     implies that (1) descriptive information about the asset (metadata)
                     has been provided to a catalog that is visible to the Enterprise and
                     (2) the data is stored such that users and applications in the
                     Enterprise can access it. Data assets are made available to any
                     user or application except when limited by policy, regulation, or
                     security.

  Institutionalize   Data approaches are incorporated into Department processes and
                     practices. The benefits of Enterprise and community data are
                     recognized throughout the Department.

Goals to increase use of Enterprise and community data

  Understandable     Users and applications can comprehend the data, both
                     structurally and semantically, and readily determine how the data
                     may be used for their specific needs.

  Trusted            Users and applications can determine and assess the authority of
                     the source because the pedigree, security level, and access control
                     level of each data asset is known and available.

  Interoperable      Many-to-many exchanges of data occur between systems, through
                     interfaces that are sometimes predefined or sometimes
                     unanticipated. Metadata is available to allow mediation or
                     translation of data between interfaces, as needed.

  Responsive to      Perspectives of users, whether data consumers or data producers,
  User Needs         are incorporated into data approaches via continual feedback to
                     ensure satisfaction.
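As an illustration, the "visible" and "accessible" goals correspond to checkable properties of a data asset's metadata record. A hypothetical sketch (the field names below are invented for illustration, not a DoD metadata schema):

```python
# Hypothetical data-asset record; field names are invented for illustration.
asset = {
    "metadata_in_catalog": True,  # advertised via metadata -> visible
    "shared_space_location": "repo://example/asset-42",  # posted -> accessible
    "access_limited_by": None,    # policy/regulation/security limit, if any
}

visible = asset["metadata_in_catalog"]
accessible = bool(asset["shared_space_location"]) and asset["access_limited_by"] is None
print(f"visible={visible}, accessible={accessible}")
```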
The NCSS outlines a vision to establish Web services (referred to as services hereafter)
as the preferred means by which data producers and capability providers make their
data assets and capabilities available across the DoD and beyond. It also establishes
services as the preferred means by which consumers can access and use these data
assets and capabilities. As with data, services are to be made visible, accessible,
understandable, and trusted. Specifically, the services strategy describes four major
net-centric services goals, as presented in Table 7.4.1.T2, below.
     Goal                          Description

  Provide Services               Make information and functional capabilities available as
                                 appropriately secure services on the network.

  Use Services                   Use existing services to satisfy mission needs before
                                 creating duplicative capabilities.

  Govern the Infrastructure      Establish the policies and procedures for a single set of
  and Services                   common standards, rules, and shared secure infrastructure
                                 and services throughout the DoD Enterprise to ensure
                                 interoperability.

  Monitor and Manage             Implement services in accordance with DoD's GIG NetOps
  Services via GIG NetOps        Strategy and concept of operations to ensure situational
                                 awareness of the NCE.
A DoD guide, DoD 8320.2-G, "Guidance for Implementing Net-Centric Data Sharing,"
April 12, 2006, stemming from the authority of DoD Directive 8320.02 (reissued as the
DoD Instruction 8320.02 cited above), provides implementation guidance for the
community-based
transformation of existing and planned information technology (IT) capabilities across
the DoD. The goal of this Guide is to provide a set of activities that members of
communities of interest (COIs) and associated leadership can use to implement the key
policies of DoD Directive 8320.02 and ultimately increase mission effectiveness across
the Department of Defense. The activities presented in this Guide may not apply to all
COIs and should be tailored as necessary.
The DoD Chief Information Officer's "DoD Net-Centric Data Strategy," May 9, 2003,
defines a COI as "a collaborative group of users who must exchange information in
pursuit of their shared goals, interests, missions, or business processes and who
therefore must have shared vocabulary for the information they exchange." COIs are
organizing constructs created to assist in implementing net-centric information sharing.
Their members are responsible for making information visible, accessible, and
understandable, and for promoting trust, all of which contribute to the data
interoperability necessary for effective information sharing. This chapter describes the roles,
responsibilities, and relationships of COIs in information sharing.
The focus for COIs is to gain semantic and structural agreement on shared information.
For COIs to be effective, their scope (that is, the sphere of their information sharing
agreements) should be as narrow as is reasonable given their mission. Although the
Department of Defense or a Military Department might be considered a collaborative
group of users who have a shared mission, and thus a COI, achieving a shared
vocabulary across the entire Department of Defense, or even across a Military
Department, has proved very difficult due to the scope and magnitude
of the information sharing problem space. COIs represent a mechanism for
decomposing the DoD's information sharing problem space into manageable parts that
can be addressed by those closest to the individual parts.
COIs may be guided by the DoD's strategic goals, existing policy, and doctrine, or COIs
may form on an ad hoc basis to address a data sharing problem among known
stakeholders. While DoD Component-specific COIs may exist, COIs are most likely to
be functional or joint entities that cross organizational boundaries. Examples of a COI
might be a meteorology COI or a joint task force COI. COIs should include producers
and consumers of data, as well as developers of systems and applications.
Although COIs may vary, the key attributes (below) should be applicable for the majority
of COIs across the Department of Defense.
7.4.2.2.1.3. Forward Planning
This section provides a set of activities to help guide the establishment, evolution, and
operations of a COI, as well as the fielding of real information sharing capabilities.
Readers new to COIs, in the process of organizing a COI, or belonging to a newly-
formed COI should consult this chapter.
COIs may take various forms and are not intended to be "one size fits all." These
groups can differ in how they operate, the timelines for their actions, the duration of their
existence, how they are governed, and whether or not they demonstrate information
sharing capabilities through pilot activities before operational use. As such, COIs should
determine what activities, and associated levels of effort, are necessary to ensure
sufficient governance and management of the COI.
The COI "Establish and Evolve" activity area focuses on identifying the purpose for a
community, identifying the community's needs, and establishing a COI to work toward
meeting those needs. The initial step in forming a COI is to identify a potential need for
such a group, the mission, and potential membership. In addition, before establishing a
new COI, potential members should identify other organizations and/or COIs that may
be addressing the same or similar problem area.
If a similar COI exists and there is considerable semantic overlap in the identified
problem area, potential members should reach out to the existing COI to leverage its
work and investigate opportunities for collaboration. Assuming that a new COI is
required, the process of establishing a new COI will involve the activities below.
Identify related COIs. Communities should use the COI Directory to identify related
efforts, coordinate governance forums, and share experiences. This directory
maintains a listing of all DoD COIs that register, and
provides visibility into their activities. Identification of other COIs can both inform the
decision to establish a new COI and identify information sharing possibilities once a new
COI has been established.
Advertise the COI. To ensure that DoD users can discover the existence and mission
of a COI and have the opportunity to participate, a member of the COI should register
the COI in the COI Directory. To register, COIs should provide their name, point of
contact, mission, status, COI lead, and proposed governing authority.
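Registration is essentially a small structured record. A hypothetical sketch of such an entry and a completeness check (the field names follow the list above; the values and the directory's actual schema are invented for illustration):

```python
# Hypothetical COI Directory entry; values are invented for illustration.
coi_entry = {
    "name": "Meteorology COI",  # an example COI named earlier in the text
    "point_of_contact": "poc@example.mil",
    "mission": "Shared vocabulary and exchange of weather data",
    "status": "forming",
    "coi_lead": "TBD",
    "proposed_governing_authority": "TBD",
}

# The fields the text says a registration should provide.
required = {"name", "point_of_contact", "mission", "status",
            "coi_lead", "proposed_governing_authority"}
missing = sorted(required - coi_entry.keys())
print("registration complete" if not missing else f"missing fields: {missing}")
```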
sharing capabilities as a result of reusing existing data assets rather than re-creating
data. Instituting measures of success helps ensure that the Enterprise continues to
invest in those opportunities that provide value to the Enterprise.
Continually gather user feedback. COI members should strive to meet user needs,
measure the value achieved through information sharing, and work with stakeholders to
identify near-term information sharing capabilities. As the COI evolves, so will
stakeholder priorities and needs. Periodically, members should reassess activities to
ensure that the COI is continuing to provide value and that it continues to address the
COI's mission with needed capabilities. This reassessment would include its support for
net-centric information sharing across the Department of Defense. COI members should
assess metric results to determine when the COI has achieved its mission and should
disband or turn over operations to continuing organizations.
7.4.2.2.2.2. COI Management and Governance Implementation Activities
Select a COI lead. The COI lead is the point of contact and action officer for COI
activities. This role differs from that of the governing authority in that the COI lead is
responsible for the day-to-day functioning of the COI but should be in a position to
influence agreements and to help address issues that affect multiple DoD Components.
The COI lead interfaces with the COI governing authority to report status, resolve
issues, promote COI agreements, and make recommendations on DoD Components'
plans and schedules. Other responsibilities include leading regular meetings;
establishing working groups, as needed; identifying other potential members; acting as
a liaison to the portfolio manager or other governing authority; coordinating with the
relevant program or system managers; collaborating with other COIs to reuse metadata
artifacts; and helping to mitigate any conflict within the COI.
Clarify relationships between groups involved in the COI. Although COI members
share a mission, establishing a clear understanding of information sharing relationships
among members, rather than assuming that such an understanding already exists, will
help shape COI responsibilities and direction.
implementing policy.
Assess reusability of other resources. Using the DoD Data Services Environment
(DSE) (CAC-required), communities should identify opportunities for semantic and
structural metadata reuse. COIs should also consult other COIs for opportunities to
capitalize on operational data access services that can enrich their data sets and,
potentially, be integrated into their data sharing capabilities (e.g., a COI can build a new
capability using another COI service that is already in place).
Forward Planning. COIs should plan for the long-term maintenance of COI metadata
artifacts, including taxonomies and schemas, in consideration of other organizations
that have built services that depend on these artifacts. A COI that is not planned for
long-term continuation should consult with the lead DoD Component organization or
governing authority to develop a plan for long-term maintenance, including
configuration management.
COIs play a key role in implementing net-centric data sharing across the DoD. The
mission-focused and typically joint nature of COIs enables the identification and
development of net-centric information sharing capabilities that are of greatest value to
DoD users. Through pilots and operational information sharing capabilities, members of
COIs can demonstrate the mission value of using cross-Component data sources.
The "Capability Planning and User Evaluation" activity area focuses on defining an
information sharing capability that the COI needs, working with DoD Components to
implement the capability, and integrating it into ongoing operations. In some cases,
COIs, through their members and associated programs, systems, and data sources,
may develop pilot capabilities before engaging in full deployment of a capability. When
planning for information sharing capabilities, COI members should define a set of
requirements for the capability developers (associated with a program of record or
organization with data assets and budget). Associated programs of record inform DoD
processes as appropriate when planning for information sharing capabilities. Capability
developers are responsible for turning the requirements into a physical implementation
of data assets and services in accordance with COI agreements.
The overall goal of these activities is to assist a COI in evolving net-centric information
sharing capabilities. Through these activities, COIs should actively identify information
sharing needs and integrate new capabilities supporting known needs of the COI, as
well as provide readily discoverable and understandable information to authorized but
unanticipated users.
Identify the approach for delivering the capabilities. Community of Interest (COI)
members must consider the normal certification and test processes when determining
whether information sharing capabilities will be piloted or offered for operational use.
The COI should base its approach on many factors, including technical and operational
risk and the life-cycle stage of the data assets involved. For example, a COI may decide
to develop a pilot capability that exposes data from existing systems in order to create a
new asset before pursuing operational fielding of the capability. Leveraging exposed
data from existing systems (instead of targeting programs in the new cycle) may enable
the COI to field a capability faster and provide more immediate benefits to users.
7.4.2.2.3.3. Forward Planning
Check user satisfaction. As part of the ongoing feedback loop, COIs should make
data regarding the information sharing capability implementation available and
accessible to consumers of the community's data, and gather input from these users.
Gathering consumer, or user, input will enable the COI to gauge user satisfaction and
determine whether the capability meets user needs and expectations.
Capture lessons learned by the COI. Capturing and communicating lessons learned
is a key part of the COI's governance responsibilities. Lessons learned provide current
and future best practices, baseline financial data, and other valuable insight into
the fielding of new information sharing capabilities. Although there is no one-size-fits-all
approach, COIs should leverage all available resources to avoid repeating past
mistakes and duplicating current efforts. COIs should also plan to meet regularly with
the appropriate portfolio manager and other stakeholders to review implementation
results.
See Enclosure 3 of DoD Directive 8320.02, "Sharing Data, Information, and Information
Technology (IT) Services in the Department of Defense," March 2013.
Making data, information, and IT services visible, accessible, understandable, and
trusted are the cornerstone goals indicated in the tables above. The creation of
duplicative data and redundant capabilities often results from consumers' inability to
locate, access, understand, or trust that existing assets meet their needs. Enclosure 3
describes procedures to guide Communities of Interest (COIs) in implementing these
cornerstones.
The Data Services Environment (DSE) is an integrated dashboard that brings together
the existing capabilities (Metadata Registry [MDR], Net-Centric Publisher [NCP],
Service Discovery [SD], Enterprise Authoritative Data Source [EADS] Registry) into a
common modular framework. It contains the structural and semantic metadata artifacts
critical to successful development, operation, and maintenance of existing and future
capabilities that support the DoD Net-Centric Data Strategy and Net-Centric Services
Strategy. Its goal is to facilitate information sharing across the DoD, allowing users to
publish and discover information resources, data sources, and services. DISA maintains
and operates the DSE under the direction and oversight of DoD CIO.
The Data Services Environment provides one-stop access to DoD data source
directories to improve search, access, consistency, and integration of data services as
well as to increase collaboration amongst data producers and consumers.
         files, supporting message formats (e.g., XML Schemas), as well as descriptive and
         informative documentation supporting those assets.
    •    Services and service metadata including service end points, service POCs, and
         service PMO.
    •    Authoritative data sources including systems, data stores and capabilities that
         fulfill particular data needs.
    •    DDMS records that include the core discovery information required by the DDMS
         Specification and publish that information to the enterprise catalog.
Users within the DoD Enterprise can discover and leverage various enterprise service
offerings, and they can discover the authoritative data sources that fulfill their data
needs. Developers can locate information such as service offerings, service
specifications, and taxonomical information, and they can readily reuse these existing
entities to save time and avoid duplication of effort.
The discovery metadata may also include elements defined as COI extensions
described in the DDMS. These elements are related to the subject matter of the
asset, and are necessary for specialist consumers in a particular subject matter to
locate relevant data assets.
Identify assets to share. Members of the community of interest (COI) should build a
prioritized list of the assets the COI will initially make visible to the DoD via the Data Services
Environment (DSE). The list should include descriptive information on each of the
identified data assets such as POC information, including email addresses and
telephone numbers; name of proposed or existing data access service and any related
information resources; and a high-level narrative description. The primary candidates for
the initial visibility effort should be the COI's current operational data assets, followed by
mature developmental capabilities that are on a rapid deployment track to fill known
mission data gaps and information needs. Prioritization occurs at the COI's discretion,
taking into consideration organizational preparedness, technical ease of service
implementation, law, policy and security classification restrictions, impact of broader
access on the COI's operations, and the quantitative and qualitative improvements that
might result from making a particular asset visible.
Define and register COI extensions for discovery metadata. One core purpose for
COIs is to foster agreements on the meaning and physical representation of their
assets, as packaged and offered in deployed services. This includes agreement on
any metadata necessary to properly describe the community's data, information, and IT
service assets. The DDMS provides the minimum discovery metadata
requirements to support enterprise discovery of these assets and can be extended by
COIs to provide additional context that aids in the search for relevant data assets.
Leverage work from other COIs. COIs should leverage the DSE to access guidance
on technical, organizational, and procedural approaches to data asset publication. Other
available information includes specific DDMS extensions registered by other COIs, data
schemas for carrying product payload, taxonomies, and other data engineering artifacts.
These models can provide a starting point for the COI efforts to reach agreement on
common elements that will be important for users to discover COI data assets.
Additional information regarding COIs that have registered metadata in the DoD
Metadata Registry may be available in the COI Directory.
         metadata that describes the file. Software automation of this task is highly
         recommended; however, the precise mechanism will depend on the type of data
         asset and granularity of description. The DDMS provides the minimum required
         structure and content for discovery-related tags. Adhering to this specification
         when tagging ensures that the minimum discovery metadata necessary to
         participate in federated searches will be available.
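The automated tagging step recommended above can be sketched in a few lines. This is a minimal illustration only: the element names and the helper function are assumptions for the example and do not reproduce the normative DDMS schema.

```python
import xml.etree.ElementTree as ET
from pathlib import Path

def build_discovery_record(path: Path, title: str, description: str) -> str:
    """Generate a notional discovery-metadata record for a single file.

    The tags below are illustrative placeholders; a production record must
    conform to the DoD Discovery Metadata Specification (DDMS)."""
    record = ET.Element("resource")
    ET.SubElement(record, "identifier").text = path.name
    ET.SubElement(record, "title").text = title
    ET.SubElement(record, "description").text = description
    return ET.tostring(record, encoding="unicode")
```

A batch job could walk a directory of data assets, call a generator like this for each file, and publish the resulting records to the enterprise catalog.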
A COI should establish, as part of its plan for long-term maintenance of COI metadata
artifacts, a plan for maintaining the discovery metadata, the COI extensions to the DoD
Discovery Metadata Specification, and the service discovery metadata. The goal is to
make data visible as soon as possible and to develop those resources over time. The
COI should agree on a schedule and process for how it will maintain the discovery
metadata, to ensure that the data is always the most current.
7.4.2.3.1.3. Making Data, Information, and IT Services Accessible
Making data, information and IT services accessible focuses on offering data assets
over the network through commonly supported access methods. While making data,
information and IT services visible involves creation and use of discovery metadata,
making these assets accessible refers to providing access to the underlying information
provided by the asset so that authorized DoD users can make use of it.
This section describes activities that aid in implementing paragraph 4.3 of DoD Directive
8320.02, "Data Sharing in a Net-Centric Department of Defense," December 2, 2004.
Individually negotiated interfaces between systems are brittle and inflexible; they
support only the information transfers anticipated during development, not the "pull-on-
demand" transfers that are a key part of net-centric data sharing. While point-to-point
interfaces will continue to exist, DoD CIO Memorandum "DoD Net-Centric Data
Strategy," May 9, 2003, emphasizes the need to transition those interfaces and
implement new interfaces to support many-to-many information exchanges and
authorized but unanticipated users. Data, information, and IT service producers should
make data assets accessible using web-based approaches, minimizing the need for
predefined, engineered point-to-point interfaces wherever operationally and technically
possible.
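A minimal sketch of such a web-based access mechanism, written as a WSGI application so any HTTP client can pull an asset description on demand rather than relying on a pre-negotiated point-to-point interface. The asset catalog and its fields are invented for the example.

```python
import json

# Notional in-memory data asset catalog; the names and fields are
# illustrative assumptions, not drawn from any DoD system.
ASSETS = {"track-feed": {"classification": "UNCLASSIFIED", "format": "JSON"}}

def data_access_app(environ, start_response):
    """Serve asset descriptions over HTTP so authorized consumers can
    pull data on demand instead of using an engineered interface."""
    asset_id = environ.get("PATH_INFO", "/").lstrip("/")
    asset = ASSETS.get(asset_id)
    if asset is None:
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"unknown asset"]
    body = json.dumps(asset).encode("utf-8")
    start_response("200 OK", [("Content-Type", "application/json")])
    return [body]
```

Hosted under `wsgiref.simple_server` (or any WSGI container), this would let previously unanticipated consumers discover and retrieve the asset without a bespoke exchange agreement.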
Understand asset sharing constraints. The COI should identify any existing policies,
laws, or data classifications that would restrict access to the data across the Enterprise.
Traditional data access mechanisms contain many implicit rules indicating how
systems respond to requests, based on how each request falls into a predefined
handling process. Therefore, in addition to identifying explicit restrictions on
access, the COI should also consider the potential for (and attempt to discern) built-in
role-based access control systems. COIs should maintain awareness of evolving DoD
information assurance, information security, and information sharing policies, and
incorporate them as appropriate into COI activities and implementations.
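The two kinds of constraints described above (explicit legal or policy restrictions, and implicit role-based rules) can be sketched as a pre-release check. The policy names, roles, and classification levels below are illustrative assumptions, not DoD policy.

```python
# Illustrative constraint tables; real values come from law, policy,
# and security classification guidance, not from this sketch.
RELEASABLE_LEVELS = {"UNCLASSIFIED"}       # levels this COI may expose enterprise-wide
RESTRICTED_BY_LAW = {"personnel-records"}  # asset ids withheld under explicit policy

def may_expose(asset_id: str, classification: str, requester_roles: set) -> bool:
    """Apply explicit restrictions first, then a role-based check,
    mirroring the two kinds of constraints a COI should identify."""
    if asset_id in RESTRICTED_BY_LAW:
        return False                       # explicit legal/policy restriction
    if classification not in RELEASABLE_LEVELS:
        # implicit role-based rule: only suitably cleared consumers
        return "cleared-consumer" in requester_roles
    return True
```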
Discover enterprise resources. The COI should leverage work products of other
COIs, operational data access mechanisms that are available, and available net-centric
interface standards and specifications.
    •    Enterprise Considerations. The COI can promote access mechanism reuse, and
         minimize the work required to obtain desired capabilities by collaborating with
         other COIs. In addition, the COI can make its own data accessible on an
         enterprise scale by adhering to existing technical standards. Interfaces
         developed using standard interface specifications enable COI-developed access
         mechanisms to exchange information readily with enterprise services resulting in
         wider access to the community's data assets.
    •    Technical Guidance. The Key Interface Profiles are the set of documentation
         produced as a result of interface analysis that designates an interface as key;
         analyzes it to understand its architectural, interoperability, test, and configuration
         management characteristics; and documents those characteristics in conjunction
         with solution sets for issues identified during analysis.
Identify assets to make accessible. The COI should determine which assets within
the associated organizations, programs of record, sub-portfolios, etc., are likely to be of
most value to those inside and outside the COI, taking into account the potential for
authorized but unanticipated users. The assets that the COI makes accessible will
typically be a necessary component of the new information sharing capability identified
by the COI.
Define requirements for access mechanisms. The COI should define the priority of
and functional requirements for data access mechanisms. Depending on the situation,
the COI may base these requirements on an existing data access mechanism or
establish them as part of an ongoing implementation plan. In setting requirements for
data access mechanisms, the COI should take into account the type of assets; the
security, license, and privacy considerations; and the static, dynamic, or streaming
nature of data change. The data access mechanism specifications should conform to
any agreements put forward by the stakeholders and the COI.
         The DoD Information Technology Standards Registry (DISR), according to DoD
         Directive 4630.05, provides the minimal set of rules
         governing the arrangement, interaction, and interdependence of system parts or
         elements whose purpose is to ensure that a conformant system satisfies a
         specified set of requirements. The standards and guidelines in the DISR are
         stable, technically mature, and available via DISRonline.
Data that is visible and accessible is still not usable unless it is understandable. The Assistant Secretary of
Defense for Networks and Information Integration/DoD Chief Information Officer
Memorandum " DoD Net-Centric Data Strategy," May 9, 2003, provides for the
existence of expedient communities of interest (COIs) that may have diverse needs,
based on operational requirements. It is therefore not always safe to assume that
consumers will be familiar with what a COI's data, information, and IT services mean,
how they are structured, or how they fit into the COI's operational context. Most
important, it is not necessarily the case that all consumers will be using these assets in
the same way or for the same purpose. For example, "a tank" in the Army might refer to
an armored vehicle, whereas "a tank" in the Navy might refer to a storage device for
fluids. Although the data producer's perspective might be reasonable within the
producer's context, the consumer might have a very different purpose in mind.
Gather existing semantic metadata. The DSE will contain vocabularies, taxonomies,
ontologies, conceptual data schemas, and other forms of semantic metadata from other
COIs upon which the COI might base development of its own semantic metadata. In
addition, the COI should discover existing semantic metadata among its members. In
this way, the COI can start the process with a foundation in related semantics.
Gather existing structural metadata. The DSE also contains logical and physical data
schemas that could aid the COI in forming structural representations that would be
understandable to end-users. Data asset structure (such as whether dates are
represented in a standard calendar format or as Julian dates) is an important aspect of
understanding. By using the DSE and consulting COI members, the COI can start the
process with a foundation in related structures.
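The Julian-date example shows why structure must be declared: the same calendar day reads entirely differently under an ordinal (year plus day-of-year) layout than under a standard format. A small sketch using Python's standard library:

```python
from datetime import datetime

# Without declared structural metadata, a consumer cannot tell whether
# "2013259" is a Julian (year + day-of-year) date or something else entirely.
def parse_julian(value: str) -> datetime:
    """Interpret a date encoded in the ordinal 'YYYYDDD' layout."""
    return datetime.strptime(value, "%Y%j")

def parse_standard(value: str) -> datetime:
    """Interpret a date encoded in the common 'YYYY-MM-DD' layout."""
    return datetime.strptime(value, "%Y-%m-%d")
```

Day 259 of 2013, for instance, is the same instant as 16 September 2013; only the declared structure makes the two strings comparable.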
Develop a shared understanding of COI data made visible. COI members, pooling
subject matter expertise, should collaborate on several semantic metadata artifacts that
are crucial for providing context and meaning to any COI data that is made visible and
accessible.
Agree on a shared vocabulary. The COI should use its own extensions to the DDMS
as a starting point for the shared vocabulary. As a set of terms and definitions, the
shared vocabulary should include any term used in the COI extensions, along with
definitions that put these and other terms into proper COI context.
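A context-qualified shared vocabulary of this kind can be sketched as a simple lookup table. The context names and definitions below echo the earlier "tank" example and are illustrative only.

```python
# Minimal sketch of a shared vocabulary keyed by (term, context);
# the contexts and definitions are invented for illustration.
VOCABULARY = {
    ("tank", "ground-vehicles"): "An armored combat vehicle.",
    ("tank", "shipboard-storage"): "A container for storing fluids aboard a vessel.",
}

def define(term: str, context: str) -> str:
    """Resolve a term's definition within an agreed COI context,
    so consumers are not left guessing which meaning applies."""
    return VOCABULARY[(term.lower(), context)]
```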
Agree on a conceptual data schema. The conceptual data schema indicates high-
level data entities. Its coverage includes any entities in visible COI data assets, as well
as the relationships between those data entities. The conceptual schema's coverage
area may include multiple data assets, requiring that the COI come to an agreement on
how members will collaborate, possibly through a COI data working group, to develop
the conceptual schema.
         Registry and following the instructions for submission.
Determine how the community of interest (COI) will maintain metadata artifacts.
As the COI develops over time, the shared vocabulary, COI taxonomy, and other
metadata artifacts that enable understandability should remain synchronized with the
subject area they represent. To help it attain this objective, a COI could institute rules
relating to how shared vocabulary updates occur. In addition, COI governance should
be consulted for configuration management standards and related maintenance
schedules.
Ensure that data structure meets the consumers' needs, including those of
unanticipated users. The physical structure of the data affects how the consumer will
understand and utilize the data. Because it is not possible to know the unanticipated
uses and needs of the data, COIs can engage in ongoing planning to change the
structure of the data as it is exposed to the consumer via the access mechanism. Note
that this sort of change represents a change to the access mechanism, not necessarily
a change to the underlying data asset. Such changes can be meaningful only if they are
made with consideration for user feedback.
7.4.2.3.1.5.2. Forward Planning
A consumer that can locate, access, and understand a particular data, information, or IT
service asset will want to assess the authority of that asset to determine whether
the contents can be trusted. Promoting trust focuses on identifying sources clearly and
associating rich pedigree and security metadata with the assets to support the
consumer's trust decision.
While COIs can promote trust through implementation of the activities described in this
section, this Guidebook does not provide COIs the authority to share information in any
way that is prohibited by law, policy, or security classification.
Identify authoritative data sources. The community of interest (COI) should make
every effort to identify data assets that are authoritative sources for data, as well as
identifying in what contexts the data is authoritative. In situations where there is more
than one authoritative source, depending on how the data is used, the COI should
indicate the business process for which the authority is valid.
Associate trust discovery metadata with data, information, and IT service assets.
The COI should include trust discovery metadata to support data consumers' decisions
on which assets are appropriate for their use. There are three categories of trust
discovery metadata. These are discussed in the following subparagraphs.
    •    Asset pedigree metadata. The source and lineage of an asset are its pedigree.
         The purpose of the pedigree is to enable consumers to determine whether the
         asset is fit for their intended use and to enable them to track the flow of
         information, its transformations, and modifications, through assets. Notional
         metadata describing an asset's pedigree would include creation date,
         modification date, processing steps (including methods and tools), source and
         author (if known), status, and validation results against a published set of
         constraints.
    •    Security labels. Security labels provided in discovery metadata enable services
         to restrict access to data assets on the basis of a COI's identified parameters,
         including classification and dissemination controls. Preventing unauthorized
         access to data assets is important to promote trust in the data among authorized
         users.
    •    Rights protection metadata. Rights protection metadata refers to
         metadata that indicates any copyright, trade secret, trademark, licensing,
         proprietary information, Privacy Act, or other usage restriction. As such, it may
         not be appropriate for all assets. Nevertheless, where this metadata does apply,
         it is important that it be provided. Consumers and data access services can only
         protect data against inappropriate use if they are informed of restrictions.
    •    Technical Guidance. The DoD Discovery Metadata Specification (DDMS)
         references the security elements found in the Intelligence Community Metadata
         Working Group document, specifying 18 attributes that can be used for
         classification and controls markings. The DDMS category named
         "Security" contains relevant elements addressing classification and
         dissemination. The "Source" category contains elements for asset pedigree
         metadata, and the "Rights" category contains applicable elements for rights
         protection metadata. The COI can obtain background on security tagging by
         checking the Intelligence Community Metadata Standard for Information Security
         Markings and accessing the Data Element Dictionary.
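The three trust-metadata categories can be sketched as one grouped structure attached to an asset. The field names are illustrative assumptions, not the DDMS "Source", "Security", and "Rights" element sets.

```python
from dataclasses import dataclass, field

# Notional grouping of the three trust-metadata categories; the keys a COI
# would actually populate come from the DDMS, not from this sketch.
@dataclass
class TrustMetadata:
    pedigree: dict = field(default_factory=dict)  # source, lineage, processing steps
    security: dict = field(default_factory=dict)  # classification, dissemination controls
    rights: dict = field(default_factory=dict)    # copyright, Privacy Act, licensing

    def usage_restrictions(self):
        """List declared restrictions so a consumer (or access service)
        can honor them before using the asset."""
        return sorted(self.rights)
```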
Because a data asset can be trusted only if its contents are sufficiently accurate and of
sufficiently reliable quality, assessing and improving data asset quality is important.
Quality assertions about data include information on its accuracy, completeness, or
timeliness for a particular purpose. For example, consumers might need to know the
age of the data to determine whether it is trustworthy, or they might need to know how
accurate estimates and figures within the data asset are. Typically, such metadata
results from a separate data quality analysis of an asset. The community of interest
(COI) may develop an ongoing process for auditing the quality of data assets that are
made visible and accessible. This process should be designed in concert with the COI
leadership's ongoing quality assurance and configuration management efforts.
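An audit process of the kind described above might begin with a simple completeness measure. This sketch assumes records are represented as dictionaries and checks only presence of required fields, not accuracy or timeliness.

```python
# A minimal, assumed completeness metric for a data quality audit;
# a real analysis would also assess accuracy and timeliness.
def completeness(records, required):
    """Fraction of required fields that are present and non-empty
    across all records in a data asset."""
    total = len(records) * len(required)
    if total == 0:
        return 1.0
    filled = sum(1 for r in records for f in required if r.get(f))
    return filled / total
```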
7.4.3. Integrating Net-Centric Information Sharing into the Acquisition Life Cycle
7.4.3.5. Govern Data Activities
A description of the program's approach for ensuring that information assets will be
made visible, accessible, and understandable to any potential user as early as possible
(DoD Directive 8320.02, "Sharing Data, Information, and Information Technology (IT)
Services in the Department of Defense," March 2013). The recommended scope of data,
information, and IT services activities follows:
Evaluate information from sources such as compliance reports, incentive plan reports,
policy, and user needs to create net-centric data, information and IT services guidance
documents. This guidance is the policy, specifications, standards, etc., used to drive
activities within the program/organization. It differs from a net-centric data or services
plan in that the plan is more strategic in nature. Data guidance may be a subset of an
overall net-centric data sharing plan.
Program to insert its metadata requirements into an overall Domain metadata model).
Build upon existing and revised architectures and plans to describe the architecture to
support data sharing objectives. The architecture should depict components that
emphasize the use of discovery, a services-based approach to systems engineering, the
use of metadata to support mediated information exchange, web-based access to data
assets, etc.
Determine what data assets (documents, images, metadata, services, etc.) are
produced or controlled within a program or organization. This is primarily an inventory of
data assets, which should include both structured and unstructured data sources as well
as IT services.
Assess the data asset inventory to identify key data products that are of greatest value
to known users and are likely to be of value to unanticipated users. This list should be
used to determine data assets a program/organization should make initial efforts at
exposing as enterprise data assets.
Responsibilities: Both Sponsor/Domain Owners and PMs should analyze and prioritize
which data assets are most valuable, initially, to be exposed as enterprise data assets.
Define Communities of Interest (COIs)
Identify appropriate groups of people who should come together to support common
mission objectives. COIs are an appropriate construct for defining information exchange
formats and metadata definitions as well as vocabularies used to communicate within
the COI. This activity does not include the establishment of actual COIs. This is simply
the process of identifying COIs that exist or should exist.
Responsibilities: Sponsors/Domain Owners should define major COIs that could benefit
missions within the Domain (and across Domains). PMs should identify other COIs that
serve the goals of the program and its associated functional areas.
directories include Universal Description, Discovery, and Integration Directories used to
maintain Web Services information. This is a key component of establishing a services-
based architecture that supports net-centric data tenets.
Post data assets to an information sharing application (e.g., end-user web site, a file
system, a document repository) or through the use of web services to provide system-
to-system access, etc.
Manage COIs
Responsibilities: Both Sponsor/Domain Owners and PMs should establish, register, and
maintain identified COIs.
Associate or generate discovery metadata for data assets. This activity is the 'tagging'
of data assets to provide value-added information about data assets that can be used to
support discovery, accessibility, information assurance, and understandability.
Responsibilities: PMs should ensure that discovery metadata associated with each data
asset is posted to searchable metadata catalogs (established by the Domain and by
Programs).
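The tagging-and-posting activity described above can be sketched as follows. The metadata fields loosely mirror common discovery elements (title, creator, subject), and the in-memory catalog is a stand-in for a Domain- or Program-established metadata catalog, not a real DoD catalog interface:

```python
catalog = []  # stand-in for a searchable, Domain-established metadata catalog

def tag_and_post(asset_id, title, creator, keywords):
    """'Tag' a data asset with discovery metadata and post the record."""
    record = {
        "assetId": asset_id,
        "title": title,
        "creator": creator,
        "subject": list(keywords),
    }
    catalog.append(record)
    return record

def search(term):
    """Discovery: find posted records whose title or subject mentions the term."""
    term = term.lower()
    return [r for r in catalog
            if term in r["title"].lower()
            or any(term in k.lower() for k in r["subject"])]

tag_and_post("asset-001", "Daily Track Summary", "Program X", ["tracks", "surveillance"])
tag_and_post("asset-002", "Logistics Status Report", "Program X", ["logistics"])
print([r["assetId"] for r in search("tracks")])
```

The point of the sketch is the separation of concerns: producers tag and post once, and unanticipated users discover assets through the catalog rather than through point-to-point arrangements.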
Participate in governance activities that enable net-centric data asset sharing. This
includes participation in DoD IE Enterprise Service efforts, net-centric architectural
compliance, Capabilities Portfolio Management for net-centric information sharing, etc.
This activity involves vetting, publicizing, and institutionalizing the Net-Centric Data
Sharing plans and guidance developed in the Data Planning process.
Responsibilities: Both Sponsor/Domain Owners and PMs should advocate the DoD Net-
Centric Data Strategy and Domain-established data guidance.
7.4.4. Supporting Language for Information Technology (IT) System
Procurements
To ensure support of the goals of NCDS, the Program Manager (PM), through his or her
contracting specialists, should include the following sections, as appropriate, in Request
for Proposal (RFP)/Request for Quotation (RFQ) language for the procurement of IT
systems.
The contractor shall ensure that any IT systems covered in this procurement or
identified in this RFP/RFQ support the goals of the Defense Information Enterprise
Architecture 1.1, dated May 27, 2009, and its subsequent official updates and revisions.
The contractor shall ensure that any IT systems covered in this procurement or
identified in this RFP/RFQ support the goals of the DoD Net-Centric Data Strategy
dated May 9, 2003, and comply with the Department's data strategy as defined in DoD
Directive 8320.02, "Sharing Data, Information, and Information Technology (IT) Services
in the Department of Defense," March 2013. The contractor shall ensure that any IT
systems covered in this procurement or identified in this RFP/RFQ support the goals of
the DoD Net-Centric Services Strategy, "Strategy for a Net-Centric, Service Oriented
DoD Enterprise," March 2007.
Also, the contractor must ensure that any IT systems covered in this procurement or
identified in this RFP/RFQ meet the requirements detailed below. It is acceptable for
vendors and/or integrators to provide functionality (via wrappers, interfaces, or
extensions) that tailors the COTS system to meet these requirements; that is, the COTS
system need not be modified internally if the vendor/integrator enables the requirements
through external or additional mechanisms. In that case, these mechanisms must be
acquired along with the COTS system procurement.
    •    Access to Data: The contractor shall ensure that all data managed by the IT
         system can be made accessible to the widest possible audience of DIE users via
         open, web-based standards. Additionally, the system's data should be accessible
         to DIE users without 1) the need for proprietary client-side software/hardware, or
         2) the need for licensed user access (e.g., non-licensed users should be able to
         access the system's data independent of the licensing model of the COTS
         system). This includes all data that is used to perform mission-related analysis
         and processing including structured and unstructured sources of data such as
         databases, reports, and documents. It is not required that internal, maintenance
         data structures be accessible.
    •    Metadata: The contractor shall ensure that all significant business data made
         accessible by the IT system is tagged with descriptive metadata to support the
         net-centric goal of data visibility. Accordingly, the system data shall be tagged to
         comply, at a minimum, with the DoD Discovery Metadata Specification (DDMS).
         This specification is available at: http://metadata.ces.mil/dse/irs/DDMS/ . The
         system should provide DDMS-compliant metadata at an appropriate level based
         on the type of data being tagged. It is not required that individual records within
         databases be tagged; rather it is expected that the database itself or some
         segment of it is tagged appropriately. Additionally, the contractor shall ensure
         that all structural and vocabulary metadata (metamodels, data dictionaries)
         associated with the exposed system data be made available in order to enable
         understanding of data formats and definitions. This includes proprietary metadata
         if it is required to effectively use the system data.
    •    Enterprise Services/Capabilities: The contractor shall ensure that key business
         logic processing and other functional capabilities contained within the IT system
         are exposed using web-based open standards (e.g., application programming
         interfaces provide for Web Services-based access to system processes and
         data). The level of business logic exposure shall be sufficient to enable
         reuse/extension within other applications and/or to build new capabilities. The
         contractor shall provide an assessment of how any licensing restrictions affect or
         do not affect meeting the goals of re-use and exposure as DoD Information
         Enterprise-wide enterprise services.
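The Access to Data requirement can be illustrated with a minimal sketch: serving a system's records as JSON over plain HTTP using only the Python standard library, so a client needs no proprietary software. The endpoint path and record contents are invented for the example:

```python
import json
from http.server import BaseHTTPRequestHandler

# Illustrative mission data; a real system would draw from its databases,
# reports, and documents (internal maintenance structures are out of scope).
RECORDS = [{"id": 1, "status": "operational"},
           {"id": 2, "status": "maintenance"}]

class DataHandler(BaseHTTPRequestHandler):
    """Expose system data via an open, web-based standard (HTTP + JSON)."""
    def do_GET(self):
        if self.path == "/records":  # assumed endpoint path for this sketch
            body = json.dumps(RECORDS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)
```

Any HTTP-capable client can now read the data, which is the essence of making assets accessible to the widest possible audience of DIE users without licensed or proprietary client software.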
Optional Components/Modules: The contractor shall ensure that all standard and/or
optional components of the IT system are identified and procured in a manner that
ensures the requirements outlined in this document are met.
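As a rough illustration of the Metadata requirement above, the sketch below builds a simplified discovery record as XML. The element names are illustrative only and do not reproduce the actual DDMS schema, which should be taken from the specification itself:

```python
import xml.etree.ElementTree as ET

def discovery_record(title, creator, description):
    """Build a simplified, DDMS-like discovery record for one data asset.
    Element names here are assumptions, not the real DDMS vocabulary."""
    root = ET.Element("Resource")
    ET.SubElement(root, "title").text = title
    ET.SubElement(root, "creator").text = creator
    ET.SubElement(root, "description").text = description
    return ET.tostring(root, encoding="unicode")

# Tagging at the database (segment) level, per the guidance that individual
# records need not be tagged.
record = discovery_record("Parts Inventory Database",
                          "Example Program Office",
                          "Segment-level record for the parts inventory database.")
print(record)
```

In practice the record would be validated against the DDMS schema and posted to a searchable metadata catalog alongside the structural and vocabulary metadata (metamodels, data dictionaries) for the exposed data.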
7.5.2.8. DoD Instruction 8581.01, "Information Assurance (IA) Policy for Space
Systems Used by the Department of Defense"
Most programs delivering capability to the warfighter or business domains will use
information technology (IT) to enable or deliver that capability. For those programs,
developing a comprehensive and effective approach to IA is a fundamental requirement
and will be key in successfully achieving program objectives. The Department of
Defense defines IA as "measures that protect and defend information and information
systems by ensuring their availability, integrity, authentication, confidentiality, and non-
repudiation. This includes providing for the restoration of information systems by
incorporating protection, detection, and reaction capabilities." DoD policy and
implementing instructions on information assurance are in the 8500 series of DoD
publications. Program Managers (PMs) and functional proponents for programs should
be familiar with statutory and regulatory requirements governing information assurance,
and understand the major tasks involved in developing an IA organization, defining IA
requirements, incorporating IA in the program's architecture, developing an Acquisition
IA Strategy (when required), conducting appropriate IA testing, and achieving IA
certification and accreditation for the program. The information in the following sections
explains these tasks, the policy from which they are derived, their relationship to the
acquisition framework, and the details one should consider in working towards effective
IA defenses-in-depth in a net-centric environment.
Note: DAG Section 7.5 will be re-written once the re-issued DoDI 8500.01, Cybersecurity,
and DoDI 8510.01, Risk Management Framework (RMF) for DoD Information Technology
(IT), instructions are signed and published. Until then, the current information provided
in Section 7.5 remains valid.
Acquisition managers shall address information assurance requirements for all weapon
systems; Command, Control, Communications, Computers, Intelligence, Surveillance,
and Reconnaissance systems; and IT programs that depend on external information
sources or provide information to other DoD systems. DoD policy for information
assurance of IT, including NSS, appears in DoD Directive 8500.01E.
7.5.2.2. DoD Instruction 5000.02, "Operation of the Defense Acquisition System"
Ensure that the program has an information assurance strategy that is consistent with
DoD policies, standards and architectures, to include relevant standards.
This directive establishes policy and assigns responsibilities under 10 U.S.C. 2224 to
achieve DoD information assurance through a defense-in-depth approach that
integrates the capabilities of personnel, operations, and technology, and supports the
evolution to network centric warfare. According to DoD Directive 8500.01E, all
acquisitions of DoD Information Systems (to include Automated Information System
applications, Outsourced IT-based Processes, and platforms or weapon systems) with
connections to the Global Information Grid must be certified and accredited.
This Directive will be re-written and combined with the revised DoDI 8500.2 and
published in Q4FY13 as DoDI 8500.01. The revised policy will move the DoD to the
Risk Management Framework as implemented by the National Institute of Standards
and Technology (NIST) 800-series Special Publications.
This instruction establishes the DoD information assurance (IA) certification and
accreditation (C&A) process for authorizing the operation of DoD information systems
consistent with the Federal Information Security Management Act and DoD Directive
8500.01E. The instruction superseded DoD Instruction 5200.40 (DITSCAP) and DoD
8510.1-M (DITSCAP Manual). The DIACAP process supports net-centricity through an
effective and dynamic IA C&A process. It also provides visibility and control of the
implementation of IA capabilities and services, the C&A process, and accreditation
decisions authorizing the operation of DoD information systems, to include core
enterprise services and web services-enabled software systems and applications. This
Instruction is under revision, with the new version due Q4FY13. The revised policy will
shift the Department from the current DIACAP process to DoD's adoption,
implementation, execution, and maintenance of the NIST RMF.
This directive establishes policy and assigns responsibilities for DoD IA training,
certification, and workforce management. Along with the accompanying manual, it
provides guidance and procedures for the identification and categorization of positions
and certification of personnel conducting IA functions within the DoD workforce
supporting the DoD Global Information Grid (GIG) per DoD Instruction 8500.2. The DoD
IA Workforce includes, but is not limited to, all individuals performing any of the IA
functions described in the manual.
7.5.2.8. DoD Instruction 8581.01, "Information Assurance (IA) Policy for Space
Systems Used by the Department of Defense"
This strategy recognizes that cyberspace is a key sector of the global economy. The
security and effective operation of U.S. critical infrastructure including energy, banking
and finance, transportation, communication, and the Defense Industrial Base rely on
cyberspace, industrial control systems, and information technology that may be
vulnerable to disruption or exploitation. This strategy notes that foreign cyberspace
operations against U.S. public and private sector systems are increasing in number and
sophistication.
Accordingly, Program Managers must ensure procedures and processes are in place
for the protection of DoD program information residing on or transiting corporate
unclassified networks and information systems. The objective is to protect DoD
information, not just DoD systems, and it relates to all programs, not just those IT
focused. Several policy documents provide additional guidance in this area for inclusion
in developing the IA Strategy and RFP IA clauses.
7.5.3. Information Assurance (IA) Integration into the Acquisition Life Cycle
7.5.3.4. After Milestone C or before the Full Rate Production Decision Review (or
equivalent for MAIS Programs)
developed in preparation for Milestone A will be more general and contain a lesser
level of detail than Acquisition IA Strategies submitted to support subsequent Milestone
decisions. Refer to the Acquisition IA Strategy Instructions.
Update and submit the Acquisition IA Strategy; refer to the Acquisition IA Strategy
Template.
Secure resources for IA. Include IA in program budget to cover the cost of developing,
procuring, testing, certifying and accrediting, and maintaining the posture of system IA
solutions. Ensure appropriate types of funds are allocated (e.g., Operations &
Maintenance for maintaining IA posture in out years).
Test and evaluate IA solutions. See Chapter 9, Test and Evaluation (T&E), for
information on testing.
• Developmental Test.
    •    Security Test & Evaluation, Certification and Accreditation activities.
    •    Operational Test.
Accredit the system under the DIACAP or other applicable Certification and
Accreditation process. For systems using the DIACAP, an Authorization to Operate
should be issued by the Designated Accrediting Authority.
7.5.3.4. After Milestone C or before the Full Rate Production Decision Review (or
equivalent for MAIS Programs)
Maintain the system's security posture throughout its life cycle. This includes periodic
re-accreditation.
Figure 7.5.4.F1 shows the relationship between the acquisition framework and typical
timeframes for accomplishing key IA activities.
7.5.5. Integrating Information Assurance (IA) into the Acquisition Process
           Network Interconnections      Acquisition IA       DoDD 8500.01E
  MC/ME          to the GIG                 Strategy            Compliance
   No                No                  Not Required *      Recommended **
   No                Yes                 Not Required *         Required
   Yes               No                    Required          Recommended **
   Yes               Yes                   Required             Required
Legend: AIS = Automated Information System
GIG = Global Information Grid
IT = Information Technology
MAIS = Major Automated Information System
MC/ME = Mission Critical / Mission Essential
PM = Program / Project Manager
Because requirements for IA vary greatly across acquisition programs, PMs should
examine acquisition programs carefully to identify applicable IA requirements. The
following guidelines derived from DoD Directive 8500.01E apply:
    1. Programs that do not involve the use of IT in any form have no IA requirements.
       PMs should carefully examine programs, however, since many programs have IT
       (such as automatic test equipment) embedded in the product or its supporting
       equipment.
    2. Programs that include IT always have IA requirements, but these IA
       requirements may be satisfied through the normal system design and test
       regimen, and may not be required to comply with DoD Directive 8500.01E.
       Acquisitions that include Platform IT with no network interconnection to the GIG
       fit into this category. However, such programs require an Acquisition IA Strategy
       if they are designated Mission Critical or Mission Essential.
    3. Acquisitions of Platforms with network interconnections to the GIG must comply
       with the IA requirements of DoD Directive 8500.01E and DoD Instruction 8500.2.
    4. Acquisitions of AIS applications or outsourced IT processes also must comply
       with DoD Directive 8500.01E and DoDI 8500.2.
    5. Programs that include IT, and that are designated Mission Critical or Mission
       Essential, require an Acquisition IA Strategy without regard to the applicability of
       DoD Directive 8500.01E. The DoD Component Chief Information Officer (CIO) is
       responsible for approving the Acquisition IA Strategy. Subsequent to DoD
       Component CIO approval, in accordance with DoD Instruction 8580.1, the DoD
       CIO must review the Acquisition IA Strategy.
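The guidelines above reduce to two independent checks: the Mission Critical / Mission Essential designation drives the Acquisition IA Strategy requirement, while a network interconnection to the GIG drives compliance with the 8500-series policy. A sketch of that decision logic (the labels are shorthand for this example, not official terms):

```python
def ia_requirements(mission_critical_or_essential, gig_interconnection):
    """Map a program's MC/ME designation and GIG interconnection status to
    its Acquisition IA Strategy and 8500-series compliance requirements."""
    strategy = "Required" if mission_critical_or_essential else "Not Required"
    compliance = "Required" if gig_interconnection else "Recommended"
    return {"acquisition_ia_strategy": strategy,
            "8500_compliance": compliance}

# A Mission Critical platform with no GIG interconnection (guideline 2):
print(ia_requirements(True, False))
```

Note that this captures only the coarse decision pattern; a real determination still requires examining the program for embedded IT and consulting DoD Directive 8500.01E directly.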
7.5.6. Program Manager (PM) Responsibilities
PMs for acquisitions of platforms with internal IT (including platforms such as weapons
systems, sensors, medical technologies, or utility distribution systems) remain ultimately
responsible for the platform's overall IA protection. If the Platform IT has an
interconnection to the GIG, in accordance with DoD Instruction 8500.2, the PM must
identify all assurance measures needed to ensure both the protection of the
interconnecting GIG enclave, and the protection of the platform from connection risks
(such as unauthorized access), that may be introduced from the enclave. However,
connecting enclaves have the primary responsibility for extending needed IA services
(such as Identification and Authentication) to ensure an assured interconnection for both
the enclave and the interconnecting platform. These IA requirements should be
addressed as early in the acquisition process as possible.
PMs for acquisitions of platforms with IT that does not interconnect with the GIG retain
the responsibility to incorporate all IA protective measures necessary to support the
platform's combat or support mission functions. The definition of the GIG recognizes
"non-GIG IT that is stand-alone, self-contained or embedded IT that is not or will not be
connected to the enterprise network." Non-GIG IT may include "closed loop" networks
that are dedicated to activities like weapons guidance and control, exercise,
configuration control or remote administration of a specific platform or collection of
platforms. The primary test of whether a network is part of the GIG or is non-GIG
IT is whether it provides enterprise or common network services to any legitimate GIG
entity. In any case, PMs for systems that are not connected to GIG networks should
consider the IA program provisions in DoD Directive 8500.01E and DoD Instruction
8500.2, and should employ those IA controls appropriate to their system.
PMs for acquisitions of AIS applications are responsible for coordinating with enclaves
that will host (run) the applications early in the acquisition process to address
operational security risks which the system may impose upon the enclave, as well as
identifying all system security needs that may be more easily addressed by enclave
services than by system enhancement. The baseline IA Controls serve as a common
framework to facilitate this process. The Designated Accrediting Authority for the
enclave receiving an AIS application is responsible for incorporating the IA
considerations for the AIS application into the enclave's IA plan. The burden for
ensuring that an AIS application has adequate assurance is a shared responsibility of
both the AIS application PM and the Designated Accrediting Authority for the hosting
enclave; however, the responsibility for initiation of this negotiation process lies clearly
with the PM. PMs should, to the extent possible, draw upon the common IA capabilities
that can be provided by the hosting enclave.
PMs for acquisitions of Outsourced IT-based Processes must comply with the IA
requirements in the 8500 policy series. They are responsible for delivering outsourced
business processes supported by private sector information systems, outsourced
information technologies, or outsourced information services that present specific and
unique challenges for the protection of the GIG. The PM for an Outsourced IT-based
process should carefully define and assess the functions to be performed and identify
the technical and procedural security requirements that must be satisfied to protect DoD
information in the service provider's operating environment and interconnected DoD
information systems.
    •    In one scenario, the service is hosted at vendor facilities, and accordingly, DoD
         does not have significant control of the operations of the Managed Enterprise
         Service.
    •    In the second scenario, the Managed Enterprise Service is hosted in a DoD
         facility, but operations are provided by one or more vendors. Managed services
         that are DoD Component-wide or that belong to the warfighter or business
         mission areas are outside the scope of Managed Enterprise Services. If your
         acquisition includes Managed Enterprise Services, see DoD CIO Memorandum
         "Certification and Accreditation Requirements for DoD-wide Managed Enterprise
         Services Procurements," dated June 22, 2006.
to developing or purchasing any DoD information system that will collect, maintain, use,
or disseminate PII about members of the public. DoD Instruction 5400.16 provides
procedures for completing and approving PIAs and expands the requirement to
include federal personnel, DoD contractors, and, in some cases, foreign nationals.
      3     These systems handle information that is necessary for the
            conduct of day-to-day business, but does not materially affect
            support to deployed or contingency forces in the                  BASIC      BASIC
            short-term.
The other major component in forming the baseline set of IA controls for every
information system is determined by selecting the appropriate confidentiality level based
on the sensitivity of the information associated with the information system. DoD has
defined three levels of confidentiality, identified in Table 7.5.7.1.T2.
The specific set of baseline IA controls that the PM should address is formed by
combining the appropriate lists of Mission Assurance Category (MAC) and
Confidentiality Level controls specified in the DoD Instruction 8500.2. Table 7.5.7.2.T1
illustrates the possible combinations.
                              Mission Assurance       Confidentiality       DoDI 8500.2 Enclosure 4
          Combination              Category                 Level                 Attachments
               1                    MAC 1                 Classified                1 and 4
               2                    MAC 1                 Sensitive                 1 and 5
               3                    MAC 1                 Public                    1 and 6
               4                    MAC 2                 Classified                2 and 4
               5                    MAC 2                 Sensitive                 2 and 5
               6                    MAC 2                 Public                    2 and 6
               7                    MAC 3                 Classified                3 and 4
               8                    MAC 3                 Sensitive                 3 and 5
               9                    MAC 3                 Public                    3 and 6
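The combination table is a mechanical lookup: one attachment selected by the Mission Assurance Category and one by the confidentiality level, which together identify the baseline IA control sets. A sketch:

```python
# Attachment numbers from DoDI 8500.2 Enclosure 4, per Table 7.5.7.2.T1.
MAC_ATTACHMENT = {"MAC 1": 1, "MAC 2": 2, "MAC 3": 3}
CONFIDENTIALITY_ATTACHMENT = {"Classified": 4, "Sensitive": 5, "Public": 6}

def baseline_attachments(mac, confidentiality):
    """Return the pair of Enclosure 4 attachments whose IA controls are
    combined to form the system's baseline control set."""
    return (MAC_ATTACHMENT[mac], CONFIDENTIALITY_ATTACHMENT[confidentiality])

# Combination 5 in the table: MAC 2, Sensitive.
print(baseline_attachments("MAC 2", "Sensitive"))
```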
There are a total of 157 individual IA Controls from which the baseline sets are formed.
Each IA Control describes an objective IA condition achieved through the application of
specific safeguards, or through the regulation of specific activities. The objective
condition is testable, compliance is measurable, and the activities required to achieve
the objective condition for every IA Control are assignable, and thus accountable. The
IA Controls specifically address availability, integrity, and confidentiality requirements,
but also take into consideration the requirements for non-repudiation and authentication.
A system being acquired may have specific IA requirements levied upon it through its
controlling capabilities document (i.e., Capstone Requirements Document, Initial
Capabilities Document, Capability Development Document, or Capability Production
Document). These IA requirements may be specified as performance parameters with
both objective and threshold values.
All IA requirements, regardless of source, are compiled in the system's DoD Information
Assurance Certification and Accreditation Process (DIACAP) Implementation Plan
(similar to the system Requirements Traceability Matrix used in the DoD Information
Technology Security Certification and Accreditation Process, superseded by the
DIACAP). The DIACAP Implementation Plan documents all IA controls and
requirements assigned, whether implemented or "inherited," and for each displays the
implementation status, resources required, and the estimated completion date.
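The per-control bookkeeping described above can be pictured as a simple record. This is a hedged sketch of the kind of information tracked for each assigned IA control; the field names and values are hypothetical, and the actual DIACAP Implementation Plan format is governed by DoD Instruction 8510.01.

```python
# Illustrative sketch of the information a DIACAP Implementation Plan
# tracks for each assigned IA control, per the description above.
# Field names and example values are hypothetical; the authoritative
# format is defined by DoD Instruction 8510.01.
from dataclasses import dataclass

@dataclass
class IAControlEntry:
    control_id: str             # identifier of the assigned IA control
    inherited: bool             # implemented locally, or "inherited"
    implementation_status: str  # e.g., "planned", "in progress", "complete"
    resources_required: str     # resources needed to implement the control
    estimated_completion: str   # estimated completion date

entry = IAControlEntry(
    control_id="EXAMPLE-1",     # hypothetical identifier
    inherited=False,
    implementation_status="in progress",
    resources_required="2 FTE, accredited test environment",
    estimated_completion="2014-03-01",
)
print(entry.implementation_status)  # in progress
```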
The Department regularly publishes security configuration guidelines enabling IT
components to deliver the highest level of inherent security. These guidelines can be
obtained from the following sites: Security Technical Implementation Guides from the
Defense Information Systems Agency, and Security Configuration Guides from the
National Security Agency.
7.5.9.1. Development
The primary purpose of the Acquisition IA Strategy is to ensure compliance with the
statutory requirements of Title 40/Clinger-Cohen Act and related legislation, as
implemented by DoD Instruction 5000.02. As stated in Table 8, Enclosure 5, of that
instruction, the Acquisition IA Strategy provides documentation that "Ensure that the
program has an information assurance strategy that is consistent with DoD policies,
standards and architectures, to include relevant standards." The PM develops the
Acquisition IA Strategy to help the program office organize and coordinate its approach
to identifying and satisfying IA requirements consistent with DoD policies, standards,
and architectures.
The Acquisition IA Strategy is not intended to duplicate the detailed documentation generated from the DIACAP or other Certification and Accreditation (C&A) processes. Developed earlier in the acquisition life cycle and written at a higher level, it documents the program's overall IA requirements and approach, including the determination of the appropriate C&A process.
The Acquisition IA Strategy must be available for review at all Acquisition Milestone
Decisions, including early milestones when C&A documentation would not yet be
available.
The Acquisition IA Strategy lays the groundwork for a successful C&A process by
facilitating consensus among the PM, Component CIO, and DoD CIO on pivotal issues
such as Mission Assurance Category, Confidentiality Level, and applicable Baseline IA
Controls; selection of the appropriate C&A process; identification of the Designated
Accrediting Authority and Certification Authority; and documenting a rough timeline for
the C&A process.
7.5.9.2. Submission and Review
Click here for a sample Acquisition IA Strategy Template that can be tailored as
appropriate.
Acquisition IA Strategies must be submitted for approval and review in accordance with
Table 7.5.9.2.T1, which is based on submission requirements detailed in DoD
Instruction 5000.02, Enclosures 4 and 5. Sufficient time should be allowed for
Acquisition IA Strategy preparation or update, DoD Component CIO review and
approval, and DoD CIO review prior to applicable milestone decisions, program review
decisions, or contract awards.
    *Acquisition Category (ACAT) descriptions are provided in DoD Instruction 5000.02, Table 1.
Click here to view the Acquisition IA Strategy Development, Review and Approval
Process MS PowerPoint briefing that contains information on Acquisition IA Strategy
key success factors, key stakeholders, critical content criteria, and the review and
approval process.
In accordance with DoD Directive 8500.01E, all acquisitions of AISs (to include MAISs),
outsourced IT-based processes, and platforms or weapon systems with connections to
the GIG must be certified and accredited. The primary methodology for certifying and
accrediting DoD information systems is the DoD Information Assurance Certification
and Accreditation Process (DIACAP) of DoD Instruction 8510.01.
7.5.12. Acquisition of Information Technology (IT) Services Information Assurance (IA) Considerations for Source Selection Procedures
Every year the Department acquires a vast array of IT services from the commercial
sector, valued in the billions of dollars. These services support, impact, or utilize DoD
information systems and networks both on and off the GIG. Because of this broad
scope it is essential that IA be carefully considered as a factor in the planning,
procurement, and execution of these services.
Throughout this section, the services of an "IA professional" are recommended for the
development and review of IA elements within acquisition strategies, plans, and
procurement documentation. In selecting the appropriate IA professional support,
ensure that the individual's IA knowledge and experience are appropriate to the task.
Table 7.5.12.T1 suggests appropriate IA workforce categories and levels from the DoD
Manual 8570.01-M, "Information Assurance Workforce Improvement Program Manual,"
for commonly required tasks. See the manual for details of knowledge, experience, and
professional certifications required for each category and level.
    •    Network operations and Computer Security support
As with the acquisition strategy, the IA language in the RFP is driven by the characteristics of the IT services requirement. Regardless of the specifics of the acquisition, however, the goal of the RFP is to communicate clearly and unambiguously to potential offerors what the Government's IA requirements are and what is expected of them in terms of compliance and performance.
Click here for the Sample RFP IA Clause contract language that can be tailored as
appropriate, and included in Section H (Special Contract Requirements) of the
solicitation.
Contract Data Requirements List (CDRL). In this section, identify any IA-related data
products that the potential contractor must produce. This may include reports, IA
artifacts, or other IA documentation.
Section M: Evaluation Factors for Award. This section contains the evaluation factors
and significant subfactors by which offers will be evaluated and the relative importance
that the Government places on these evaluation factors and sub-factors. See section
7.5.12.3 for additional guidance.
7.5.12.3. Acquisition of Information Technology (IT) Services Information
Assurance (IA) Considerations for Source Selection Procedures
Section M of the Uniform Contract format contains the Evaluation Factors for Award.
This section contains the evaluation factors and significant sub-factors by which offers
will be evaluated and the relative importance that the Government places on these
evaluation factors and sub-factors. IA is just one of numerous factors that may be
assessed for the purposes of making a contract award decision. It may be a major
contributing factor in a best value determination, or it may be a minimum qualification for
an award based primarily on cost or price.
The extent to which IA considerations affect the award factors is a direct function of how clearly the potential loss or damage that an IA failure could inflict on a system, organization, or mission capability is communicated and understood. For this reason, an
IA professional should be tasked to assess the IA requirement and risks, and to advise
the contracting officer accordingly. As appropriate, an IA professional should develop IA
related evaluation factors, and participate in the negotiation of relative weightings of
these factors. Correspondingly, an IA professional should also be part of the source
selection evaluation board to ensure that the IA aspects of offerors' proposals are
assessed for technical and functional appropriateness, adequacy, and compliance with
requirements.
In many large IT services contracts, the initial contract award merely establishes the
scope of work, pricing, and other global factors, but no specific work is done until
separate task orders are established. For these indefinite delivery-indefinite quantity
(IDIQ) contracts, the IA considerations can vary widely from order to order. Additionally,
orders may be originated from activities separate from the activity that awarded the
basic IDIQ contract, even from other agencies. To ensure that IA is appropriately
considered in these individual and potentially unique orders, the "ordering guide" for the
contract should inform the ordering activities of their responsibilities with regards to IA.
Specifically, ordering/requiring activities are responsible for ensuring that any order placed for IT services will result in a commitment from the service provider to deliver services that comply with DoD IA policies. To do this, the ordering activity must be aware of what
general IA requirements are invoked in the basic contract, and then ensure that
individual orders provide specific details, and any supplemental IA requirements that
may be needed to achieve policy requirements. For example, the basic contract may
invoke DoD Instruction 8500.2 and require "implementation of appropriate baseline IA
controls", but the individual order would have to specify the Mission Assurance
Category (MAC) and Confidentiality Level relevant to that order.
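The division of responsibility above (general IA policy invoked in the basic IDIQ contract, order-specific details supplied by each task order) can be sketched as a simple completeness check an ordering activity might apply. This is an illustrative sketch only; the field names are hypothetical.

```python
# Illustrative sketch: a check an ordering activity might apply before
# placing a task order under an IDIQ contract, per the guidance above.
# The basic contract invokes general IA policy (e.g., DoD Instruction
# 8500.2 baseline IA controls); each individual order must supply the
# order-specific details. Field names here are hypothetical.

def order_ia_details_complete(order: dict) -> bool:
    """True if the order supplies the order-specific IA details."""
    required = ("mission_assurance_category", "confidentiality_level")
    return all(order.get(field) for field in required)

# An order that specifies both MAC and Confidentiality Level passes;
# an order missing either detail does not.
order = {"mission_assurance_category": "MAC 3",
         "confidentiality_level": "Sensitive"}
print(order_ia_details_complete(order))  # True
```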
Finally, since IT services acquisitions must comply with the Title 40/Clinger-Cohen Act
which requires a level of assurance that IA compliance is being achieved, it may be
appropriate to direct that a hierarchy of IA review and approvals be established based
on factors such as dollar value of the individual orders. This will ensure that qualifying
orders are reviewed at an oversight level commensurate with their value.
Click here for the Sample IA Section of an Ordering Guide. The specific form, structure, and content should be driven by the needs of the acquisition; the example is provided merely as a point of departure and may not be appropriate for a specific acquisition.
    •    For acquisitions of IT Services estimated at greater than $250M (basic plus all
         options)
            o DoD Component IA Review of Acquisition Strategy/Acquisition
                Plan/Request for Proposal (RFP)
    •    For acquisitions of IT Services estimated at greater than $500M (basic plus all
         options)
            o DoD Component IA Review of Acquisition Strategy/Acquisition Plan/RFP,
                and
            o DoD CIO IA Review of Acquisition Strategy/Acquisition Plan/RFP, and
            o Notification of the cognizant Mission Area Portfolio Manager by the
                DoD CIO prior to RFP release.
For acquisitions of IT services below the $250M threshold, follow Component guidance.
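The review hierarchy above is keyed on the estimated value of the acquisition (basic contract plus all options). The sketch below summarizes that mapping; it is illustrative only, and the thresholds are those stated in the text.

```python
# Illustrative sketch of the IA review hierarchy described above,
# keyed on the estimated value of the IT services acquisition
# (basic contract plus all options). Not an official tool.

def required_ia_reviews(estimated_value_dollars: float) -> list[str]:
    """Return the IA reviews required for a given estimated value."""
    reviews = []
    if estimated_value_dollars > 500e6:
        # > $500M: Component review, DoD CIO review, and portfolio
        # manager notification before RFP release.
        reviews += [
            "DoD Component IA review of Acquisition Strategy/Plan/RFP",
            "DoD CIO IA review of Acquisition Strategy/Plan/RFP",
            "Notify cognizant Mission Area Portfolio Manager before RFP release",
        ]
    elif estimated_value_dollars > 250e6:
        # > $250M: Component-level review only.
        reviews.append("DoD Component IA review of Acquisition Strategy/Plan/RFP")
    else:
        # Below $250M: follow DoD Component guidance.
        reviews.append("Follow DoD Component guidance")
    return reviews

print(len(required_ia_reviews(600e6)))  # 3
```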
For acquisitions of IT services related to telecommunications or transport infrastructure, a review for IA technical sufficiency by a Defense IA/Security Accreditation Working Group (DSAWG) representative is recommended.
For IA-related definitions, refer to Enclosure 2 of DoD Directive 8500.01E and Enclosure 2 of DoD Instruction 8500.2. All other terms are defined in CNSSI 4009.
7.6.1. EM Spectrum Considerations
In accordance with DoD Instruction 5000.02, Enclosure 12, paragraph 11, the Program Manager (PM) must consider the use of the EM spectrum when delivering capability to the warfighter or business domains.
Ensuring the compatible operation of DoD systems in peace and in times of conflict is
becoming increasingly complex and difficult. DoD's demand for spectrum access is
increasing as more systems become net-centric and information is pushed to the
"tactical edge". In addition, the EM environment in which the DoD operates around the
globe is becoming more congested as consumer applications that require spectrum are
introduced and take hold. System developers can no longer assume their systems will
be operating in an interference-free frequency band or that a single band will work
around the world. Given these circumstances, DoD Instruction 4650.01 states the
following as one of spectrum management's core principles: "Pursue spectrum-efficient
technologies to support the increasing warfighter demand for spectrum access and
encourage development of spectrum-dependent systems that can operate in diverse
EM environments."
National and DoD policies and procedures for the management and use of the EM spectrum direct PMs developing spectrum-dependent systems/equipment to consider EM spectrum requirements and Electromagnetic Environmental Effects (E3) control
early in the development process. Given the complex environment (both physical and
political) in which DoD forces operate, and the potential for worldwide use of capabilities
procured for DoD, early and thorough consideration is vitally important. These policies
and procedures are intended to ensure the following:
    •    Authorization will be obtained from sovereign nations (including the United
         States) to use the equipment within their respective borders and near the
         geographic borders of other countries (within coordination zones); and
    •    Sufficient spectrum will be available in the operational environment during the
         system/equipment's life cycle.
Because obtaining spectrum supportability requires coordination at the national and international levels, getting spectrum advice early helps a PM identify and mitigate spectrum-related risks and successfully deliver capabilities that can be employed in their intended operational environment.
E3 control is concerned with proper design and engineering to minimize the impact of
the EM environment on equipment, systems, and platforms. E3 control applies to the EM spectrum interactions of both spectrum-dependent and non-spectrum-dependent objects within the operational environment. Examples of non-spectrum-dependent objects that could be affected by the EM environment include all other
electrical/electronic systems, ordnance, personnel, and fuels. The increased
dependency on, and competition for, portions of the EM Spectrum have increased the
likelihood of adverse interactions among sensors, networks, communications, weapons
systems, fuels, personnel, and ordnance.
DoD has established procedures, described below, to identify and mitigate spectrum-
related risks and to control the E3 impacts on the equipment, systems, and platforms
used by our military forces. Spectrum requirements shall be addressed early in
acquisition programs ( DoD Instruction 4650.01). In accordance with DoD Directive
3222.3, "DoD Electromagnetic Environmental Effects (E3) Program," proper design and
engineering techniques to control E3 shall be considered throughout the acquisition
process to ensure the successful delivery of operational capabilities to the warfighter.
7.6.2.1. DoD Instruction 5000.02, "Operation of the Defense Acquisition System"
    •    For all EM spectrum-dependent systems, PMs must comply with U.S. and host
         nation spectrum regulations. They shall submit written determinations to the DoD
         Component Chief Information Officer (CIO) or equivalent that the EM spectrum
         necessary to support the operation of the system during its expected life cycle
         is, or will be, available. These determinations shall be the basis for
         recommendations provided to the Milestone Decision Authority (MDA) at the
         milestones defined in Table 3 in Enclosure 4 of DoD Instruction 5000.02.
    •    Tables 2-1 and 2-2 in Enclosure 4 state the statutory requirement for all
         developers of systems/equipment that use the EM spectrum in the U.S. and its
         possessions to submit a DD Form 1494, "Application for Equipment Frequency
         Allocation," and obtain Certification of Spectrum Support from the National
         Telecommunications and Information Administration (NTIA).
7.6.2.2. Title 47, Code of Federal Regulations (CFR), Chapter III, Part 300.1
7.6.2.3. Office of Management and Budget (OMB) Circular A-11, Section 31.12
7.6.2.4. DoD Instruction 4650.01, "Policy and Procedures for the Management and
Use of the Electromagnetic Spectrum"
This instruction establishes policy and procedures for the management and use of the EM spectrum and for the supportability of DoD spectrum-dependent systems in the EM spectrum.
In addition, DoD Instruction 4650.01 states that spectrum policy and spectrum
management functions shall be guided by the following core principles:
    •    Ensure the U.S. warfighter has sufficient EM spectrum access to support military
         capabilities.
    •    Support a U.S. EM spectrum policy that balances national and economic
         security, with national security as the first priority.
    •    Use the EM spectrum as efficiently and effectively as practical to provide the
         greatest overall benefit to warfighting capability.
    •    Pursue spectrum-efficient technologies to support the increasing warfighter
         demand for EM spectrum access.
    •    Encourage development of spectrum-dependent systems that can operate in
         diverse EM environments.
    •    Actively support U.S. policies and interests in international EM spectrum bodies
         and in international negotiations for spectrum allocation and access.
7.6.2.5. DoD Directive 3222.3, "DoD Electromagnetic Environmental Effects (E3) Program"
This directive establishes policy and responsibilities for the management and implementation of the DoD E3 Program. The program facilitates mutual EM compatibility and effective E3 control among land, air, sea, and space-based electronic and electrical systems, subsystems, and equipment, and the existing natural and man-made environments.
It states DoD policy that all electrical and electronic systems, subsystems, and
equipment, including ordnance containing electrically initiated devices, shall be mutually
compatible in their intended EM environment without causing or suffering unacceptable
mission degradation due to E3.
PMs shall take the following actions to mitigate spectrum-related risks for spectrum-dependent equipment and to minimize E3 impacts on all military forces, equipment, systems, and platforms (both non-spectrum-dependent and spectrum-dependent). Consideration of these critical elements throughout the acquisition process will help to ensure successful delivery of capability to the warfighter.
The PM shall include the funding to cover Spectrum Supportability Risk Assessments
(SSRAs), required certification processes, and control of E3 as part of the overall
program budget. Section 7.6.4.1 addresses SSRAs; Section 7.6.4.4 addresses E3.
7.6.3.1. Before Milestone A
    •    Perform an initial regulatory SSRA to identify and refine spectrum issues. See
         Section 7.6.4.1 for details.
    •    For systems that will be operated in the U.S. and its possessions, complete a
         Stage 1 (Conceptual) Certification of Spectrum Support through the National
         Telecommunications and Information Administration (NTIA). Contact your
         sponsoring military department frequency management office (MILDEP FMO) for
         details on the process. The process can take several months, so start as early as
         practical. See Section 7.6.4.2 for details.
7.6.3.2. Before Milestone B (or before the first Milestone that authorizes contract
award)
    •    Update the spectrum supportability and E3 control requirements and ensure they
         are addressed in the Capability Development Document.
    •    Perform initial technical and initial operational SSRAs to identify spectrum issues.
         See Section 7.6.4.1 for details.
    •    For systems that will be operated in the U.S. and its possessions, complete a
         Stage 2 (Experimental) Certification of Spectrum Support through the NTIA.
         Contact your sponsoring MILDEP FMO for details on the process. The process
         can take several months so start as early as practical. See Section 7.6.4.2 for
         details.
    •    For systems that will be operated outside the U.S. and its possessions, initial
         discussions with host nations should be conducted to determine if there may be
         significant obstacles to obtaining authorization to operate. MILDEP frequency
         managers in conjunction with the Joint Staff will assist the PM in initiating
         discussions with regional combatant command frequency management offices.
         Discussion should concentrate on host nations where the systems will be
         permanently deployed.
    •    Obtain applicable U.S. and/or host nation authorizations before testing spectrum-
         dependent systems or components.
    •    Provide initial technical performance data to Defense Information Systems
         Agency (DISA) via supporting MILDEP FMOs.
    •    Discuss spectrum and E3 control requirements and any associated issues in the
         initial ISP.
    •    Define, in the TEMP, those spectrum-related and E3 control requirements that
         must be tested during Developmental Test and Evaluation and Operational Test
         and Evaluation. TEMPs shall include, within the scope of critical operational
         issues and sub-issues, the requirement to demonstrate the effective E3 control of
         systems, subsystems, and equipment.
    •    Address SSRA, certification of spectrum support, and E3 control requirements in
         the Government's Statement of Work, Performance Specifications, and contract
         data requirements to be provided to the contractor.
7.6.3.3. Before Milestone C
    •    Update the spectrum and E3 control requirements and ensure they are
         addressed in the Capability Production Document.
    •    Perform a detailed regulatory and a detailed technical SSRA to ensure all issues
         have been identified and are being mitigated. See Section 7.6.4.1 for details.
    •    For systems that will be operated in the U.S. and its possessions, complete a
         Stage 3 (Developmental) Certification of Spectrum Support through the NTIA.
         Contact your sponsoring MILDEP FMO for details on the process. The process
         can take several months so start as early as practical. See Section 7.6.4.2 for
         details.
    •    For systems that will be operated overseas, more detailed discussions with host
         nations may be required to resolve any significant obstacles to obtaining
         authorization to operate. MILDEP frequency managers in conjunction with the
         Joint Staff will assist the PM in initiating discussions with regional combatant
         command frequency management offices. Discussion should concentrate on
         host nations where the systems will be permanently deployed.
    •    Obtain applicable U.S. and/or host nation authorizations before testing spectrum-
         dependent systems or components.
    •    Provide updated technical performance data to DISA via supporting MILDEP
         FMOs.
    •    Refine the discussion of spectrum and E3 control requirements and any
         associated issues in the ISP for record.
    •    Refine discussion of spectrum-related and E3 control requirements to be tested
         in the revised TEMP.
    •    Address SSRA, certification of spectrum support, and E3 control requirements in
         the Government's Statement of Work, Performance Specifications, and contract
         data requirements to be provided to the contractor.
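Read together, the milestone steps above pair each milestone with an NTIA Certification of Spectrum Support stage. The sketch below summarizes that pairing for systems operated in the U.S. and its possessions; it assumes the third block of steps (which references the Capability Production Document) corresponds to Milestone C, and is illustrative only.

```python
# Summary sketch of the NTIA Certification of Spectrum Support stages
# tied to acquisition milestones, as described in the steps above, for
# systems operated in the U.S. and its possessions. The Milestone C
# pairing is inferred from the text's reference to the Capability
# Production Document.
NTIA_STAGE_BY_MILESTONE = {
    "Milestone A": "Stage 1 (Conceptual)",
    "Milestone B": "Stage 2 (Experimental)",
    "Milestone C": "Stage 3 (Developmental)",
}

def ntia_stage(milestone: str) -> str:
    """Return the certification stage to complete before the milestone."""
    return NTIA_STAGE_BY_MILESTONE[milestone]

print(ntia_stage("Milestone B"))  # Stage 2 (Experimental)
```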
The JCIDS Manual and CJCS Instruction 6212.01 reference other spectrum-related
policies and restate some of the requirements. However, CJCSI 6212.01 was published
prior to implementation of DoD Instruction 4650.01 and needs revision. In cases of
conflicting policy, DoD Instruction 4650.01 takes precedence for spectrum-related
requirements.
CJCSI 6212.01 includes spectrum and E3 requirements in the NR KPP under the
heading of Supportability Requirements.
Per CJCSI 6212.01, the Joint Staff will use the following assessment criteria when
reviewing documents for interoperability:
    •    If applicable, does the document address host nation approval?
    •    If applicable, has a DD Form 1494 been submitted to the military department
         Frequency Management Office?
    •    Does the document include a spectrum supportability compliance statement or
         outline a plan to obtain spectrum supportability?
    •    Does the document address spectrum supportability as a separate requirement
         in a paragraph?
Sample Language. Spectrum. The XXX System will comply with the applicable DoD, National, and International spectrum management policies and regulations. Required performance data will be submitted to the supporting MILDEP Frequency Management Office. (Threshold)
DoD Instruction 4630.8 references other spectrum-related policies and restates some of
the requirements. However, it was published prior to implementation of DoD Instruction
4650.01 and it needs revision. In cases of conflicting policy, DoD Instruction 4650.01
takes precedence for spectrum-related requirements.
According to DoD Instruction 4630.8, the ISP must "discuss RF spectrum needs" in
Chapter 2 (see details in Section 7.3.6.7.2). Spectrum-related and E3 control issues
shall be described in the ISP Chapter 3 (see details in Section 7.3.6.7.3).
Within the TEMP, the critical operational issues for suitability or survivability are usually
appropriate to address spectrum supportability and E3 control requirements. The overall
goals of the test program with respect to spectrum supportability and E3 control
requirements are to ensure that appropriate evaluations are conducted during
developmental test and evaluation, and that appropriate assessments are performed
during operational test and evaluation. See Section 9.5.5 for details.
Sample Language. The following are four examples of critical operational issue statements in the TEMP:
Military Standards MIL-STD-461 and MIL-STD-464 and Military Handbook MIL-HDBK-237 provide crucial guidance that, if followed, should preclude E3 problems with the critical systems provided to the warfighter. (Note: MIL-HDBK-237D does not reflect the new requirements in DoD Instruction 4650.01, published in January 2009, and needs to be revised. DoD Instruction 4650.01 takes precedence.)
Electromagnetic Interference (EMI) Control. The equipment shall comply with the
applicable requirements of MIL-STD-461.
As an alternative, the program manager can tailor E3 control requirements from MIL-STD-461 or MIL-STD-464. Both MIL-STD-461 and MIL-STD-464 are interface
standards. See Section 9.5.2 for testing standards and guidance from Director,
Operational Test & Evaluation and from Development Test and Evaluation. See the
DoD ASSIST homepage for additional information on Military specs and standards.
The contractor shall design, develop, integrate, and qualify the system such that it
meets its Operational Performance Requirements and the applicable spectrum
supportability and E3 control requirements in the system specification. The contractor
shall perform analyses, studies, and testing to ensure the system is designed to comply
with the applicable DoD, National, and International spectrum management and E3
control policies and regulations. The contractor shall perform inspections, analyses, and
tests, as necessary, to verify that the system complies with the applicable DoD,
National, and International spectrum management and E3 control policies and
regulations. The contractor shall prepare and update spectrum-dependent system
technical performance data throughout the development of the system and shall
perform sufficient analysis and testing to characterize the equipment, where necessary.
The contractor shall establish and support a spectrum and E3 control requirements Working-level Integrated Product Team (WIPT) to accomplish these tasks.
The following are examples of data item requirements typically called out for spectrum
supportability and E3 control requirements in the CDRL:
    •    DI-EMCS-81542B E3 Verification Report
Spectrum-dependent system developers shall assess the risk for harmful interference
with other spectrum-dependent systems and/or harmful radiation-related effects. At a
minimum, electromagnetic interference (EMI) and electromagnetic compatibility (EMC)
assessments shall be made.
Spectrum-dependent system developers shall manage spectrum supportability risks
with other developmental risks through systems engineering processes.
Complex "family of systems" or "system-of-systems" programs may require more than one SSRA.
Regulatory

Initial Regulatory Spectrum Supportability Risk Assessment (SSRA) Tasks:
    •    Determine countries for likely operational deployment within each Combatant Commander area of responsibility.
    •    Determine the internationally recognized radio service of all spectrum-dependent sub-systems.
    •    Identify portions of the system's tuning range supported by each host nation's (HN's) table of frequency allocation.
    •    Determine the relative regulatory status, for example, co-primary or secondary, assigned to the radio service by the HN's table of frequency allocations.
    •    Obtain international comments on U.S. military systems of the same radio service and with similar technical characteristics submitted for HN spectrum certification (available via the DoD Host-Nation Spectrum Worldwide Database Online).
    •    Identify other U.S. military, U.S. civil, and non-U.S. co-band, adjacent-band, and harmonically related systems likely to be co-site or in close proximity by querying DoD system databases or the appropriate National Telecommunications and Information Administration (NTIA) database.
    •    Identify risks and develop recommendations for mitigation of regulatory issues.
Detailed Regulatory SSRA Tasks:
    •    Address Military Communications-Electronics Board (MCEB), NTIA, and other guidance resulting from the certification of spectrum support process.
    •    Consult with the DoD Component spectrum management office regarding changes to U.S. Federal or civil telecommunication regulations impacting the system's frequency bands.
    •    Determine if the system meets appropriate military, U.S. national, and international spectrum standards for radiated bandwidth and transmitter characteristics.
    •    Quantify the impacts of any changes to U.S. Government or international spectrum regulations or technical sharing criteria.
    •    Identify risks and develop recommendations for mitigation of regulatory issues.
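The radiated-bandwidth check in the tasks above can be illustrated with a short sketch. The snippet below estimates the band containing 99% of the emitted power from a measured power spectral density, in the spirit of NTIA-style occupied-bandwidth criteria; the function name and the equal-tail bin-edge implementation are illustrative assumptions, not reference code from the Guidebook or from any standard.

```python
import numpy as np

def occupied_bandwidth(freqs_hz, psd_linear, fraction=0.99):
    """Return (f_low, f_high) bounding the band that contains `fraction`
    of the total power, excluding equal power tails on each side.
    `psd_linear` must be in linear (not dB) units, one value per bin."""
    power = np.cumsum(psd_linear)          # running total of power per bin
    total = power[-1]
    tail = (1.0 - fraction) / 2.0 * total  # power allowed in each tail
    lo = np.searchsorted(power, tail)          # first bin past the lower tail
    hi = np.searchsorted(power, total - tail)  # first bin reaching the upper bound
    return freqs_hz[lo], freqs_hz[hi]

# Illustrative spectrum: power concentrated in the center bins
freqs = np.arange(7.0)                       # bin center frequencies (Hz)
psd = np.array([1.0, 2.0, 10.0, 40.0, 10.0, 2.0, 1.0])
f_lo, f_hi = occupied_bandwidth(freqs, psd)
```

A real compliance determination would of course be made against the emission mask and measurement procedure of the governing standard, not this sketch.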
Technical
Initial Technical SSRA Tasks:
    •    Determine candidate technologies and their technical parameters:
         o    Application: fixed, transportable, mobile
         o    Host platform (dismounted soldier, airborne, tactical operations center, etc.)
         o    Frequency range of operation
         o    Required data throughput
         o    Receiver selectivity
         o    Receiver criteria required for desired operation
         o    Required radiated bandwidth
         o    Transmitter power output
         o    Antenna performance characteristics
         o    Anticipated HNs for deployment
    •    Perform an initial electromagnetic compatibility (EMC) analysis to identify electromagnetic interactions that require further study. The analysis should use, as a minimum, technical parameters for the candidate system and the technical parameters of spectrum-dependent systems expected to be in the candidate's operational environment.
    •    Evaluate the initial system parameters with respect to U.S. and appropriate international spectrum standards; develop plans to address non-compliant systems.
    •    Identify risks and develop recommendations for mitigation of technical issues.
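As a minimal sketch of the initial EMC screening described above, the fragment below computes the interference power a victim receiver would see from a candidate transmitter under a free-space path-loss assumption and flags the interaction for further study when an acceptable interference threshold is exceeded. All parameter values, function names, and the -110 dBm threshold are illustrative assumptions, not Guidebook requirements.

```python
import math

def free_space_path_loss_db(freq_mhz, dist_km):
    """Free-space path loss: 32.44 + 20*log10(f_MHz) + 20*log10(d_km)."""
    return 32.44 + 20 * math.log10(freq_mhz) + 20 * math.log10(dist_km)

def needs_further_study(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                        freq_mhz, dist_km, interference_threshold_dbm):
    """Flag the interaction if the interference power coupled into the
    victim receiver exceeds its acceptable interference threshold."""
    received_dbm = (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
                    - free_space_path_loss_db(freq_mhz, dist_km))
    return received_dbm > interference_threshold_dbm

# Illustrative screen: a 10 W (40 dBm) co-band emitter 1 km away at 300 MHz
flag = needs_further_study(40.0, 0.0, 0.0, 300.0, 1.0, -110.0)
```

Because free space is the minimum-loss case, a "no flag" result is conservative; a flag simply means the interaction warrants the more detailed analysis with measured equipment data and terrain-aware propagation models that the later SSRA tasks call for.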
This document is an accurate representation of the content posted on the DAG website for this Chapter, as of the date of
                                                                                                                      677
production listed on the cover. Please refer to the DAG website for the most up to date guidance at https://dag.dau.mil
Detailed Technical SSRA Tasks:
    •    Evaluate system performance and effects on other co-frequency or adjacent-frequency spectrum-dependent systems expected to be found in the intended operational environment.
    •    Determine the acceptable received interference level between the system being analyzed and other spectrum-dependent systems to ensure neither is significantly degraded and that coexistence is feasible.
    •    Use measured performance of the system's receiver, transmitter, and antenna, and appropriate propagation models, whenever feasible.
    •    Use propagation models developed specifically for mobile communications systems to determine potential link degradation and blockage due to atmospheric conditions or terrain and building obstructions within intended deployment areas.
    •    … international spectrum standards.
    •    Generate technical recommendations regarding mitigating potential interference by implementing channelization plans and advanced narrow-beam antennas (active, spot, and contoured-beam, etc.), as well as use of passive radio frequency components (filters, diplexers, couplers, etc.).
    •    Identify risks and develop recommendations for mitigation of technical issues.
Operational

Initial Operational SSRA Tasks:
    •    Determine the expected complement of spectrum-dependent systems anticipated to be in the system's operating environment. The system should operate without experiencing or causing interference as part of the DoD response to conventional and non-conventional (disaster relief) missions.
    •    Perform a more extensive EMC analysis quantifying the potential interference between the candidate system and the spectrum-dependent systems used by other DoD units in the operational environment. Express the results in operational terms, e.g., the frequency-distance separation requirements between a transmitter and a receiver that must be maintained to achieve compatibility.
    •    Identify risks and develop recommendations for mitigation of technical issues.
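The frequency-distance separation mentioned above can be sketched by inverting the same free-space link budget: given the victim receiver's acceptable interference level, solve for the distance at which the coupled power drops to that level. The names and values below are illustrative assumptions; an actual SSRA would use measured antenna patterns and terrain-aware propagation models rather than free space.

```python
import math

def required_separation_km(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                           freq_mhz, threshold_dbm):
    """Distance (km) at which free-space loss reduces the coupled power to
    the acceptable interference threshold, from
    d = 10 ** ((P_tx + G_tx + G_rx - threshold - 32.44 - 20*log10(f_MHz)) / 20)"""
    excess_db = (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
                 - threshold_dbm - 32.44 - 20 * math.log10(freq_mhz))
    return 10 ** (excess_db / 20)

# Illustrative: 10 W (40 dBm) emitter at 300 MHz against a -110 dBm threshold
d_km = required_separation_km(40.0, 0.0, 0.0, 300.0, -110.0)
```

Since free space is the minimum-loss propagation case, the resulting distance is a worst-case upper bound; terrain masking and clutter typically reduce the separation actually required for compatibility.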
Detailed Operational SSRA Tasks:
    •    Refine the expected complement of spectrum-dependent systems anticipated to be in the system's operating environments.
7.6.4.2. U.S. Government (USG) and Host Nation (HN) Certification of Spectrum
Support
PMs shall request certification of spectrum support via the appropriate Service
Frequency Management Office using procedures in Chapter 10 of NTIA "Manual of
Regulations and Procedures for Federal Radio Frequency Management."
Additionally, as required by OMB Circular A-11, Section 31.12 (see section 7.6.2.3), this
certification must be completed prior to submission of cost estimates for development or
procurement of major spectrum-dependent systems and for all space and satellite
systems.
Additional coordination is required for satellite systems per NTIA "Manual of Regulations
and Procedures for Federal Radio Frequency Management". Information required for
requesting either an exemption from the International Telecommunication Union
registration or advanced publication, coordination, and notification of a particular space
system must be submitted to the NTIA.
agreements with HNs and by the Military Communications-Electronics Board.
Requirements for certification vary by HN. PMs should contact their appropriate Service
Frequency Management Office for details on process and procedures.
In most cases, operational frequency assignments are requested and received as a program is being fielded. Historically, when the PM has implemented the guidance received in response to requests for certification of spectrum support and has designed the system as described in the performance data provided, system operators have not encountered problems in obtaining operational frequency assignments.
It is critical that all electrical and electronic equipment be designed to be fully compatible with the intended operational EM environment. The DoD has experienced problems with spectrum-dependent items developed without adequately addressing E3, resulting in poor performance, disrupted communications, reduced radar range, and loss of control of guided weapons. Failure to consider E3 can result in mission failure, damage to high-value assets, and loss of human life. Compounding the problem, there is increased
competition for the use of the spectrum by DoD, non-DoD Government, and civilian
sector users; and many portions of the EM spectrum are already congested with
spectrum-dependent systems. Additionally, new spectrum-dependent platforms/systems
and subsystems/equipment are technologically complex, highly sensitive, and often
operate at higher power levels. All of these factors underscore the importance of
addressing E3 control requirements early in the acquisition process.
7.6.5. Definitions
Electromagnetic Compatibility (EMC). Defined in Joint Publication 1-02 as: The ability
of systems, equipment, and devices that use the EM spectrum to operate in their
intended operational environments without causing or suffering unacceptable or
unintentional degradation because of EM radiation or response. It involves the
application of sound EM spectrum management; system, equipment, and device design
configuration that ensures interference-free operation; and clear concepts and doctrines
that maximize operational effectiveness.
Electromagnetic Environmental Effects (E3). Defined in Joint Publication 1-02 as:
The impact of the electromagnetic environment upon the operational capability of
military forces, equipment, systems, and platforms. It encompasses all electromagnetic
disciplines, including electromagnetic compatibility and electromagnetic interference;
electromagnetic vulnerability; electromagnetic pulse; electronic protection; hazards of
electromagnetic radiation to personnel, ordnance, and volatile materials; and natural
phenomena effects of lightning and precipitation static.
Electromagnetic (EM) Spectrum. Defined in Joint Publication 1-02 as: The range of
frequencies of EM radiation from zero to infinity. It is divided into 26 alphabetically
designated bands. The terms "electromagnetic spectrum" and "spectrum" shall be
synonymous.
Host Nations (HNs). Defined in Joint Publication 1-02 as: A nation which receives the
forces and/or supplies of allied nations and/or NATO organizations to be located on, to
operate in, or to transit through its territory.
Net-Centric. Defined in DoDI 8320.02 as: Relating to, or representing the attributes of
net-centricity. Net-centricity is a robust, globally interconnected network environment
(including infrastructure, systems, processes and people) in which data is shared timely
and seamlessly among users, applications and platforms. Net-centricity enables
substantially improved military situational awareness and significantly shortened
decision making cycles. Net-Centric capabilities enable network-centric operations and
network-centric warfare (NCW).
7.7. Accessibility of Electronic and Information Technology
In accordance with Section 508 of Public Law 105-220, "The Workforce Investment Act of 1998," now codified at Title 29 U.S.C. Sec. 794d: when developing, procuring, maintaining, or using electronic and information technology, each Federal department or agency shall ensure, unless an undue burden would be imposed, that the electronic and information technology (E&IT) allows, regardless of the type of medium of the technology--
(i) individuals with disabilities who are Federal employees to have access to and use of information and data that is comparable to the access to and use of the information and data by Federal employees who are not individuals with disabilities; and
(ii) individuals with disabilities who are members of the public seeking information or services from a Federal department or agency to have access to and use of information and data that is comparable to the access to and use of the information and data by such members of the public who are not individuals with disabilities (http://www.justice.gov/crt/508/508law.php).
However, in accordance with DoDM 8400.01-M, "Procedures for Ensuring the Accessibility of Electronic and Information Technology (E&IT) Procured by DoD Organizations," Section 508 does NOT apply to the following:
(1) Any E&IT operated by agencies, the function, operation, or use of which involves
intelligence activities, cryptologic activities related to national security, command and
control of military forces, equipment that is an integral part of a weapon or weapons
system, or systems critical to the direct fulfillment of military or intelligence missions.
Systems that are critical to the direct fulfillment of military or intelligence missions do not
include systems used for routine administrative and business applications (including
payroll, finance, logistics, and personnel management applications).
(3) E&IT located in spaces frequented only by service personnel for maintenance,
repair, or occasional monitoring of equipment.
The law further directs the United States Access Board to develop standards to support Section 508. For Requiring Officials who will be acquiring E&IT, the General Services Administration provides a website for assistance in developing Section 508 language in DoD contracts. Included at this site is the Voluntary Product Accessibility Template, or VPAT, which helps PMs identify products and vendors that comply with this Federal
law. Additional guidance on the DoD Section 508 program is provided in the DoD Section 508 Manual (DoDM 8400.01-M), dated June 3, 2011.
7.8. The Clinger-Cohen Act (CCA) -- Subtitle III of Title 40 United States Code
(U.S.C.)
7.8.1. Overview
Subtitle III of Title 40 of the United States Code (formerly Division E of the Clinger-Cohen Act (CCA), hereinafter referred to as "Title 40/CCA") applies to all Information Technology (IT) investments, including National Security Systems (NSS). (Note: Throughout the remainder of this subchapter 7.8, the term IT is presumed to mean IT, including NSS.) Title 40/CCA requires Federal agencies to focus more on the results achieved through their IT investments, while streamlining the Federal IT procurement process. Specifically, the Act introduces much more rigor and structure into how agencies approach the selection and management of IT projects.
Title 40/CCA generated a number of significant changes in the roles and responsibilities
of various Federal agencies in managing the acquisition of IT. It elevated oversight
responsibility to the Director of the Office of Management and Budget (OMB) and
established and gave oversight responsibilities to the departmental Chief Information
Officer (CIO). Also, under this Act, the head of each agency is required to implement a
process for maximizing the value and assessing and managing the risks of the agency's
IT acquisitions.
In DoD, the DoD CIO has the primary responsibility of providing management and
oversight of all Department IT to ensure the Department's IT systems are interoperable,
secure, properly justified, and contribute to mission goals.
The basic requirements of Title 40/CCA relating to DoD's acquisition process have been institutionalized in DoD Instruction 5000.02, "Operation of the Defense Acquisition System;" in particular, Enclosure 5, IT Considerations. The requirements delineated in
the Title 40/CCA Compliance Table at Enclosure 5 of DoD Instruction 5000.02 must
also be considered and applied to all IT investments, regardless of acquisition category,
and tailored commensurate to size, complexity, scope, and risk levels. Table 7.8.1.T1
depicts a summary of Title 40/CCA obligations and authorities.
      Table 7.8.1.T1. Summary of Clinger-Cohen Act Compliance Confirmations*
software, firmware and similar procedures, services (including support services), and
related resources; but
(C) does not include any equipment acquired by a federal contractor incidental to a
federal contract.
(1) National security system.- In this section, the term national security system means
a telecommunications or information system operated by the Federal Government, the
function, operation, or use of which-
(E) subject to paragraph (2), is critical to the direct fulfillment of military or intelligence
missions.
(2) Limitation.- Paragraph (1)(E) does not include a system to be used for routine
administrative and business applications (including payroll, finance, logistics, and
personnel management applications).
The Title 40/CCA Compliance Table, Table 7.8.4.T1, in Section 7.8.4 below, details
actions required to comply with Title 40/CCA regulatory requirements, mandatory DoD
policy, and the applicable program documentation that can be used to fulfill the
requirement. This table emulates the DoD Instruction 5000.02 Title 40/CCA Compliance
Table, Table 8, with the addition of columns relating the requirement to applicable
Milestones and regulatory guidance.
The requirements in this table must be satisfied before Milestone approval of any
Acquisition Category (ACAT) I (i.e., Major Defense Acquisition Program (MDAP)) and
ACAT IA (i.e., MAIS Program) and prior to the award of any contract for the acquisition
of a Mission-Critical or Mission-Essential IT system, at any level.
TAKE NOTE: The requirements delineated in this table must also be considered and
applied to all IT investments, regardless of acquisition category, and tailored
commensurate to size, complexity, scope, and risk levels.
Table 7.8.4.T1 is a Title 40/CCA compliance table that includes hyperlinks relative to
each compliance area. A brief discussion of each compliance area and hyperlinks to
additional pertinent information follow the table. For comprehensive coverage of the
Title 40/CCA, including policy documents, best practices, examples, and lessons
learned, refer to the CCA Community of Practice website.
Actions Required to Comply With Title 40 U.S.C. Subtitle III | Applicable Program Documentation (1) | Applicable Milestone | Regulatory Requirement
1. Make a determination that the acquisition supports core, priority functions of the Department. (2) | ICD Approval | Milestone A | CJCSI 3170.01
2. Establish outcome-based performance measures linked to strategic goals. (2) | ICD, CDD, CPD and APB approval | Milestone A, B & C | CJCSI 3170.01; DoDI 5000.02
3. Redesign the processes that the system supports to reduce costs, improve effectiveness and maximize the use of COTS technology. (2) | Approval of the ICD, Concept of Operations, AoA, CDD, and CPD | Milestone A, B & C | CJCSI 3170.01; DoDI 5000.02
4. Determine that no Private Sector or Government source can better support the function. (3) | Technology Development Strategy; Acquisition Strategy page XX, para XX; AoA page XX | Milestone A (Technology Development Strategy); Milestone B (Acquisition Strategy) | CJCSI 3170.01; DoDI 5000.02
This document is an accurate representation of the content posted on the DAG website for this Chapter, as of the date of
                                                                                                                      688
production listed on the cover. Please refer to the DAG website for the most up to date guidance at https://dag.dau.mil
                                                                      For MAIS:
                                                                      Milestone A & B,
                                                                      & FRPDR (or
                                                                      their equivalent)
5. Conduct an analysis
                       AoA                                                                       DoDI 5000.02
of alternatives. 3
                                                                      For non-MAIS:
                                                                      Milestone B or
                                                                      the first Milestone
                                                                      that authorizes
                                                                      contract award
6. Conduct an
economic analysis that
includes a calculation of Program LCCE
                                                                                                 CJCSI 3170.01
the return on
                                                  Milestone A & B
investment; or for non- Program Economic
                                                                                                 DoDI 5000.02
AIS programs, conduct Analysis for MAIS
a Life-cycle Cost
Estimate (LCCE). 3
7. Develop clearly        Acquisition Strategy
established measures page XX
                                                  Milestone B                                    DoDI 5000.02
and accountability for
program progress          APB
8. Ensure that the
acquisition is consistent APB (Net-Ready KPP)
with the Global                                                                                  CJCSI 6212.01
                                                  Milestone A, B &
Information Grid          ISP (Information
                                                  C
policies and              Exchange                                                               DoDI 5000.02
architecture, to include Requirements)
relevant standards
9. Ensure that the
program has an
information assurance
                                                  Milestone A, B,                                DoDI 5000.02
strategy that is          Acquisition Information
                                                  C, FRPDR or
consistent with DoD       Assurance Strategy
                                                  equivalent*****                                DoDD 8580.01
policies, standards and
architectures, to include
relevant standards
This document is an accurate representation of the content posted on the DAG website for this Chapter, as of the date of
                                                                                                                      689
production listed on the cover. Please refer to the DAG website for the most up to date guidance at https://dag.dau.mil
10. Ensure, to the
maximum extent
practicable, (1) modular
contracting has been
used, and (2) the
program is being                                                      Milestone B or
implemented in phased, Acquisition Strategy                           the first Milestone
                                                                                          DoDI 5000.02
successive increments, page XX                                        that authorizes
each of which meets                                                   contract award
part of the mission need
and delivers
measurable benefit,
independent of future
increments
11. Register Mission-                                                 Milestone B,
Critical and Mission-    DoD IT Portfolio
                                                                                                 DoDI 5000.02
Essential systems with Repository                                     Update as
the DoD CIO 4/5                                                       required
This document is an accurate representation of the content posted on the DAG website for this Chapter, as of the date of
                                                                                                                      690
production listed on the cover. Please refer to the DAG website for the most up to date guidance at https://dag.dau.mil
Title 40/CCA Compliance Table Notes:
1. The system documents/information cited are examples of the most likely but not the
only references for the required information. If other references are more appropriate,
they may be used in addition to or instead of those cited. Include page(s) and
paragraph(s), where appropriate.
3. These actions are also required in order to comply with Section 811 of Public Law
106-398 (Reference (ag)).
4. For NSS, these requirements apply to the extent practicable (Title 40 U.S.C. 11103,
Reference (v)).
5. Only unclassified data may be entered into DoD Information Technology Portfolio
Repository. If the information about the system being registered is classified up to
SECRET collateral level, the system should be registered with the DoD CIO by entering
it into the DoD Secret Internet Protocol Router Network IT Registry.
One other topic not addressed in the Title 40/CCA Compliance Table is the Post-
Implementation Review (PIR), previously referred to as the Post-Deployment
Performance Review. See Section 7.9 of this guide for an in-depth discussion of the PIR.
7.8.6. Title 40, Subtitle III/Clinger-Cohen Act (CCA) Compliance Requirements
This section provides an overview of the actions stipulated in the Title 40/CCA
Compliance Table, which must be addressed and which ultimately lead to the DoD
CIO's confirmation of compliance for a MAIS or MDAP. The DoD Component Requirements
Authority, in conjunction with the Acquisition Community, is accountable for
requirements 1 through 5 of the table; the program manager (PM) is accountable for
requirements 6 through 11.
The PM should prepare a table similar to Table 7.8.4.T1, above, to indicate which
documents support the Title 40/CCA requirements. DoD Component CIOs should use
those supporting documents to assess and confirm Title 40/CCA compliance. For in-
depth coverage of each Title 40/CCA requirement, refer to the CCA Community of
Practice as well as the links provided in subsections 7.8.6.1 through 7.8.6.11 and
section 7.9.
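The document-to-requirement mapping that the PM prepares can be thought of as a simple compliance matrix. The following sketch is purely illustrative, not a DoD tool or prescribed format; the data structure, function name, and abbreviated rows are assumptions. It shows one way to record the mapping and flag requirements that lack a cited supporting document:

```python
# Hypothetical sketch of a Title 40/CCA compliance matrix, loosely modeled on
# Table 7.8.4.T1. Requirement numbers and document names follow the table;
# the structure and the completeness check are illustrative only.

COMPLIANCE_MATRIX = {
    1: {"action": "Supports core, priority functions", "documents": ["ICD"]},
    5: {"action": "Analysis of alternatives", "documents": ["AoA"]},
    11: {"action": "Register with DoD CIO", "documents": ["DITPR entry"]},
}

def unmet_requirements(matrix, available_documents):
    """Return requirement numbers lacking any cited supporting document."""
    return sorted(
        number for number, row in matrix.items()
        if not any(doc in available_documents for doc in row["documents"])
    )

# Example: the program has an ICD and an AoA but has not yet registered.
print(unmet_requirements(COMPLIANCE_MATRIX, {"ICD", "AoA"}))  # -> [11]
```

A Component CIO's assessment would, of course, examine the cited documents themselves; the matrix only organizes the evidence.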
7.8.6.1. Determining that the Acquisition Supports the Core, Priority Functions of
the Department
Overview: This element of the Title 40/CCA asks if the function supported by a
proposed acquisition is something the Federal government actually needs to perform;
i.e., for the DoD, is the function one that we (DoD and/or its Components) must perform
to accomplish the military missions or business processes of the Department?
For Warfare Mission Area and Enterprise Information Environment functions, this
question is answered in the Joint Capabilities Integration and Development System
(JCIDS) process. Before a functional requirement or new capability enters the
acquisition process, the JCIDS process (See the JCIDS Manual) requires the
Sponsor/Domain Owner (hereafter referred to as the Sponsor) to conduct a series of
analyses. The result of these analyses is reported in an Initial Capabilities Document.
Ideally, these analyses will show that the acquisition supports core/priority functions that
should be performed by the Federal Government. Moreover, the analysis should
validate and document the rationale supporting the relationship between the
Department's mission (i.e., core/priority functions) and the function supported by the
acquisition.
Who is Responsible? The Sponsor with cognizance over the function leads the analysis
work as part of the JCIDS processes.
Implementation Guidance: Ensure that the JCIDS analytical work addresses the Title
40/CCA question by establishing the linkage between the mission, the function
supported, the capability gap and potential solutions. The following questions should be
helpful in determining whether a program supports DoD core functions:
based management is the establishment of outcome-based performance measures,
also known as measures of effectiveness (MOE), for needed capability. MOEs for
capabilities needed by the Warfighting and Enterprise Information Environment Mission
Areas are developed during a Capabilities-based Assessment (CBA) and recorded in a
validated Initial Capabilities Document. The Business Mission Area identifies outcome-
based performance measures during the business case development process and
records the approved measures in the business plan.
This section defines measurement terminology, relates it to DoD policy and provides
guidance for formulating effective outcome-based performance measures for IT
investments. For clarification, the various uses and DoD definitions of MOEs are
provided in the CCA Community of Practice (CoP). Regardless of the term used, the
Title 40/CCA states that the respective Service Secretaries shall:
In summary, we are obligated to state the desired outcome, develop and deploy the
solution, and then measure the extent to which we have achieved the desired outcome.
For further discussion, see the Title 40/CCA language in OMB Circular A-11, Part 7,
Page 16 of Section 300, Part ID. Additionally, discussions on the statutory basis and
regulatory basis for MOEs and their verification are available in the IT-CoP.
Who is Responsible?
    •    The program Sponsor with cognizance over the function oversees the
         development of the MOEs during the CBA phase of the JCIDS process. The
         Sponsor ensures that the MOEs are outcome-based standards for the validated
         capabilities.
            o The PM must be aware of the MOEs and how they relate to overall
                program effectiveness and document these MOEs in the Exhibit 300 that
                is part of DoD's budget submission to OMB.
    •    The DoD CIO assesses the outcome-based measures in deciding whether to
         confirm Title 40/CCA compliance for ACAT IA programs and recommend Section
         801 (2366(a)) (or subsequent defense authorization provision) compliance to the
         Milestone Decision Authority (MDA) for ACAT ID programs.
Implementation Guidance: This section is written to help the Sponsor prepare the MOEs
and to help the PM understand his/her role in the MOE refinement process. The key to
understanding and writing MOEs for IT investments is to recognize their characteristics
and source. Therefore, MOEs should be:
To satisfy the requirement that an MOE be independent of any solution and not specify
system performance or criteria, the MOE should be established before the Materiel
Solution Analysis phase because the MOEs guide the analysis and selection of
alternative solutions leading up to Milestone A. Although the MOE may be refined as a
result of the analysis undertaken during this phase, the source of the initial
mission/capability MOE is the functional community. The MOE is the common link
between the Initial Capabilities Document (ICD), the Analysis of Alternatives (AoA) and
the benefits realization assessment conducted during a PIR as described in Section 7.9
of this guide.
As stated in Table 8 of DoD Instruction 5000.02, for a weapon system with embedded
IT and for command and control systems that are not themselves IT systems, it shall be
presumed that the acquisition has outcome-based performance measures linked to
strategic goals and that they are likely to be found in a JCIDS document (ICD,
Capability Development Document (CDD) or Capability Production Document (CPD)).
Note however that the presumption exists because the JCIDS requires the development
of MOEs. For Title 40/CCA confirmation, approved MOEs are required to be presented
to the DoD Component CIO.
For further MOE writing guidance, see the Information Technology Community of
Practice Measures of Effectiveness Area.
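As a purely illustrative aid (not JCIDS or DoD guidance), the sketch below records an MOE as a mission outcome with a quantitative target, together with a crude screen for solution-specific language, reflecting the principle above that an MOE should be independent of any solution. All field names, terms, and values are assumptions:

```python
# Illustrative only: one way to record an outcome-based MOE so it stays
# independent of any particular materiel solution. Field names are
# assumptions, not prescribed by JCIDS or Title 40/CCA.
from dataclasses import dataclass

@dataclass(frozen=True)
class MeasureOfEffectiveness:
    capability: str  # mission capability from the ICD, not a system name
    outcome: str     # the mission outcome to be achieved
    target: float    # quantitative threshold for the outcome
    unit: str

    def describes_a_solution(self) -> bool:
        """Crude screen: an MOE should not state system-level performance."""
        system_terms = ("throughput", "latency", "mtbf", "cpu")
        text = f"{self.capability} {self.outcome}".lower()
        return any(term in text for term in system_terms)

moe = MeasureOfEffectiveness(
    capability="Time-sensitive targeting",
    outcome="Fraction of validated targets engaged within the required window",
    target=0.9,
    unit="ratio",
)
print(moe.describes_a_solution())  # -> False
```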
Overview: This element of the Title 40/CCA asks if the business process or mission
function supported by the proposed acquisition has been designed for optimum
effectiveness and efficiency. Title 40/CCA requires the DoD Component to analyze its
mission, and based on the analysis, revise its mission-related processes and
administrative processes as appropriate before making significant investments in IT.
There are a number of ways to accomplish this requirement, but this is known as
business process reengineering (BPR) and is used to redesign the way work is done to
improve performance in meeting the organization's mission while reducing costs.
To satisfy this requirement, BPR is conducted before entering the acquisition process.
However, when the results of the JCIDS analysis, including the AoA, result in a
Commercial-Off-The-Shelf (COTS) enterprise solution, additional BPR is conducted
after program initiation, to reengineer an organization's retained processes to match
available COTS processes. As stated in Table 8 of DoD Instruction 5000.02, for a
weapon system with embedded IT and for command and control systems that are not
themselves IT systems, it shall be presumed that the processes that the system
supports have been sufficiently redesigned if one of the following conditions exists: "(1)
the acquisition has a JCIDS document (ICD, CDD, CPD) that has been validated by the
Joint Requirements Oversight Council (JROC) or JROC designee, or (2) the MDA
determines that the AoA is sufficient to support the initial Milestone decision."
Who is Responsible?
    •    The Sponsor with cognizance over the function with input from the corresponding
         DoD Component functional sponsor is responsible for BPR.
    •    The PM should be aware of the results of the BPR process and should use the
         goals of the reengineered process to shape the acquisition.
    •    The Director of the Office of the Secretary of Defense (OSD), Cost
         Assessment and Program Evaluation (OD/CAPE) assesses an ACAT IAM
         program's AoA to determine the extent to which BPR has been conducted.
    •    The DoD CIO assesses an ACAT IAM program's AoA to determine whether
         sufficient BPR has been conducted.
Benchmarking is necessary for outcome selection and BPR. The Sponsor should
quantitatively benchmark agency outcome performance against comparable outcomes
in the public or private sectors in terms of cost, speed, productivity, and quality of
outputs and outcomes.
         successful BPR implementation. Qualitative data from a benchmarking analysis
         is especially valuable for this phase. It aids in working change management
         issues to bring about positive change.
    •    Implementation Phase: Initiate the plan of action and monitor the results.
         Continue to monitor the product or process that was benchmarked for
         improvement. Benchmark the process periodically to ensure the improvement is
         continuous.
Overview: This element of the Title 40/CCA asks if any private sector or other
government source can better support the function. This is commonly referred to as the
"outsourcing determination." The Sponsor determines that the acquisition MUST be
undertaken by DoD because there is no alternative source that can support the function
more effectively or at less cost. Note that for weapon systems and for command and
control systems, the need to make a determination that no private sector or Government
source can better support the function only applies to the maximum extent practicable.
As an example, consider that the DoD and the Department of Homeland Security have
common interests; in such cases another Government source may already support a
comparable function. This requirement should be presumed to be satisfied if the
acquisition has an MDA-approved acquisition strategy.
Who is Responsible?
    •    The Sponsor with cognizance over the function leads the analysis work as part of
         the AoA process.
    •    The PM updates and documents the supporting analysis in the AoA and a
         summary of the outsourcing decision in the Acquisition Strategy.
Overview: The Director of the Office of the Secretary of Defense (OSD), Cost
Assessment and Program Evaluation (OD/CAPE), provides basic policies and guidance
associated with the AoA process. Detailed AoA guidance can be found in Chapter 3.3,
Analysis of Alternatives. For ACAT ID and IAM programs, OD/CAPE prepares and
approves the AoA study guidance, approves the Component-prepared AoA study plan,
and reviews the final analysis products (briefing and report). After the review of the final
products, OD/CAPE provides an independent assessment to the MDA (see DoD
Instruction 5000.02, Enclosure 7, paragraph 5). See Section 3.3 of this guidebook for a
general description of the AoA and the AoA Study Plan.
approach to selecting the most efficient and cost-effective strategy for satisfying an
agency's need. See Sections 3.6 and 3.7 of this guidebook for detailed EA and LCC
estimate guidance.
7.8.6.8. The acquisition is consistent with the Global Information Grid (GIG)
policies and architecture
Overview: The GIG is the organizing and transforming construct for managing IT for the
Department. See Section 7.2.1.2 for detailed guidance on GIG policies and architecture.
7.8.6.9. The program has an Information Assurance (IA) strategy that is consistent
with DoD policies, standards and architectures
Overview: IA concerns information operations that protect and defend information and
information systems by ensuring their availability, integrity, authentication,
confidentiality, and non-repudiation. This includes providing for the restoration of
information systems by incorporating protection, detection and reaction capabilities. See
Section 7.5 of this guidebook for detailed guidance on IA.
Who is Responsible?
Implementation Guidance: See Section 4.5.4 of this guidebook for a discussion of Open
Systems Approach as a systems engineering technique that will support modularity, and
Section 39.103 of the Federal Acquisition Regulations for a detailed discussion of
Modular Contracting.
7.8.6.11. DoD Information Technology (IT) Portfolio Repository (DITPR)
Overview: The DITPR (requires login) supports the Title 40/CCA inventory
requirements and the capital planning and investment processes of selection, control,
and evaluation. The DITPR contains a comprehensive unclassified inventory of the
Department's mission-critical and mission-essential NSS and their interfaces. It is web-
enabled, requires a Common Access Card (CAC) to obtain access, and requires a user
account approved by a DoD Component or DoD IT Portfolio Management (PfM) Mission
Area or Domain Sponsor. There is a separate inventory on the Secret Internet Protocol
Router Network (SIPRNET) called the DoD SIPRNET IT Registry, which requires a
separate user account to obtain access. DoD Components provide their IT systems
inventory data to either DITPR or the DoD SIPRNET IT Registry; there is no overlap
between the two repositories. Data is entered into DITPR by one of two means. For the
Army, Air Force, Department of the Navy (Navy, US Marine Corps), and the TRICARE
Management Activity, data is entered into the DoD Component's IT inventory system
and uploaded to DITPR by batch update monthly. All other Components work directly
online in DITPR. The applicable policy and procedure document is the DoD IT Portfolio
Repository (DITPR) and DoD SIPRNET IT Registry Guidance, August 10, 2009.
Who is Responsible? The PM is responsible for ensuring the system is registered and
should follow applicable DoD CIO procedures and guidance.
Use of the DITPR for Decision Making: The DITPR and the DoD SIPRNET IT Registry
are the Department's authoritative inventories of IT systems. They provide senior DoD
decision makers a coherent and contextual view of the capabilities and associated
system enablers for making resource decisions and a common central repository for IT
system information to support the certification processes of the various Investment
Review Boards (IRBs) and the Defense Business Systems Management Committee
(DBSMC). DITPR provides consistent automated processes across the DoD
Components to meet compliance reporting requirements (e.g., Ronald W. Reagan
National Defense Authorization Act for Fiscal Year 2005 (NDAA), Federal Information
Security Management Act of 2002 (FISMA), E-Authentication, Privacy Act, Privacy Impact
Assessments, Social Security Number Reduction, Records Management, and
Interoperability). DITPR also enables the Mission Areas and the Components to
accomplish IT PfM.
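The registry-routing rule described above and in table note 5 (unclassified system information is registered in DITPR; information classified up to SECRET collateral goes to the DoD SIPRNET IT Registry) can be summarized as follows. The function and its labels are illustrative assumptions, not an official tool:

```python
# Illustrative sketch of the registry-routing rule: unclassified system
# information is registered in DITPR; information classified up to SECRET
# collateral level goes to the DoD SIPRNET IT Registry.

def registry_for(classification: str) -> str:
    """Return the registry that accepts system information at this level."""
    level = classification.strip().upper()
    if level == "UNCLASSIFIED":
        return "DITPR"
    if level in ("CONFIDENTIAL", "SECRET"):
        return "DoD SIPRNET IT Registry"
    # Information above SECRET collateral is outside the rule stated here.
    raise ValueError(f"No registry defined here for level: {classification}")

print(registry_for("Unclassified"))  # -> DITPR
print(registry_for("SECRET"))       # -> DoD SIPRNET IT Registry
```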
7.8.7.1. Background
Since the enactment of the Information Technology Management Reform Act of 1996,
currently referred to as the Title 40/CCA, the DoD CIO has overseen the Title 40/CCA
implementation of ACAT I and IA weapons and automated information systems, in
accordance with the provisions of DoDI 5000.02. Under the risk-based oversight policy,
the objective is to make DoD CIO oversight of Title 40/CCA compliance the exception.
Further, the risk-based Title 40/CCA compliance oversight enables the DoD CIO to
identify and implement a cost-effective means for ensuring Title 40/CCA compliance, by
providing a decision-making framework that helps shift Title 40/CCA oversight
responsibility to the DoD Component CIOs. In a risk-based oversight model, the DoD
Component CIOs oversee programs within their portfolios, commensurate with their
demonstrated level of capability across Title 40/CCA compliance areas.
These procedures are applicable to all MAIS programs and MDAPs, even those
delegated to the DoD Components. Nothing in these procedures detracts from
responsibilities described in DoDI 5000.02. The risk-based oversight process addresses
the manner and level of DoD CIO and DoD Component CIO involvement in oversight of
MAIS and MDAP programs. The process is initiated when the DoD Component CIO
conducts a self-assessment of Title 40/CCA compliance oversight capability.
This document asks a series of questions related to the implementation of oversight for
Title 40/CCA within DoD Components. The primary audience for this assessment is the
DoD Component CIO. These questions were derived from a range of resources,
including policy and guidance documents, feedback from a 2004-2005 Title 40/CCA
Assessment sponsored by the Office of the Assistant Secretary of Defense for Networks
and Information Integration/DoD Chief Information Officer (DoD CIO)/Deputy CIO
(DCIO) and USD(AT&L), and input from DoD personnel across multiple organizations
and functions. For further information, see the Risk-Based Oversight for Title 40/Clinger-
Cohen Act (CCA) Compliance folder in the Information Technology (IT) Community of
Practice.
7.9.1. Background
The Government Performance and Results Act (GPRA) Modernization Act of 2010
requires that Federal Agencies compare actual program results with established
performance objectives. In addition, Section 11313 of Subtitle III of Title 40 of the United
States Code (formerly known as Division E of the Clinger-Cohen Act (CCA), and hereinafter
referred to as "Title 40/CCA") requires that Federal Agencies ensure that outcome-
based performance measurements are prescribed for the Information Technology
(including National Security Systems (IT/NSS)) to be acquired and that these
performance measurements measure how well the IT/NSS supports the programs of
the Agency.
DoD Instruction 5000.02, Tables 2-1 and 2-2, identify this information requirement as a
Post-Implementation Review (PIR) and require a PIR for all acquisition program
increments at the Full-Rate Production Review/Full-Deployment Decision Review
(FRPDR/FDDR). To clarify this requirement: what is due at the FRPDR or FDDR is a
plan for conducting a PIR. The actual PIR is conducted, and a report is generated, after
Initial Operational Capability (IOC) and generally before Full Operational Capability.
(Refer to section 7.9.5 of this guidebook for specific PIR Implementation Steps.)
The Office of Management and Budget (OMB) in OMB Circular A-130 Chapter 8
paragraph b.1(c and d) prescribes PIR procedures within the capital planning and
investment control construct for measuring how well acquired IT supports Federal
Agency programs.
7.9.2. Overview
This section provides guidance on how to plan and conduct a PIR for a capability that
has been fielded and is operational in its intended environment. A PIR verifies the
measures of effectiveness (MOEs) of the Initial Capabilities Document (ICD) or the
benefits of a business plan and answers the question, "Did the Service/Agency get what
it needed, per the ICD/Business Plan, and if not, what should be done?"
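That question can be framed as a comparison of achieved outcomes against the MOE targets recorded in the ICD or business plan. The sketch below is a hypothetical illustration of that comparison; the measure names and values are invented:

```python
# Hypothetical PIR sketch: compare achieved outcomes against the MOE
# targets recorded in the ICD/business plan. Names and numbers are
# invented; only the shape of the comparison is the point.

moe_targets = {"targets_engaged_in_window": 0.90, "track_accuracy": 0.95}
achieved = {"targets_engaged_in_window": 0.86, "track_accuracy": 0.97}

def pir_findings(targets, achieved):
    """Return each MOE with its shortfall (positive = target not met)."""
    return {name: round(target - achieved[name], 3) for name, target in targets.items()}

# Keep only the measures where the Service/Agency did not get what it needed.
shortfalls = {m: gap for m, gap in pir_findings(moe_targets, achieved).items() if gap > 0}
print(shortfalls)  # -> {'targets_engaged_in_window': 0.04}
```

Measures that show a shortfall feed the "what should be done" half of the question, e.g., adjustments to the CDD/CPD or a DOTMLPF Change Recommendation for a subsequent increment.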
                          Table 7.9.2.T1. Potential PIR Information Sources
       Figure 7.9.3.F1. Identification, Development and Verification of Capability
PIRs provide important user feedback and consequently are a fundamental element of
evolutionary acquisition. Ideally, we want to understand how well a recently completed
increment meets the needs of users before finalizing the requirements for a subsequent
increment. In practice, however, the opportunity for such feedback depends on the level
of concurrency in the increment development schedule.
Additionally, changes in the environment may drive new requirements. The PIR gives
the Sponsor, the PM, and other stakeholders, such as DOT&E and CAPE, empirical
feedback to better understand DOTMLPF issues with the completed increment. This
feedback enables the acquisition principals to adjust or correct the Capability
Development Document/Capability Production Document and/or the DOTMLPF Change
Recommendation (DCR) for subsequent increments.
The final PIR Plan is due at the FRP/FD Decision Review.
    •    Timing of the PIR. The PIR should take place post-IOC, after a relatively stable
         operating environment has been established and the data identified in the PIR
         plan have been collected. The time frame for the PIR varies with the solution
         deployment strategy, but the PIR is to be executed and a report submitted prior
         to Full Operational Capability (FOC). If an FOC is not planned, the PIR is to be
         executed within one year of IOC.
    •    Identification of Scope.
    •    Identification of Stakeholders. Remember to consider those who will be tasked to
         provide resources.
    •    Team Composition. The PIR team should include, at minimum, the following:
             o Functional experts with working knowledge of the business area and its
                 processes;
             o People with relevant technical knowledge;
             o CIO representatives, functional sponsors, and Domain Owners;
             o Oversight representatives.
    •    Identification of information sources. The ICD or Business Plan that articulated
         the outcome-based performance measures, or MOEs, is a good place to start.
         Additional data can be gleaned from operations conducted in wartime and during
         exercises. The lead time for most major exercises is typically one year and
         requires familiarity with the exercise design and funding process. Sources to
         consider are found in Table 7.9.2.T1.
    •    Analysis approach. The analysis approach is key to defining the structure and
         metadata of the information to be collected. For example, the definition of return
         on investment (ROI) in the Economic Analysis will drive the analysis approach of
         achieved ROI and the data to be collected.
    •    Reporting. The report describes the execution of the PIR, addresses the
         capability gaps that the IT/NSS investment was intended to fill, addresses the
         degree to which the gaps were filled, and recommends actions to mitigate
         unfilled capability gaps.
    •    Resource requirements. Identify the sources of resources such as manpower,
         travel, analysis tools, communications and other needs unique to the program.
         Demonstrate agreement by the resource providers by including them on the
         chop page or citing existing agreements.
    •    Schedule.
    •    Routing of the draft and final PIR Plan should include the identified stakeholders.
         An information copy of the final PIR Plan and Report is sent to the Senior Military
         Advisor DOT&E and the Program's CAPE Action Officer.
The PIR should be carried out according to the PIR planning that was reviewed and
approved at the FRPDR/FDDR. Care should be given to the quality of the raw data. Based
on the PIR plan, the PIR should, at a minimum, address:
    •    Customer Satisfaction: Are the users satisfied that the IT investment meets their
         needs?
    •    Mission/Program Impact: Did the implemented capability achieve its intended
         impact?
    •    Confirmation that the validated need has not changed; if it has changed, address
         the change as part of the course of action provided in the PIR report.
    •    A measure of the MOE found in the ICD.
    •    Benefits such as the ROI found in the business plan. Compare actual project
         costs, benefits, risks, and return information against earlier projections.
         Determine the causes of any differences between planned and actual results.
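The planned-versus-actual comparison described in the last bullet is simple variance arithmetic. As a minimal sketch only (the field names and the basic net-benefit/cost ROI formula are illustrative assumptions, not prescribed by this guidebook or the Economic Analysis):

```python
# Illustrative sketch of a PIR planned-vs-actual benefits comparison.
# The dictionary keys and the ROI formula (net benefit / cost) are
# assumptions for illustration; a real PIR uses the ROI definition
# from the program's own Economic Analysis.

def roi(benefits: float, costs: float) -> float:
    """Return on investment computed as (benefits - costs) / costs."""
    return (benefits - costs) / costs

def variance_report(planned: dict, actual: dict) -> dict:
    """Per-metric delta (actual - planned), plus planned and achieved ROI."""
    report = {k: actual[k] - planned[k] for k in planned}
    report["roi_planned"] = roi(planned["benefits"], planned["costs"])
    report["roi_actual"] = roi(actual["benefits"], actual["costs"])
    return report

planned = {"costs": 10.0, "benefits": 14.0}  # $M, projected in the Economic Analysis
actual = {"costs": 12.0, "benefits": 15.0}   # $M, collected per the PIR plan

print(variance_report(planned, actual))
# costs overran by 2.0; achieved ROI of 0.25 versus a planned 0.40
```

The deltas and the ROI gap become PIR findings; the causes of each difference are then investigated and reported, as the bullet above directs.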
The analysis portion of the PIR should answer the question, "Did we get what we
needed?" This provides a contrast to the test and evaluation measurements of key
performance parameters that answer the question, "Did we get what we asked for?"
This implies that the PIR should assess, where possible, the extent to which the DoD's
investment decision-making processes were able to capture the warfighter's/user's initial
intent. The PIR should also address whether warfighter/user needs changed during
the time the system was being acquired. The outputs of the analysis become the PIR
findings. The findings should clearly identify the extent to which the warfighters got what
they needed.
Based on the PIR findings, the PIR team prepares a report and makes
recommendations that can be fed back into the capabilities and business needs
processes. The primary recipient of the PIR report is the Sponsor who articulated the
original objectives and outcome-based performance measures on which the program or
investment was based.
A copy of the PIR report is also forwarded to DOT&E and CAPE. DOT&E will use the
results to confirm the effectiveness and suitability assessment made during IOT&E and
possibly to improve the test planning and execution for follow-on increments or similar
systems (see Section 9.7.9). CAPE will compare the benefits of selected programs to
those presented in the Economic Analysis.
The results of the PIR can also aid in refining requirements for subsequent increments.
Recommendations may be made to correct errors, improve user satisfaction, or improve
system performance to better match warfighter/business needs. The PIR team should
also determine whether different or more appropriate outcome-based performance
measures should be developed to enhance the assessment of future spirals or similar
IT investment projects.
For further guidance on PIRs, see the Information Technology Community of Practice
Post-Implementation Review Area, which contains additional guidance.
Both government and the commercial sector address the practice of conducting a PIR
for materiel, including software IT investments. The Government Accountability Office
and several not-for-profit organizations have written on the subject of measuring
performance and demonstrating results. The Clinger-Cohen Act Community of Practice
PIR Area lists a number of key public and private sector resources that can be used in
planning and conducting a PIR.
7.10.1. Mandatory Policies
Subtitle III of Title 40 of the United States Code (formerly Division E of the
Clinger-Cohen Act (CCA), and referred to as "Title 40/Clinger-Cohen Act") and DoD
Instruction 5000.02, Enclosure 2, paragraphs 4.c.(6) and 5.d.(1)(b)3, both require the use
of COTS IT solutions to the maximum practical extent.
7.10.2. Definition
[From the Twelfth Edition of GLOSSARY: Defense Acquisition Acronyms and Terms.]
The following bullets quote or paraphrase sections in the DoD 5000 series that
specifically address COTS:
Hence, commercially available products, services, and technologies are a first priority
for acquisition solutions.
Modifying the core code of a COTS product should be avoided. It is possible to add
code to the existing product, to make the product operate in a way it was not intended to
do "out-of-the-box." This, however, significantly increases program and total life-cycle
costs, and turns a commercial product into a DoD-unique product. The business
processes inherent in the COTS product should be adopted, not adapted, by the
organization implementing the product. Adopting a COTS product is done through
business process reengineering (BPR). This means the organization changes its
processes to accommodate the software, not vice versa. There will, however, be
instances where BPR is not possible. For example, due to policy or law, it may be
necessary to build or acquire needed reports, interfaces, conversions, and extensions.
In these cases, adding to the product must be done under strong configuration control.
In cases where a particular COTS product does not provide the entire set of required
functionality, a "bolt-on" could be used. A bolt-on is not part of the COTS software
product, but is typically part of a suite of software that has been certified to work with the
product to provide the necessary additional functionality. These suites of software are
integrated to provide the full set of needed functionality. Using a bolt-on, however, also
increases program and total life-cycle costs.
See section 7.10.6.3 for a more detailed discussion of reports, interfaces, conversions,
and extensions.
The actions below are unique to acquiring COTS Information Technology solutions.
These activities should occur within a tailored, responsive, and innovative program
structure authorized by DoD Instruction 5000.02. The stakeholder primarily responsible
for each action is shown at the end of each bullet.
    •    Define strategy and plan for conducting BPR during COTS software
         implementation phase of the program.
         (Sponsor/Domain Owner)
    •    Consider COTS and BPR when developing the Analysis of Alternatives. (See
         section 3.3 and Table 7.8.4.T1 of this guidebook).
         (Sponsor/Domain Owner)
    •    Consider commercially available products, services, and technologies when
         defining initial user needs in the Initial Capabilities Document.
         (Sponsor/Domain Owner)
    •    When developing the Technology Development Strategy and/or the Acquisition
         Strategy, consider commercial best practice approaches and address the
         rationale for acquiring COTS.
         (Program Manager (PM))
         best practice business rules inherent in the COTS product. Define a process for
         managing and/or approving the development of reports, interfaces, conversions,
         and extensions.
    •    Ensure scope and requirements are strictly managed and additional reports,
         interfaces, conversions, and extensions objects are not developed without prior
         authorization.
         (Program Manager (PM))
    •    Ensure adequate planning for life-cycle support of the program. See section 3.4,
         Engineering for life-cycle support, of "Commercial Item Acquisition:
         Considerations and Lessons Learned".
    •    Conduct ongoing engineering and integration for sustainment activities
         throughout the life cycle of the program.
7.10.6.1. DoD Enterprise Software Initiative (ESI)
The DoD Enterprise Software Initiative (DoD ESI) is a joint, Chief Information Officer
(CIO)-sponsored project designed to: "Lead in the establishment and management of
enterprise COTS information technology (IT) agreements, assets, and policies for the
purpose of lowering total cost of ownership across the DoD, Coast Guard and
Intelligence communities." DoD ESI is a key advisor to the DoD Strategic Sourcing
Directors Board. With active working members from OSD, Department of the Army,
Department of the Navy, Department of the Air Force, Defense Logistics Agency,
Defense Information Systems Agency, National Geospatial-Intelligence Agency,
Defense Intelligence Agency, Director of National Intelligence, and Defense Finance
and Accounting Service, the DoD ESI team collaborates to create Enterprise Software
Agreements (ESA) for use by DoD, the Intelligence Community, and U.S. Coast Guard
IT buyers. ESA negotiations and management activities are performed by IT acquisition
professionals within participating DoD Components, who are designated ESI "Software
Product Managers (SPM)." SPMs are supported by experienced IT contracting experts.
The DoD ESI can use the Defense Working Capital Fund to provide "up-front money"
for initial wholesale software buys and multi-year financing for DoD customers. This
funding process assures maximum leverage of the combined buying power of the
Department of Defense, producing large software discounts.
On-line resources include the DoD ESI website listing general products, services, and
procedures; the Defense Federal Acquisition Regulation Supplement Subpart 208.74;
DoD Instruction 5000.02, Enclosure 5, Paragraph 6; and DoD Component requirements
for compliance with DoD Enterprise Software Initiative policies.
7.10.6.2. SmartBUY
The General Services Administration (GSA) manages the SmartBUY Program, and
leads the interagency team in negotiating government-wide enterprise licenses for
software. The GSA SmartBUY Program focuses on commercial-off-the-shelf software
that is generally acquired using license agreements with terms and prices that vary
based on volume. The GSA SmartBUY Program was formally announced on June 2,
2003 in an Office of Management and Budget Memorandum to the federal agencies.
The DoD ESI Team has worked closely with the SmartBUY project since its inception,
and negotiates and manages many of the SmartBUY agreements as a partner to GSA.
The DoD ESI team implements SmartBUY within the DoD through the joint DoD Deputy
CIO and DPAP Policy Memorandum of December 22, 2005: Department of Defense
(DoD) Support to the SmartBUY Initiative. This policy mandates use of SmartBUY
agreements when user requirements match a product on SmartBUY, and also provides
the framework for migrating existing Enterprise Software Initiative Enterprise
Agreements to SmartBUY Enterprise Agreements. The OMB Memo establishes
requirements to be followed by federal departments and agencies. Specifically, federal
agencies are to: develop a migration strategy and take contractual actions as needed to
move to the government-wide license agreements as quickly as practicable; and
integrate agency common desktop and server software licenses under the leadership of
the SmartBUY team. This includes, to the maximum extent feasible, refraining from
renewing or entering into new license agreements without prior consultation with, and
consideration of the views of, the SmartBUY team.
The Federal Acquisition Regulation (FAR) Committee has developed draft regulations to
implement SmartBUY.
On September 14, 2010, the Director, Operational Test and Evaluation, signed an
updated memorandum entitled "Guidelines for Conducting Operational Test and
Evaluation of Information and Business Systems." The guidelines help streamline and
simplify COTS software testing procedures. They assist in tailoring pre-deployment test
events to the operational risk of a specific system increment acquired under OSD
oversight. For increments that are of insignificant to moderate risk, these guidelines
streamline the operational test and evaluation process by potentially reducing the
degree of testing. Simple questions characterize the risk and environment upon which
to base test decisions, for example, "If the increment is primarily COTS, or government
off-the-shelf items, what is the past performance and reliability?"
Section 881 of the FY 2008 National Defense Authorization Act (NDAA) requires the
Department to have a Clearing-House for Rapid Identification and Dissemination of
Commercial Information Technologies. To meet this need, a partnership between the
Under Secretary of Defense (USD) for Acquisition, Technology and Logistics (AT&L),
the Director of Defense Research and Engineering (DDR&E), the Defense Technical
Information Center (DTIC) and the Assistant Secretary of Defense for Networks and
Information Integration/DoD Chief Information Officer (DoD CIO) was formed to develop
a capability that 1) allows better visibility into the Department's technology needs, 2)
attracts non-traditional defense emerging technology suppliers, and 3) allows for review
and discussion of COTS IT products in wide use throughout the Department. This effort,
termed "DoD Techipedia," comprises both an internal, DoD CAC-only wiki-based
collaboration area and an external wiki (internal.dodtechipedia.mil or a separate
redirected .mil site) where DoD capability buyers and their representatives can
collaborate with industry on a range of technology areas. Regarding wide-use COTS IT
products, the objective is to raise the awareness of Government and commercial sector
practices relative to the use of COTS software.
7.11. Space Mission Architectures
A space mission architecture will be used by the MDA as the baseline or gold standard
of performance attributes of a mission capability to aid in key decisions regarding new
or improved capabilities. The architecture serves as one of many sources of information
that will be available to inform deliberations at milestone reviews, and uniquely provides
the overarching strategic viewpoint to ensure introduction of new capability will support
the global joint warfighter with performance against approved requirements, and result
in a high degree of interoperability with other elements of the architecture. The MDA will
need to understand important items such as the progress that the anticipated space
system provides in moving from the As-Is toward its contribution to the To-Be (in light of
other parallel efforts reflected within the architecture), the performance and capability
benefits to end users, remaining gaps and shortfalls, secondary impacts to other users,
alignment with the goals of the National Security Space Strategy, and the implications to
health and welfare of the supporting industrial base. Likewise, the MDA will use the
space mission architecture to ensure that inserting new capabilities does not disrupt
existing operations, generate unnecessary sustainment costs, or cause any other
unintended consequence that would necessitate an unanticipated expenditure of resources.
The offices responsible for development, maintenance, and content of the As-Is and To-
Be architectures will be the Executive Agents (EAs) of the Department of Defense for the
mission areas to which they have been assigned. The EA for Space is responsible for maintaining
architectures associated with space systems and space missions. For those mission
areas without an EA, such as communications, positioning, navigation, and timing (PNT),
etc., the Principal Staff Assistant will serve as the responsible party for the
architecture. The content of a space mission
architecture should include, but not be limited to, a comprehensive schematic (detailed
picture) of the architecture physical elements, space components, ground components,
data flow between components, interface specifications both internal and external to the
boundary of the architecture, performance specifications, logistics support, and
communication protocols, etc. Validation of each DoD mission area architecture from a
requirements perspective will be the responsibility of the Joint Staff, and each
architecture will ultimately be validated by the Joint Requirements Oversight Council as
comprehensive and necessary in meeting the needs of the warfighter. Architectures will
also be validated from an acquisition, requirements, resourcing, and policy perspective. The
Office of the Under Secretary of Defense (Acquisition, Technology & Logistics)
(USD(AT&L)) will validate that architecture updates are consistent with MDA decisions,
ODCAPE will validate that architectures are affordable and consistent with assigned
funding by the Department, and USD(Policy) will validate that architectures comply with
current policy. When a mission architecture does not receive validation from one or more
of these sources, it is incumbent upon the
office responsible for the mission architecture to assemble a joint session between
requirements, acquisition, resourcing, and policy to develop a refined architecture that
achieves unanimous validation. Conflicts between validation activities should be taken
to the Defense Space Council for resolution. Architectures will be revalidated when the
responsible offices, the Joint Staff, and ODCAPE jointly determine that an architecture
has changed sufficiently to warrant fresh review and validation, or at the
discretion of the DEPSECDEF. Offices responsible for architecture development and
maintenance should anticipate and resource for at least an annual update to their
respective mission area architectures.
DEFENSE ACQUISITION GUIDEBOOK
Chapter 8 - Intelligence Analysis Support to Acquisition
8.0. Introduction
8.0.1. Purpose
8.0.2. Contents
8.0.3. Applicability
8.0. Introduction
Early and incremental involvement and collaboration with the DoD Intelligence
Community (DoD IC) will help reduce program risks to schedule, cost, and performance.
Early collaboration also increases the likelihood that the delivered system will be fully
capable and more survivable against the relevant adversary threats.
Reduced risk to schedule is derived from the early identification of work to be performed
by the DoD IC, proper tasking of the DoD IC at the appropriate acquisition milestone
through production requirements, identification of capability gaps, costing, and
negotiated delivery dates for products.
Reduced risk to cost is derived from the earliest identification of the costs and resource
strategies to realize the intelligence support needed to close capability gaps throughout
the acquisition life-cycle. Collaboration with the DoD IC assists both the DoD IC and the
acquisition communities in determining the costs to be borne by the DoD IC and the
costs to be borne by the program.
8.0.1. Purpose
The purpose of this chapter is to enable the PM to use intelligence information and data
to ensure maximum war-fighting capability at the minimum risk to cost and schedule.
8.0.2. Contents
The program may require intelligence analysis of foreign threat capabilities integral to
the development of future U.S. military systems and platforms over the life of the
program. Identifying projected adversarial threat battlefield capabilities and evolving
scientific and technical developments that affect a program or a capability's design or
implementation is crucial to successful development, employment, and sustainment
processes.
This section explains how PMs can successfully account for signatures and other
intelligence mission data (IMD) during system and sensor acquisition for building target
models, developing algorithms, optimizing sensor design, and validating sensor
functionality. As requirements for smarter, interoperable platforms and systems grow,
the need for signatures and other IMD will continue to trend upward.
This section explains how PMs complete the Intelligence Certification and threat
validation required by the Joint Staff in support of the Joint Capabilities Integration and
Development System (JCIDS) process.
8.0.3. Applicability
This chapter applies to programs that are dependent upon threat intelligence analysis,
signatures, and other IMD to enable mission capability in accordance with DoDI 5000.02
and DoDD 5250.01.
Programs dependent on signatures and other IMD are those that require data for
programming platform mission systems, in development, testing, operations, and
sustainment, to conduct combat identification; intelligence, surveillance, and
reconnaissance; and targeting using, but not limited to, the signatures and IMD
described above.
This Chapter does not apply to acquisitions by the DoD Components that involve a
Special Access Program (SAP) created under the authority of Executive Order 12958.
The unique nature of SAPs requires compliance with the special security procedures of
DoDD 5205.07.
The acquisition program documents discussed in Chapter 8 are listed below in Table
8.0.4.T1.
     Document: Capstone Threat Assessment (CTA)
          Prepare: During the capability shortfall identification process. (Maintained by
          the DoD Intelligence Community throughout the capability development and
          acquisition lifecycle.)
          Preparation Reference: JCIDS Manual; CJCSI 3312.01B; DIAI 5000.002

     Document: System Threat Assessment Report (STAR) / System Threat Assessment (STA)
          Prepare: Prior to Milestone A, task the supporting intelligence production
          center.
          Preparation Reference: DoDI 5000.02 E4, Table 3; DIAI 5000.002
                            Table 8.0.5.T1. Functional Offices in Chapter 8
       Figure 8.1.F1. Depiction of Life-Cycle Intelligence Analysis Requirements
Figure 8.1.F1 illustrates the range of support provided by the threat intelligence
community over the life of a particular capability shortfall identification process and
resulting system acquisition program. Capstone Threat Assessments (CTA) inform the
capability shortfall identification process as well as during early phases of system
acquisition prior to the generation of a STAR/STA. The CTAs project foreign capabilities
in particular warfare areas looking out 20 years.
At the beginning of the Materiel Solution Analysis phase, the program office or capability
sponsor should contact the appropriate intelligence production center to support
integration of validated threat information into the Technology Development Strategy.
Threat information may come from DIA-validated Capstone Threat Assessments or
other DIA/Service validated STARs/STAs that align with the capability mission,
CONOPs, and employment timeline.
Once the capability sponsor, program manager, or other appropriate enabler identifies
concepts or prototypes for the materiel solution, the program office or capability sponsor
should task the appropriate intelligence production center for the lead service to
produce the System Threat Assessment Report (STAR) for Acquisition Category
(ACAT) I/ Major Defense Acquisition Programs (MDAPs) and the System Threat
Assessment (STA) for ACAT II programs in accordance with the regulations of that
service. The program office needs to work with the producing intelligence center to
provide system specific characteristics, employment CONOPS, and employment
timeline as they evolve. The program office must also work with the appropriate Service
Intelligence Production Center to identify Critical Intelligence Parameters (CIPs) and
ensure production requirements are levied against those CIPs.
policy planning guidance and an intelligence assessment of present trends, patterns
and conditions, combined with validated parametric, characteristics/performance and
employment data needed for development, testing, and/or training. When combined
with information on appropriate friendly and neutral (Blue/Gray/White*) systems, it
represents an extrapolation of the total security environment in which the system is
expected to operate. A package comprises a scenario, concept of operations, and
integrated data used by the DOD components as a foundation for strategic analyses.
Examples of analytical baselines include scenarios and supporting data used for
computer assisted war games and theater campaign simulations.
* The three colors reflect three different entities. Blue represents U.S. system data, Gray
represents U.S.-produced but foreign-operated system data, and White represents
neutrals. When doing long-term analysis, the impact of Blue systems must be taken in
light of friendly and neutral systems.
CTAs provide the bedrock analytical foundation for threat intelligence support to the
defense acquisition process. CTAs, covering major warfare areas, present the DoD
Intelligence Community validated position with respect to those warfare areas and will
constitute the primary source of threat intelligence for the preparation of Initial Threat
Environmental Assessments, STARs/STAs, and threat sections of documents
supporting the JCIDS process. In order to effectively support both the capability
development and acquisition processes, CTAs are not specific to existing or projected
US systems, cover the current threat environment, and, in general, project threats out
20 years from the effective date of the CTA. With the lead intelligence production
center, DIA's Defense Warning Office (DIA/DWO) co-chairs the Threat Steering Group
(TSG) that produces and reviews the document. CTAs should be updated as
determined by the responsible TSG but in any case every 24 months. DIA validates all
CTAs.
    WARFARE AREA                 PRIMARY PRODUCTION OFFICE OR CENTER
    Air Warfare                  National Air and Space Intelligence Center (NASIC)
    Chemical, Biological and
    Radiological Defense         Defense Intelligence Agency (DIA)
    Information Operations       DIA/Joint Information Operations Threat Working Group
    Land Warfare                 National Ground Intelligence Center (NGIC)
    Missile Defense              Defense Intelligence Agency (DIA)
    Naval Warfare                Office of Naval Intelligence (ONI)
    Space Warfare                National Air and Space Intelligence Center (NASIC)
    The Capstone Threat Assessments can be found at the JWICS or SIPRNET
    websites of the primary production office or center.
Commercial 434-956-2170
DSN 521
The Defense Intelligence Agency (DIA) provides validation for System Threat
Assessment Reports (STARs), prepared by the appropriate Service, to support
Acquisition Category (ACAT) ID/ Major Defense Acquisition Programs (MDAPs).
Appropriate Defense Intelligence organization(s), identified by the component
headquarters intelligence organizations, prepare the STAR. The assessment should be
kept current and validated throughout the acquisition process. DoD Instruction 5000.02
requires that MDAPs have a validated STAR in place at Milestones B and C (and at
program initiation for shipbuilding programs). The assessment should be system
specific, to the degree that the system definition is available at the time the assessment
is being prepared, and should address projected adversary capabilities at system initial
operating capability (IOC) and at IOC plus 10 years. DIA will co-chair the TSGs for
ACAT ID STARs with the producing command or center. STARs for ACAT IC MDAPs
and STAs for ACAT II non-MDAPs are prepared and validated by the lead service in
accordance with service regulations. DIA Instruction 5000.002 describes the required
STAR elements and format.
Critical Intelligence Parameters (CIPs) are established and examined through the joint
and collaborative efforts of the intelligence, capability sponsor, and acquisition
management community to aid in developing intelligence production requirements to
support an acquisition program. CIPs are those key performance thresholds of foreign
threat systems which, if exceeded, could compromise the mission effectiveness of the
U.S. system in development. Adversary military doctrine, tactics, strategy, and expected
employment of systems should be considered in the CIPs. Program specific CIPs, and
their associated production requirements, are a key part of a STAR and will be required
for validation. The inclusion of CIPs is also encouraged for STAs. If a CIP is breached,
the responsible intelligence production center will notify the program office and
DIA/DWO in accordance with DIA Instruction 5000.002. DIA/DWO will notify the
appropriate organizations in the Office of the Secretary of Defense.
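The breach condition described above can be sketched as a simple threshold check. This is an illustrative sketch only: the CIP name, threshold value, and field names below are hypothetical, not drawn from any real assessment or system.

```python
from dataclasses import dataclass


@dataclass
class CriticalIntelligenceParameter:
    """One CIP: a foreign-threat performance level assumed in U.S. system design."""
    name: str
    threshold: float
    units: str


def breached(cip: CriticalIntelligenceParameter, reported_value: float) -> bool:
    """A CIP is breached when reported foreign-system performance exceeds the
    threshold the U.S. system design assumed, triggering notification of the
    program office and DIA/DWO per DIA Instruction 5000.002."""
    return reported_value > cip.threshold


# Hypothetical example: the design assumed an adversary radar detects at <= 200 km.
radar_range_cip = CriticalIntelligenceParameter(
    "adversary_radar_detection_range", 200.0, "km"
)
print(breached(radar_range_cip, 220.0))  # a reported 220 km range breaches the CIP
```

In practice the comparison direction depends on the parameter (a shorter adversary reaction time could also be a breach), so each CIP would carry its own breach rule.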
At the discretion of the responsible TSG, STARs/STAs can be used to support multiple
programs that address similar performance attributes, share an employment CONOPs,
and have a similar employment timeline. Individual system descriptions and CIPs are
still required to support the generation of the STAR.
Major Automated Information System (MAIS) programs use the Joint Information
Operations Working Group and DIA-validated Information Operations (IO) Capstone
Threat Assessment or service produced System Threat Assessment Report. DIA will
validate service produced ACAT IAM STARs when the IO CTA is not used. Non-MAIS
programs are encouraged to use the IO Capstone Threat Assessment or service
produced System Threat Assessment Report as their threat baseline. MAIS programs
still need to provide system descriptions, as well as the CIPs and production
requirements that are specific to their program's needs.
As noted above, for Major Defense Acquisition Programs (MDAPs) subject to Defense
Acquisition Board review, the Defense Intelligence Agency (DIA) validates System
Threat Assessment Reports (STARs) for Acquisition Category (ACAT) ID/ Major
Defense Acquisition Programs (MDAPs). STARs for ACAT IC MDAPs and System
Threat Assessments for ACAT II programs are validated by the appropriate service. DIA
validation assesses the appropriateness and completeness of the intelligence,
consistency with existing intelligence positions, and the use of accepted analytic
tradecraft in developing the assessments. Working with its partners in the DOD
intelligence community and, as needed, in the larger intelligence community, validation
is intended to ensure that all relevant data is considered and appropriately used by
author(s) of the assessment.
The Test and Evaluation Master Plan should define specific intelligence requirements to
support program operational test and evaluation. When requested by the appropriate
authority in the offices of the Director, Operational Test and Evaluation (DOT&E) or the
Under Secretary of Defense for Acquisition, Technology and Logistics (USD(AT&L)),
DIA, working with the Department of Defense Intelligence Community (DoD IC), will
provide additional intelligence support to the operational testing of programs on the
annual DOT&E Oversight List. DIA support will not include the validation of specific
testing scenarios or the validation of "Blue" (see paragraph 8.1) surrogate systems or
platforms, but can include certification that the threat information in the test plan is
correct and consistent with existing assessments.
Per DoD Instruction 5000.02, certain programs on the DOT&E Oversight List are to be
considered as MDAPs for testing and evaluation purposes and will require a System
Threat Assessment Report regardless of Acquisition Category designation.
8.2.1. Signature and other Intelligence Mission Data support in the Technology
Development Strategy (TDS)
8.2.2. Distributed DoD Signatures and other Intelligence Mission Data Pool and
Standards
The first step for managers involved with acquisition efforts and programs is to identify
any requirement for intelligence analysis related to enabling mission capability. The data
derived from this analysis and needed by acquisition programs is commonly referred to
as signatures and other IMD. See definitions at Section 8.0.3, Applicability.
Further, Services have liaisons with expertise in both intelligence and acquisitions.
These professionals know how to interface with the DoD IC and are typically part of the
acquisition team (or are accessible to the team).
Plan (LSSP) (per DoD Directive 5250.01) and defines overall signature support
requirements and compliance with signature standards in the Capability Development
Document and Capability Production Document (per CJCS Instruction 3312.01 , "Joint
Military Intelligence Requirements Certification"). Under CJCS Instruction 3312.01, the
SSP uses the LSSP to assess the ability of the signatures community to support a
program's signature requirements.
8.2.1. Signature and other Intelligence Mission Data support in the Technology
Development Strategy (TDS)
DoD Directive 5250.01 requires that signature support requirements and funding be
incorporated into a program's acquisition strategy. Per PDUSD AT&L Memo, 20 APR
2011 Document Streamlining Program Strategies and Systems Engineering Plan, the
TDS should provide a table that indicates the program life-cycle signature support
requirements. Life-cycle signature support funding requirements will be reflected in the
TDS program funding summary. [ Technology Development Strategy Memo ] If required
signatures are not already available in the distributed national signatures pool, the
program will need to plan and budget for development of these signatures. Stating in
the TDS that a program is signature dependent and will identify requirements in a Life-
cycle Signature Support Plan ensures that the Program Office has considered signature
development resource needs in the program planning and budgeting process.
8.2.2. Distributed DoD Signatures and other Intelligence Mission Data Pool and
Standards
DoD Directive 5250.01 requires that all signatures provided for the DoD be made
available through a distributed DoD signature pool and adhere to established standards.
Whether developed by a government signature center or by a contractor, if the
signatures and other IMD are made available through a distributed pool, they can be
shared to prevent duplication of work and cost across the DoD.
An essential element to make this possible is the use of standards to ensure common
meta-data tags and processing methods are used. This in turn ensures the signatures
will be discoverable in the distributed pool and that the signatures will be usable for
multiple customers, including acquisition programs and operational systems.
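The role of common meta-data tags can be sketched as a single filter over a shared pool of records. This is an illustrative sketch only: the field names and values below are hypothetical stand-ins for whatever tags the published signature standards actually require.

```python
# Hypothetical records in a distributed signature pool; real records would carry
# the meta-data tags mandated by the current signature standards.
signature_pool = [
    {"signature_id": "SIG-001", "type": "radar", "domain": "air", "producer": "govt_center"},
    {"signature_id": "SIG-002", "type": "acoustic", "domain": "naval", "producer": "contractor"},
    {"signature_id": "SIG-003", "type": "radar", "domain": "naval", "producer": "govt_center"},
]


def discover(pool, **tags):
    """Because every record uses the same tags, one filter makes signatures
    discoverable regardless of which center or contractor produced them."""
    return [record for record in pool if all(record.get(k) == v for k, v in tags.items())]


print([r["signature_id"] for r in discover(signature_pool, type="radar")])
# ['SIG-001', 'SIG-003']
```

Without agreed tags, a contractor-produced record might label the same attribute differently (say, `"sensor_kind"` instead of `"type"`), and the filter above would silently miss it, which is exactly the duplication-of-work problem the standards are meant to prevent.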
The Signatures Support Program provides single access point connectivity to the
distributed pool through web-pages on JWICS (http://ssp.dodiis.ic.gov/), SIPRNet
(http://dt.dia.smil.mil/ssp), and NIPRNet (site under development). (NOTE: These sites
cannot be accessed via the Internet or Non-secured Internet Protocol Router Network.)
Current signature standards are also available at these web-sites.
efforts to submit a Life-cycle Signature Support Plan (LSSP) throughout their respective
lifecycle. An LSSP is intended to facilitate collaboration and agreement between the
acquisition, requirements and intelligence communities regarding signatures, also
known as Intelligence Mission Data (IMD), which is DoD intelligence used for
programming platform mission systems in development, testing, operations, and
sustainment including, but not limited to the following functional areas: Intelligence
Signatures, Electronic Warfare Integrated Reprogramming (EWIR), Order of Battle
(OOB), Characteristics and Performance (C&P), and Geospatial Intelligence (GEOINT).
The LSSP defines specific technology and program IMD requirements. The DAU LSSP
webpage provides an LSSP template, instructions for LSSP completion, an LSSP
Signature Requirements Table template, and an example Contract Data Requirements
List form for procuring signature data. The LSSP Template provides an outline and
guidance that standardizes communication between the technology or program offices
and the intelligence community. The LSSP should contain as much detail as possible to
inform intelligence community production and collection decisions. Therefore, increasing
detail should be provided in each update and submission of the LSSP. Content
considerations for an LSSP by phase and milestone can be found below.
Since final materiel solutions have not yet been approved prior to Milestone A, specific
system configuration and detailed signature requirements are generally not known.
However, based on the intended operational mission, the program should identify the
IMD type(s) (e.g. Radar, Thermal, Acoustic, EWIR, GEOINT, etc.), the domain (e.g.
Space, Air, Land, Naval, Missile Defense, etc.), data fidelity (e.g. cueing quality), and
possibly sub-categories within a domain (e.g. for Air: Fighter Aircraft) for each
subsystem that requires the data. To the level that specific requirements are known,
they should be stated.
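The kind of entry a pre-Milestone A requirements statement might carry can be sketched as a record that simply omits what is not yet known. This is an illustrative sketch only: the subsystem name, fidelity, and category values are hypothetical examples, not drawn from any real program or LSSP template.

```python
def known_fields(requirement: dict) -> dict:
    """Before Milestone A many specifics are unknown; record only what is known
    and let detail grow with each subsequent LSSP update and submission."""
    return {k: v for k, v in requirement.items() if v is not None}


# Hypothetical pre-Milestone A entry for one subsystem that needs the data.
seeker_requirement = {
    "subsystem": "seeker",               # hypothetical subsystem name
    "imd_type": "Radar",                 # e.g. Radar, Thermal, Acoustic, EWIR, GEOINT
    "domain": "Air",                     # e.g. Space, Air, Land, Naval, Missile Defense
    "sub_category": "Fighter Aircraft",  # optional sub-category within the domain
    "data_fidelity": "cueing quality",
    "specific_requirement": None,        # not yet known this early; stated when known
}

print(known_fields(seeker_requirement))
```

At Milestone B the same record would gain the design-driven specifics (sensor details, model and production-requirement references), which is the "increasing detail in each update" pattern the text describes.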
IMD requirements and related implications to design, performance, and test &
evaluation, will be accounted for and considered throughout the Materiel Solution
Analysis Phase. Relevant questions to consider and actions to take during this phase
include:
Questions:
    •    Has the program been identified for Foreign Military Sales (FMS)? If yes, then
         how will this affect design, development, testing, disclosure and releasability of
         IMD-dependent components?
    •    For each proposed materiel solution identified during the Analysis of Alternatives
         (AoA) process, will the solution require the detection and identification of an
         activity, event, person, material, or equipment? If yes, then for each proposed
         detection or identification method (radar, EO/IR, acoustic, chemical, etc.), assess
         the technical feasibility of acquiring IMD within cost and schedule constraints.
         Consider the quality of available IMD, the IC's capability to deliver IMD, and
         whether the IMD needs to be collected, processed and/or developed.
Actions:
As a program approaches Milestone B (MS B), the LSSP must include mission or
capability specific details and IMD requirements to support program development. For
example, as the design matures, additional details should emerge about the design of
the sensors and the algorithms. The LSSP should also identify any IMD-based models
and intelligence production requirements (PRs) already submitted to a Service
Intelligence Production Center (NASIC, NGIC, ONI, MSIC, etc.), other IMD production
efforts (e.g. lab, warfare research center, or other agency, organization, etc.), and
planned IMD collection events that the program will conduct.
Based on initial IMD requirements defined for Milestone A, refine and add details for the
MS B LSSP during development of the Systems Performance Specification and the
Allocated Baseline. Relevant questions to consider and actions to take during this phase
include:
Questions:
    •    Has the program been identified for Foreign Military Sales (FMS)? If yes, then
         how will this affect design, development, testing, disclosure and releasability of
         IMD-dependent components?
    •    For each proposed detection/identification method (radar, Electro Optical/Infra-
         Red (EO/IR), acoustic, chemical, etc.), does the required IMD (signature, EWIR,
         GEOINT, OOB, C&P) already exist (at the estimated quality needed) or will it
         need to be processed, produced, or collected?
    •    Is the required detection/identification technology sufficiently mature (Technology
         Readiness Level 6 or higher) to proceed into end-item design or Milestone B?
    •    Which IMD-dependent performance requirements need to be verified through
         test and evaluation?
    •    Does the program have IMD requirements derived from Modeling and Simulation
         activities?
    •    Can the estimated IMD processing, production, collection be completed within
         required cost and schedule?
    •    Do the detection/identification algorithms or processes need to be designed to
         accommodate IMD updates?
    •    Is there potential for the detection/identification hardware and software to perform
         IMD collection and provide updates to IMD databases? If yes, has a design study
         been conducted to assess feasibility and cost/benefit analysis?
    •    Have significant IMD-dependent functions been included in the proposed exit
         criteria for the Engineering & Manufacturing Development (EMD) Phase?
    •    Have the program's spectrum requirements taken into account bandwidth needed
         for IMD updates during system operations and sustainment?
    •    Should any IMD data sets be considered as Government-Furnished Equipment
         (GFE) for the EMD Contract?
Actions:
This LSSP will be an update to the previous LSSP. The purpose is to add any new IMD
requirements resulting from design maturity or changes in the Concept of Operations
(CONOP). It should identify the expected IMD production support and concept
necessary for system employment in an operational environment. The LSSP should
include information on IMD data existing within the program (modeling and simulation or
measured physical parameters) for sensor or algorithm development or for testing
purposes, and information on the existence of any blue IMD collected to support the
program. Additionally, the IMD production concept must be defined and coordinated
with the intelligence community. At a minimum this should include identifying the
organizations responsible for producing IMD, addressing the entities responsible for
adversary systems, commercial systems, and U.S. (blue) systems. This information is
required to ensure that
this form of IMD is available through the DoD data sources.
Based on IMD requirements defined in the Milestone B LSSP, refine and add details for
the MS C LSSP during development of the System Functional Specification and the Initial
Product Baseline. Relevant questions to consider and actions to take during this phase
include:
Questions:
    •    Has the program been identified for Foreign Military Sales (FMS)? If yes, then
         how will this affect design, development, testing, disclosure and releasability of
         IMD-dependent components?
    •    For each proposed detection/identification method (radar, EO/IR, acoustic,
         chemical, etc.), has IMD (signature, EWIR, GEOINT, OOB, C&P) required for
         system operations and sustainment been accounted for in the LSSP and
         Acquisition Plan, at the level of quality needed?
    •    Which IMD requirements need to be verified in Follow-on test and evaluation
         (FOT&E)?
Actions:
In preparation for IOC, an LSSP update is required to ensure congruence with the Final
Production Baseline and to fully account for required operational signatures based on
the latest threat assessments and CONOPS for the system. This LSSP also needs to
fully account for IMD sustainment plans including identification of processes and data
sources which are essential for system operations, such as: IMD production processes;
IMD databases; IMD verification and validation for operational use; processes and
systems which support development and dissemination of IMD data loads for
operational missions.
This LSSP requires COCOM coordination and identification of COCOM processes for
updating and fulfilling IMD requirements during operation and sustainment of the
system. Relevant questions to consider and actions to take during this phase include:
Questions
    •    For FMS versions of the system, have IMD-dependent components been verified
         for release and approved by the Designated Disclosure Authority?
    •    Have IMD support requirements been included in the Life-cycle Sustainment
         Plan and the Product Support Package?
    •    Does the current CONOPS for the system drive new or updated IMD
         requirements? Have these new/updated IMD requirements been handed off to
         the COCOM requirements prioritization process?
    •    If the operational system has an IMD reprogramming process, is the
         reprogramming system and organization ready for operations?
Actions
process, and will review and validate the threat input within the JCIDS documents.
Threat sections should not include non-adversarial, natural events as threats to
capabilities or systems.
Information Support Plan (ISP). Per DoDI 4630.8 and CJCSI 3312.01B, DIA/DWO
reviews program generated ISPs during the Intelligence Certification process. A threat
summary or section is not required in the ISP format; however, if one is included, it
should reference the current and applicable CTA or STAR/STA.
Those personnel with a SIPRNET terminal can access the specific procedures and
criteria for the Intelligence Certification on the Intelligence Requirements Certification
Office homepage (under "Certification Process"). By telephone, additional information
may be obtained by calling the Intelligence Requirements Certification Office at 703-
571-9543 (Mr. Vernon Wilson) or 703-571-9541 (Mr. Dana Smith).
DEFENSE ACQUISITION GUIDEBOOK
Chapter 9 - Test and Evaluation (T&E)
9.0. Overview
9.0.1. Purpose
9.0.2. Contents
9.0. Overview
9.0.1. Purpose
This chapter supplements direction and instruction in DoDD 5000.01 and DoDI 5000.02
with processes and procedures for planning and executing an effective and affordable
T&E program in the DoD acquisition model. A rigorous and efficient T&E program
provides early knowledge of developmental and operational issues. Correcting these
issues early enough can mitigate risks of cost overruns and schedule slippages, and
can ultimately contribute to delivery of effective and suitable weapons, information
technology (IT) and National Security Systems (NSS) to the Warfighters in a timely
manner. The principles and practices in this chapter apply to all acquisition programs
regardless of size or cost; however, some aspects focus on acquisition programs whose
interest, cost, size, complexity, or need for interoperability requires oversight by the
Office of the Secretary of Defense (OSD): the programs on the OSD T&E Oversight List.
9.0.2. Contents
Section 9.1 OSD T&E Organization provides a guide to OSD organizations having roles
in accomplishing or overseeing the DoD T&E mission.
Section 9.2 Service-level T&E Management identifies the top level management
structure for the Services and the Major Range and Test Facilities Base (MRTFB).
Section 9.3 Test and Evaluation describes the different types of T&E and test events.
Section 9.4 Integrated Test and Evaluation defines integrated testing and describes how
all areas within T&E use it.
Section 9.5 T&E Planning describes actions needed to develop an Evaluation Plan,
Test and Evaluation Strategy (TES), Test and Evaluation Master Plan (TEMP), and test
plan.
Section 9.6 T&E Reporting describes actions and documentation needed to report T&E
results and evaluations.
Section 9.7 Special Topics addresses T&E programs deviating from the DoDI 5000.02
Defense Acquisition System model (e.g., associated with urgent needs programs,
defense business systems, National Security Systems (NSS), etc.).
Section 9.8 Best Practices presents examples of best practices to improve planning,
execution, and reporting of T&E.
Section 9.9 Prioritizing Use of Government Test Facilities for T&E provides information
on the mandate to use Government test facilities for T&E.
Throughout this chapter, interpret the terms developmental and operational as broad
statements of the types of testing or evaluation, and not as the testing controlled by a
particular organization.
The Director of Operational Test and Evaluation (DOT&E) for operational test and
evaluation (OT&E) and live fire test and evaluation (LFT&E), and the Deputy Assistant
Secretary of Defense for Developmental Test and Evaluation (DASD(DT&E)) within the
office of the Assistant Secretary of Defense for Research and Engineering (ASD(R&E))
in the Office of the Under Secretary of Defense for Acquisition, Technology and
Logistics (USD(AT&L)) provide oversight and policy for T&E of certain acquisition
programs within OSD. The DASD(DT&E) also serves as the Director, Test Resource
Management Center (TRMC) and has responsibility for oversight of DoD T&E
resources and infrastructure. By law, DASD(DT&E) closely coordinates with Deputy
Assistant Secretary of Defense for Systems Engineering (DASD(SE)), and routinely
coordinates with other OSD organizations, such as Cost Assessment and Program
Evaluation (CAPE).
    •    Prescribe policies and procedures for T&E within the DoD
    •    Provide advice and make recommendations to the Secretary of Defense
         (SecDef), Deputy SecDef (DepSecDef), and USD(AT&L), as well as support
         Overarching Integrated Product Teams (OIPTs) and Defense Acquisition
         Boards/Information Technology Acquisition Boards for programs on the OSD
         T&E Oversight List
    •    Develop, in consultation with the DoD Components, the OSD T&E Oversight List
    •    Ensure the adequacy of T&E strategies and plans for programs on the OSD T&E
         Oversight List
    •    Ensure DoD Components do not terminate or substantially reduce participation in
         joint Acquisition Category (ACAT) ID or ACAT IAM programs without
         Requirements Authority review and USD(AT&L) approval
    •    Attend systems engineering technical reviews
    •    Monitor and review DT&E, OT&E, and LFT&E events of oversight programs
    •    Participate in the operational test readiness review (OTRR) process by providing
         recommendations concerning a system's readiness for operational testing
    •    Provide independent performance, schedule, and T&E assessments to the
         Defense Acquisition Executive Summary (DAES) process
    •    Provide representatives to the T&E working-level integrated product team (T&E
         WIPT) for oversight programs to assist program managers (PMs) in developing
         their strategy as well as preparing a TES/TEMP
The DOT&E and the DASD(DT&E), jointly, and in consultation with the DoD Component
T&E executives and other offices as appropriate, publish an annual OSD T&E Oversight
List . DOT&E and the DASD(DT&E) designate programs for DT&E, OT&E, and/or
LFT&E oversight. They consider all programs for inclusion, regardless of ACAT level,
and can add to or delete from the list at any time during the year. OSD considerations
for inclusion on formal T&E oversight include:
    •    ACAT level
    •    Potential for Joint designation
    •    Potential for establishment as an acquisition program (such as Technology
         Projects identified in Enclosure 3 of DoDI 5000.02 or a pre-Major Defense
         Acquisition Program (MDAP))
    •    Stage of development or production
    •    Potential for DAES reporting
    •    Congressional and/or DoD interest
    •    Programmatic risk (cost, schedule, or performance)
    •    Past programmatic history of the developmental command
    •    Relationship with other systems as part of a system-of-systems (SoS)
    •    Technical complexity of system
The DOT&E, a Principal Staff Assistant and advisor to the Secretary of Defense, has
specific responsibilities as identified in DoDD 5141.02, "Director of Operational Test
and Evaluation," dated February 2, 2009. Sections 139 and 2399 of title 10 USC
prescribe the DOT&E duties for OT&E, and section 2366 of title 10 USC those for
LFT&E. For additional information on the DOT&E office, visit the DOT&E website. For
purposes here, DOT&E:
    •    Prescribes policies and procedures for the conduct of OT&E and LFT&E for DoD.
    •    Assesses the adequacy of OT&E and LFT&E performed by the Services and
         operational test agencies (OTAs) for programs on the OSD T&E Oversight List,
         both to advise the USD(AT&L) and to report to the SecDef and Congress on
         effectiveness and suitability.
    •    Advises the DoD Executive Agent for Space and the acquiring Military
         Department on T&E of DoD Space MDAPs and other space programs
         designated for T&E oversight, in support of DoDD 3100.10 Space Policy, dated
         July 9, 1999.
    •    Manages:
             o The efforts to improve interoperability and information assurance (IA)
                 through the operational evaluation of the systems under oversight and
                 major exercises conducted by the Combatant Commands and the Military
                 Departments.
             o The Joint Test and Evaluation (JT&E) Program.
             o The Joint Live Fire Program.
             o The Center for Countermeasures.
             o The activities of the Joint Aircraft Survivability Program.
             o The activities of the Joint Technical Coordinating Group for Munitions
                 Effectiveness and the production of the Joint Munitions Effectiveness Manual.
             o The activities of the T&E Threat Resource Activity.
    •    Provides support to the Director, Joint Improvised Explosive Device Defeat
         Organization (JIEDDO), consistent with DoDD 2000.19E, Joint Improvised
         Explosive Device Defeat Organization (JIEDDO), dated February 14, 2006.
    •    Assists the Chairman of the Joint Chiefs of Staff (CJCS) in efforts to ensure the
         Joint Capabilities Integration and Development System (JCIDS) documents, in
         support of CJCS Instruction 3170.01, Joint Capabilities Integration and
         Development System, dated March 1, 2009, provide the expected joint
         operational mission environment, mission-level measures of effectiveness
         (MOEs), and key performance parameters (KPPs) in terms verifiable through
         testing or analysis.
    •    Oversees and assesses operational capability demonstrations conducted by the
         Missile Defense Agency, consistent with DoDD 5134.09 Missile Defense Agency
         (MDA), dated September 17, 2009.
    •    Establishes policy on the verification, validation, and accreditation (VV&A) of
         models and simulations used in support of OT&E and LFT&E.
    •    Oversees the International T&E (IT&E) program for the SecDef.
    •    Oversees and prescribes policy, as appropriate, to ensure adequate usage and
         verification of protection of human subjects and adherence to ethical standards in
         OT&E and LFT&E, in support of DoDD 3216.02, Protection of Human Subjects
         and Adherence to Ethical Standards in DoD-Supported Research, dated
         November 8, 2011.
As an advisor to the USD(AT&L) for DT&E through ASD(R&E), the DASD(DT&E) has
responsibilities and duties as prescribed in section 139b of title 10 USC . For additional
information on DASD(DT&E), visit the ODASD(DT&E) website. For purposes here, the
DASD(DT&E):
adequate testing in support of development, acquisition, fielding, and sustainment of
defense systems; and maintain awareness of other T&E facilities and resources, within
and outside the Department, and their impacts on DoD requirements. The above
directive also provides the specific responsibilities of the TRMC.
Ultimately, management responsibility for an acquisition program's T&E resides with the
PM. However, the planning, executing, and reporting of T&E involves interactions,
support, and oversight from other organizations within OSD, the Services, Defense
Agencies, and in some cases, other government agencies, as well as the system
contractor(s). The PM charters a T&E WIPT early in the acquisition model to support
development of test strategies and estimates of resource requirements, strengthening
the overall input to the program's integrated product team (IPT). For additional
information, consult Rules of the Road: A Guide for Leading a Successful Integrated
Product Team, October 1999.
The PM, in concert with the user and the T&E community, coordinates DT&E,
The BOD Executive Secretariat (BOD(ES)) acts as the agent for the Service Vice Chiefs
and equivalent OUSD and Defense Agency representatives with T&E management
responsibilities. It consists of the Service T&E principals and equivalent OUSD and
Defense Agency representatives with T&E infrastructure management responsibilities.
The BOD(ES):
         maximize commonality, interoperability, and effective utilization of products and
         services in support of the T&E infrastructure).
    •    Approves joint T&E requirements and recommends solutions from the needs and
         solutions process for Central T&E Investment Program (CTEIP) consideration.
    •    Serves as the T&E representatives on the OSD-chartered Defense Test and
         Training Steering Group (DTTSG).
The DISA T&E Executive serves as the Test, Evaluation, and Certification (TE&C)
subject matter expert and Special Advisor to the Director, DISA, and DISA Senior
Executive Leadership. The DISA T&E Executive duties and responsibilities include:
         as the track manager for the DAWIA T&E component.
9.2.3.2. Assistant Deputy Under Secretary of the Army for Test & Evaluation
(ADUSA(T&E))
Within the Army, the T&E Executive is the Director, T&E Office under the authority,
direction, and control of the Deputy Under Secretary of the Army. Key Army T&E
Executive duties and responsibilities include:
    •    Serving as the senior advisor to the Secretary of the Army and the Chief of Staff,
         Army, on all Army T&E matters.
    •    Advising the Army Systems Acquisition Review Council (ASARC), the Army
         Requirements Oversight Council (AROC), and OIPTs on T&E matters.
    •    Approving test-related documentation for the Secretary of the Army and
         forwarding it, as appropriate, to OSD.
    •    Coordinating T&E matters with the Joint Staff and OSD, to include serving as
         principal Army interface on matters of T&E with the USD(AT&L) and DOT&E.
    •    Overseeing all Army T&E missions and functions, to include formulating
         overarching Army T&E strategy, policy, and program direction, providing policy
         oversight, and managing resources.
    •    Providing HQDA oversight on the funding of the Army Threat Simulator Program,
         Army Targets Program, and Army Instrumentation Program; and coordinating
         with the Project Manager for Instrumentation, Targets, and Threat Simulators
         (PM ITTS).
    •    Overseeing Army responsibilities in Joint T&E, Foreign Comparative Testing
         (FCT), and multi-Service and multinational T&E acquisition programs.
    •    Serving as the Acquisition Workforce Functional Chief for the T&E acquisition
         workforce Career Field.
The Air Force T&E Executive is the Director, Air Force Test and Evaluation (AF/TE),
who serves under the authority and direction of the Secretary of the Air Force
(SECAF) and the Chief of Staff of the Air Force (CSAF). In this capacity, the AF/TE:
    •    Functions as the sole focal point for Air Force T&E policy, guidance, direction,
         and oversight for the formulation, review, and execution of T&E plans, programs,
         and budgets.
    •    Functions as the chief T&E advisor to senior Air Force leadership on T&E
         processes; DT&E, including contractor testing and LFT&E; OT&E; and the use of
         M&S in T&E.
    •    Functions as the final T&E review authority and signatory for TEMPs prior to CAE
         and OSD approval and signature.
    •    Collaborates with requirements sponsors and system developers to improve
         operational requirements, system development, and the fielding of operationally
         effective, suitable, safe, and survivable systems.
    •    Reviews and/or prepares T&E information for timely release to OSD, Congress,
         and decision makers.
    •    Oversees the Air Force T&E infrastructure by determining the adequacy of T&E
         resources required to support system acquisition activities. Administers various
         T&E resource processes and chairs or serves on various committees, boards,
         and groups supporting T&E activities.
    •    Acts as the single point of entry for the Air Force Foreign Materiel Program.
    •    Manages the Air Force Joint Test & Evaluation Program according to DoDI
         5010.41 Joint Test and Evaluation (JT&E) Program, dated September 12, 2005.
    •    Functions as the certifying authority for T&E personnel for T&E Level 3 in the
         Acquisition Professional Development Program (APDP) when not delegated to
         the Major Commands (MAJCOMs).
9.2.3.4. Department of the Navy Test & Evaluation Executive (OPNAV N091)
The Director, Test and Evaluation and Technology Requirements (OPNAV N091)
serves as the Department of the Navy (DON) T&E Executive. The DON T&E Executive
reports to the Chief of Naval Operations (CNO), the Commandant of the Marine Corps
(CMC), and the Principal Military Deputy to the Assistant Secretary of the Navy for
Research, Development, and Acquisition (PMD ASN(RDA)) on all matters pertaining
to test and evaluation.
The DON T&E Executive supports and advises the Vice Chief of Naval Operations
(VCNO) regarding the VCNO's role on the T&E BOD and serves as the Navy
representative on the T&E BOD Executive Secretariat. The DON T&E Executive also:
    •    Approves all Navy Test and Evaluation Master Plans for CNO.
    •    Establishes Navy T&E requirements and promulgates policy, regulation, and
         procedures governing Navy T&E.
    •    Acts for CNO in resolving T&E requirements.
Statute and policy prescribe the management of DT by the DASD(DT&E), who, for all
programs on DT oversight, acts as the final approval authority for DT planning in the
TEMP. ODASD(DT&E) staff representatives actively participate in acquisition program
T&E WIPTs and provide advice to the T&E WIPT and PM, as well as providing
independent assessments to DASD(DT&E) on the progress of the test program and the
overall performance of the system. By statute, the DASD(DT&E) has access to all test
data and program information relevant to the execution of testing and
fulfillment of the ODASD(DT&E) responsibilities. As a member of the OIPT, the
DASD(DT&E) provides advice and recommendations at Defense Acquisition Board
(DAB) reviews, and submits an independent Assessment for Operational Test
Readiness (AOTR) to the Component Acquisition Executive (CAE) and USD(AT&L) for
all programs on DT oversight prior to the CAE decision on materiel readiness for initial
operational test and evaluation (IOT&E).
The PM should initiate early engagement with the ODASD(DT&E) and charter a T&E
WIPT to aid in developing test strategies and building a TEMP. Because DT spans the
entire life cycle of an acquisition program and remains a vital part of all levels in the
work structure of the systems engineering process, DASD(DT&E) expects due
diligence from PMs to ensure they base program and design decisions on test results
conducted and reported as independent verification steps in the process, not simply
pulled from design and test learning processes. This effort requires close and
continuous coordination with the SEP, the Information Support Plan (ISP), and
developing-activity engineering and test activities to ensure test plans and reports
reflect evaluation of the test data independent of the engineering staff vested in the
development activities.
Ideally, the PM bases all development decisions on test events rather than schedules or
costs; but in the pragmatic environment of developing systems for the Warfighter, time
and cost are significant drivers that pressure test activities. Therefore, DT activities
must provide realistic T&E schedules to PMs during the establishment of the program's
integrated management schedule. This effort ensures the effective management of the
overall progress and cost of the program, particularly with complex systems that have a
number of dependent subsystems and technologies requiring efficient integration into
an end product.
The DASD(DT&E):
    •    Develops policies and guidance for the planning, execution, and reporting of
         DT&E in the DoD, according to section 139b of title 10 USC .
    •    Develops policies and guidance for the integration of DT and OT, in coordination
         with DOT&E.
    •    Publishes, in conjunction with DOT&E, a combined list of OSD T&E Oversight
         programs for DT&E, OT&E, and LFT&E.
    •    Monitors and reviews the DT&E activities of MDAPs and other programs.
    •    Periodically conducts AOTRs.
    •    Provides advocacy, oversight, and guidance to the acquisition workforce
         responsible for test and evaluation.
    •    Reviews and approves TES/TEMPs and selected DT&E plans.
    •    Periodically reviews the Services' organizational DT&E capabilities to identify
         needed changes or improvements.
By law, DOT&E prescribes policies and procedures for the conduct of OT&E in the
Department of Defense. For programs on DOT&E OT oversight, DOT&E serves as the
final approval authority for OT&E planning, to include approval of the TEMP. DOT&E
staff representatives actively participate in acquisition program T&E WIPTs and
provide advice to the T&E WIPT and PM, as well as providing independent
assessments to the DOT&E on the progress of the test program and the overall
performance of the system. By law, DOT&E has access to all data and records it
considers necessary to review in fulfillment of its OT&E responsibilities. DOT&E
serves as a member of both the Joint Requirements Oversight Council and the OIPT,
providing advice and recommendations at DAB reviews; and has direct access to both
USD(AT&L) and the SecDef, on all matters relating to operational test and evaluation.
The PM should initiate early engagement with DOT&E through the Service and Defense
Agency T&E Executive and the independent OTA, and charter a T&E WIPT to aid in
development of T&E strategies and the TEMP. Since OT&E generally acts as the
validation process in SE, early engagement of the OTA and DOT&E, as early as the
Analysis of Alternatives and requirements development, ensures a comprehensive
assessment of the measurability and testability of requirements, and of the associated
cost and schedule implications of effectively evaluating the system's capabilities and
limitations. This requires close and continuous coordination with users, sponsors,
developers, and all test activities to ensure understanding and articulation of end-game
expectations during program planning and documentation.
Per section 2399 of title 10 USC, an MDAP must complete IOT&E before proceeding
beyond low-rate initial production (LRIP). Law also requires DOT&E to provide a
Beyond Low-Rate Initial Production (BLRIP) report to the SecDef, USD(AT&L), and
congressional defense committees on the adequacy of the OT&E conducted, as well as
the results of T&E to confirm effectiveness and suitability for combat. Additionally,
DoDI 5000.02 charges DOT&E with completing the section 2366 of title 10 USC LFT&E
report requirement for submission to the congressional defense committees, SecDef,
and USD(AT&L) before the system may proceed to full-rate production (FRP). For
purposes of compliance with completion of IOT&E, the PM must ensure the system
under test reflects a production-configured or production-representative system,
preferably an LRIP system. Title 10 requires DOT&E to determine the number of LRIP
systems for all operational testing of programs on DOT&E's OT&E oversight, and the
Service OTA to determine LRIP requirements for non-OSD T&E oversight programs.
DOT&E and the OTAs routinely engage the PM in those decisions. For programs not on
the OSD T&E Oversight List, the Service or Defense Agency OTA works with the PM
on OT&E, including planning, applicable oversight, execution, and reporting. Service or
Defense Agency OTAs may delegate these responsibilities to other responsible DoD
test agencies.
DOT&E approves all OT&E plans, to include early operational assessments (EOAs),
OAs, Limited User Tests (LUTs), IOT&E, and Follow-on Operational Test & Evaluation
(FOT&E). DOT&E requires the OTAs to provide plans to assess the adequacy of data
collection and analysis planning to support the operational evaluation of a system's
operational effectiveness and operational suitability, since integrated test concepts aid
in generating test efficiencies and reduced development time. OTAs must schedule test
concept briefings 180 days prior to an operational test. PMs must provide OT&E plans
for DOT&E approval 60 days prior to test events.
In addition to OT&E oversight, the SecDef charges DOT&E with approving waivers to
full-up system-level (FUSL) LFT&E and approving required alternative LFT&E plans
prior to Milestone B.
For programs to track effectively through the complex acquisition process and meet
their cost, schedule, and performance goals, it is essential to engage OSD early and
continuously, and to quickly resolve working issues presenting obstacles to any of the
T&E stakeholders' duties. Service T&E Executives must establish clear issue resolution
processes to resolve issues in a timely fashion.
The DOT&E:
    •    Prescribes OT&E and LFT&E policies for the DoD according to sections 139,
         2366, 2399, and 2400 of title 10; and DoDD 5141.02, Director of Operational
         Test and Evaluation (DOT&E), dated February 2, 2009.
    •    Exercises oversight responsibility for ACAT I or other programs in which the
         SecDef has special interest. Monitors and reviews OT and LF activities in the
         DoD.
    •    Participates in integrated test teams and test integrated product teams to foster
         program success.
    •    Publishes, in conjunction with the DASD(DT&E), a combined list of OSD T&E
         Oversight programs for DT, OT, and LF.
    •    Approves, in writing, the adequacy of operational test plans for those programs
         on OSD OT&E Oversight prior to the commencement of operational testing.
         Approves the operational test portions of integrated test plans. Approves the
         quantity of test articles required for operational testing of major defense
         acquisition programs (MDAP).
    •    Approves TEMP and T&E strategies for OSD T&E Oversight programs in
         conjunction with the DASD(DT&E) and DoD Chief Information Officer (CIO).
    •    Approves LFT&E strategies and waivers prior to commencement of LFT&E
         activities.
    •    Submits a BLRIP report to the SecDef and Congress before systems on OSD
         OT&E Oversight may proceed beyond LRIP.
The DoD, through the TRMC, oversees sustainment of twenty-four T&E organizations
or activities with a skilled workforce and T&E technical capabilities and processes,
available to all components under a common charge policy. In accordance with DoDD
3200.11, Major Range and Test Facility Base (MRTFB), dated December 27, 2007, and
DoDI 3200.18, Management and Operation of the Major Range and Test Facility Base
(MRTFB), dated February 1, 2010, TRMC manages the following activities:
ARMY ACTIVITIES
U.S. Army Kwajalein Atoll (Ronald Reagan Ballistic Missile Defense Test Site)
NAVY ACTIVITIES
Utah Test and Training Range
DEFENSE-WIDE ACTIVITIES
Joint Interoperability Test Command, to include capabilities at Indian Head, MD, and
Fort Huachuca, AZ
DoD employs three formal types of T&E (directed by statute) in the acquisition of
weapon systems, business systems, national security systems (NSS), and joint systems
administered by OSD:
DT&E, OT&E, and LFT&E. The TRMC, also directed by statute, oversees the MRTFB to
ensure availability of capabilities to support the three T&E types. Within these broad
categories, the military departments and Defense Agencies have their own directives,
guidance, organizations, T&E resources, ranges, and facilities specific to their needs.
This section provides distinguishing features of each type.
Programs conduct DT&E throughout the system's life cycle, from program initiation
through system sustainment, to reduce design and programmatic risks and provide
assessments. DT&E can occur as contractor testing, government testing, or a mix of
both. As such, DT&E:
    •    Supports progress toward and final characterization of the system's readiness
         for dedicated IOT&E via the AOTR process and document.
    •    Characterizes system functionality and provides information for cost,
         performance, and schedule tradeoffs.
    •    Assesses system specification compliance.
    •    Reports progress to plan for Reliability Growth and characterizes reliability and
         maintainability.
    •    Identifies system capabilities, limitations, and deficiencies.
    •    Assesses system safety.
    •    Assesses compatibility with legacy systems.
    •    Stresses the system within an intended mission environment.
    •    Supports the joint interoperability certification process and achieves information
         assurance certification and accreditation.
    •    Documents achievement of contractual technical performance and verifies
         incremental improvements and system corrective actions.
Evaluation in the context of DT&E refers to evaluating the generated performance data
to ensure the data appropriately depict the performance of the item as tested under the
conditions of the test.
Service and Defense Agency OTAs have a responsibility for OT&E. OT&E determines
the operational effectiveness and operational suitability of a system under realistic
operational conditions, including joint combat operations; determines the satisfaction of
thresholds in the approved JCIDS documents and critical operational issues; assesses
impacts to combat operations; and provides additional information on the system's
operational capabilities.
OTAs have a responsibility for early involvement in a system's acquisition; for example,
EOAs during the Technology Development (TD) phase, OAs during the engineering and
manufacturing development (EMD) phase, and review of Capabilities Documents to
assess the measurability, testability, and operational relevancy of requirements in the
JCIDS documents (that is, the Capability Development Document (CDD) and Capability
Production Document (CPD)). OTAs also have responsibility for the assessment and
evaluation of a system's operational effectiveness, operational suitability, and
survivability or operational security completed in IOT&E and, when necessary,
Follow-on Operational Test and Evaluation (FOT&E).
    •    For dedicated OT&E, typical users operate and maintain the system under test
         conditions simulating combat and peacetime operations.
    •    OT&E uses threat or threat representative forces, targets, and threat
         countermeasures, validated by the Defense Intelligence Agency (DIA) or the DoD
         Component intelligence agency, as appropriate, and approved by DOT&E during
         the test plan approval process.
    •    OT&E includes IA testing and evaluation for all weapon, information, and C4ISR
         programs that depend on external information sources or provide information to
         other DoD systems.
    •    Persons employed by the contractor for the system under development may only
         participate in the OT&E of MDAPs to the extent the PM planned for their
         involvement in the operation, maintenance, and other support of the system
         when deployed in combat.
    •    OT&E tests production-representative systems: systems that accurately
         represent the final configuration, using mature and stable hardware and
         software that mirror the production configuration but were not produced on a
         final production line (although production tooling may account for some
         components).
For weapon systems, integrate LFT&E of system lethality into the evaluation of weapon
system effectiveness. For example, operational testing could identify likely shot lines, hit
points, burst points, or miss distances providing a context for LFT&E lethality
assessments. Fuse performance, as determined under DT&E, can provide information
for both OT&E and LFT&E assessments.
Operational suitability is the degree to which a system can be satisfactorily placed in field
use, with consideration given to reliability, availability, compatibility, transportability,
interoperability, wartime usage rates, maintainability, safety, human factors, manpower
supportability, logistics supportability, documentation, environmental effects, and
training requirements.
Early planning for the operational suitability evaluation should include any special needs
for the number of operating hours, environmental testing, maintenance demonstrations,
testing profiles, usability of DT&E data, or other unique test requirements.
Integrate DT&E, OT&E, and LFT&E strategies to ensure the consistent assessment of
the full spectrum of system survivability or operational security. The COIs should include
any issues that need to be addressed in the OT&E evaluation of survivability or operational
security. Systems under LFT&E oversight must address personnel survivability
(reference section 2366 of title 10 USC ) and integrate it into the overall system
evaluation of survivability or operational security conducted under OT&E.
Generally, LFT&E addresses vulnerability while OT&E addresses susceptibility, but areas
of overlap exist. The evaluation of LFT&E results requires realistic hit distributions. The
OT&E evaluation of susceptibility might identify realistic hit distributions of likely threats,
hit/burst points, and representative shot lines providing a context for LFT&E vulnerability
assessments. DT&E and OT&E testing of susceptibility may provide other LFT&E
insights, such as information on signatures, employment of countermeasures, and
tactics used for evasion of threat weapons. Similarly, LFT&E tests, such as Total Ship
Survivability trials, may provide OT&E evaluators with demonstrations of operability and
suitability in a combat environment.
damage assessment and repair, crashworthiness, crew escape, and rescue capabilities.
Real-time casualty assessments (RTCA) are conducted during IOT&E to ensure that the
assumptions supporting the RTCA remain consistent with LFT&E results.
Structure and schedule the LFT&E Strategy to incorporate any design changes resulting
from testing and analysis before proceeding beyond LRIP.
9.3.3.2. Covered Systems
The DoD term "covered system" includes all categories of systems or programs
requiring LFT&E. A covered system is a system that DOT&E, acting for the
SecDef, designates for LFT&E oversight. These systems include, but are not limited to,
the following categories:
    •    Any major system, within the meaning of that term in section 2302(5) of title 10
         USC , that is user-occupied and designed to provide some degree of
         protection to its occupants in combat; or
    •    A conventional munitions program or missile program; or a conventional
         munitions program planning to acquire more than 1,000,000 rounds (regardless
         of major system status); or
    •    A modification to a covered system likely to significantly affect the survivability or
         lethality of such a system.
DOT&E approves the adequacy of the LFT&E Strategy before the program begins
LFT&E. LFT&E issues identified in the strategy should drive the program and should be
fully integrated with planned DT&E and OT&E. LFT&E typically includes testing at the
component, subassembly, and subsystem level; and may also draw upon design
analyses, modeling and simulation, combat data, and related sources such as analyses
of safety and mishap data. As a standard practice, this occurs regardless of whether the
LFT&E program culminates with FUSL testing or obtains a waiver from FUSL testing.
Conducting LFT&E early in the program life cycle allows time to correct any design
deficiency demonstrated by the T&E. Where appropriate, the program manager may
correct the design or recommend adjusting the employment of the covered system
before proceeding beyond LRIP.
DoD defines "full-up, system-level testing" as testing that fully satisfies the statutory
requirement for "realistic survivability" or "realistic lethality testing," as defined in section
2366 of title 10 USC . The criteria for FUSL testing differ somewhat based on the type
of testing: survivability or operational security or lethality. The following describes FUSL
testing:
designed to produce.
The statute requires a LFT&E program to include FUSL testing unless granted a waiver
in accordance with procedures defined by the statute. To request a waiver, submit a
waiver package to the appropriate Congressional defense committees prior to Milestone
B; or, in the case of a system or program initiated at Milestone B, as soon as practicable
after Milestone B; or if initiated at Milestone C, as soon as practicable after Milestone C.
Typically, this should occur at the time of TEMP approval.
The waiver package includes certification by the USD(AT&L) or the DoD CAE that
FUSL testing would prove unreasonably expensive and impractical. It also includes a
DOT&E-approved alternative plan for conducting LFT&E in the absence of FUSL
testing. Typically, the alternative plan appears similar or identical to the LFT&E Strategy
contained in the TEMP. This alternative plan should include LFT&E of components,
subassemblies, or subsystems; and, as appropriate, additional design analyses, M&S,
and combat data analyses.
Programs receiving a waiver from FUSL testing conduct their plans as LFT&E programs
(with the exception of the statutory requirement for FUSL testing). In particular, the TEMP
contains an LFT&E Strategy approved by DOT&E; and DOT&E, as delegated by the
SecDef, submits an independent assessment report on the completed LFT&E to the
Congressional committees as required by statute.
According to OSD Memorandum Definition of Integrated Testing , dated April 25, 2008,
OSD defines integrated testing as the collaborative planning and collaborative execution
of test phases and events to provide shared data in support of independent analysis,
evaluation, and reporting by all stakeholders, particularly the development (both
contractor and government) and operational test and evaluation communities.
planning of test events; where a single test point or mission can provide data to satisfy
multiple objectives, without compromising the test objectives of participating test
organizations. A test point, in this context, means a test condition, denoted by time, three-
dimensional location and energy state, and system operating configuration, at which a
pre-planned test technique is applied to the system under test and the response(s) are
observed and recorded.
Integrated testing includes more than just concurrent or combined DT and OT, where
both DT and OT test points remain interleaved on the same mission or schedule.
Integrated testing focuses the entire test program (contractor test, Government DT, OT,
and LFT) on designing, developing, and producing a comprehensive plan that
coordinates all test activities to support evaluation results for decision makers at
required decision reviews.
Integrated testing may include all types of test activities such as contractor testing,
developmental and operational testing, interoperability and IA testing, and certification
testing. All testing types, regardless of source, should receive consideration,
including tests from other Services for multi-Service programs. Software-intensive and
IT systems should use the reciprocity principle as much as possible, i.e., "Test by one,
use by all." Specifically name any required integrated test combinations.
For successful integrated testing, understanding and maintaining the pedigree of the
data proves vital. The pedigree of the data refers to accurately documenting the
configuration of the test asset and the actual test conditions under which each element
of test data was obtained. The pedigree of the data should indicate whether the test
configuration represented operationally realistic or representative conditions. The T&E
WIPT plays an important role in maintaining the data pedigree within the integrated test
process for a program. The T&E WIPT establishes agreements between the test
program stakeholders regarding roles and responsibilities, not only in implementing the
integrated test process, but also in developing and maintaining data release
procedures, and data access procedures or a data repository, where all stakeholders
will have access to test data for separate evaluations.
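The data-pedigree record described above can be sketched as a simple structure. This is a minimal illustration, not a DoD schema; the field names are assumptions chosen to mirror the elements the text calls for (asset configuration, test conditions, and operational realism):

```python
from dataclasses import dataclass

@dataclass
class TestDataPedigree:
    """Illustrative record tying an element of test data to its origin.

    Field names are hypothetical; tailor them to the data-release and
    data-access agreements established by the program's T&E WIPT.
    """
    test_event: str                 # event or mission identifier
    asset_configuration: str        # hardware/software build of the test asset
    test_conditions: str            # environment, threat, and scenario notes
    operationally_realistic: bool   # did conditions represent operational use?
    source: str                     # e.g., contractor test, government DT, OT

# Example: one record for a single government DT mission
record = TestDataPedigree(
    test_event="DT-Mission-042",
    asset_configuration="EMD build 3.1, production-representative radar",
    test_conditions="Desert environment, validated threat surrogate",
    operationally_realistic=True,
    source="government DT",
)
```

A shared repository keyed on records like this lets each stakeholder judge, for its own independent evaluation, whether a given data element was obtained under operationally representative conditions.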
Integrated testing must provide shared data in support of independent analyses for all
T&E stakeholders. A requirement exists for a common T&E database, including
descriptions of the test environments to ensure commonality and usability by other
testers. Integrated testing must allow for and support separate, independent OT&E
according to section 2399 of title 10 USC and DoDI 5000.02 , Operation of the Defense
Acquisition System, dated December 8, 2008. It does not include the earliest
engineering design or testing of early prototype components.
Integrated testing serves as a concept for test design, not a new type of T&E. Programs
must intentionally design it into the earliest program strategies, plans, documentation,
and test plans, preferably starting before Milestone A. Developing and adopting
integrated testing strategies early in the process increases the opportunities and
benefits. If done correctly, the enhanced operational realism in DT&E provides greater
opportunity for early identification of system design improvements, and may even
change the course of system development during EMD. Integrated testing can increase
the statistical confidence and power of all T&E activities. Most obviously, integrated
testing can also reduce the number of T&E resources needed in OT&E. However,
integrated testing does not replace or eliminate the need for dedicated IOT&E, as
required by section 2399 of title 10 USC , "Operational Test and Evaluation of Defense
Acquisition Programs" and DoDI 5000.02 .
The T&E strategy should embed integrated testing, although most of the effort takes
place during the detailed planning and execution phases of a test program. It is critical
that all stakeholders understand the required evaluations to assess risks, assess
maturity of the system and assess the operational effectiveness, operational suitability
and survivability or operational security/lethality. Up front, define the end state for
evaluation, ensuring all stakeholders work toward the same goal. Once accomplished,
develop an integrated test program that generates the data required to conduct the
evaluations.
Early identification of system and mission elements enables the development and
execution of an efficient and effective T&E strategy and an integrated DT/OT program.
The use of scientific and statistical principles for test and evaluation; for example,
design of experiments (DOE), will help develop an integrated DT/OT program by
providing confidence about the performance of a system in a mission context.
Although DT and OT require different fidelity to meet their individual objectives (e.g.,
data parameters, mission control, onboard and test range instrumentation, data
collection and analysis), some areas of commonality include:
continuum. They focus on the early discovery of problems in a mission context and in
realistic operational environments even for component testing. The appropriate T&E
environment includes the system under test (SUT) and any interrelated systems (that is,
its planned or expected environment in terms of weapons, sensors, command and
control, and platforms, as appropriate) needed to accomplish an end-to-end mission in
combat. The following includes a few integrated test concerns:
    1. Balancing the test event to effectively capture different DT and OT data collection
       objectives
    2. Requiring early investment in detailed planning that many programs lack in early
       stages
    3. Requiring constant planning and updates to effectively maximize test results
    4. Much of the early information for a program is preliminary, requiring rework and
       updates
    5. Analysis proves difficult when unanticipated anomalies appear in test results
T&E planning should include statistically defensible test results to effectively support
decision makers. DOE, a common approach, serves as a structured process to assist in
developing T&E strategies utilizing statistical analyses. Many constraints exist in testing:
limited test resources, limited test time, and limited test articles. DOE aids in the
understanding of the tradeoffs among these constraints and their implications.
Additionally, DOE can provide a statistically optimum allocation of assets under given
constraints. It can also optimally allocate test points between multiple phases of
testing. DOE ensures that the data collected in multiple phases contribute
synergistically to sequential learning about the system.
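As a simple illustration of how DOE structures a test matrix, the sketch below enumerates a full-factorial design for a few hypothetical operational factors and shows how the run count grows with the number of factors and levels. The factor names and levels are assumptions for illustration only; a real program's T&E WIPT would derive them from the system's operational envelope:

```python
from itertools import product

# Hypothetical operational and environmental factors with their levels.
factors = {
    "threat": ["baseline", "advanced"],
    "environment": ["day", "night"],
    "operator_experience": ["novice", "experienced"],
}

# A full-factorial design tests every combination of factor levels,
# so the number of test points is the product of the level counts.
design = [dict(zip(factors, combo)) for combo in product(*factors.values())]

print(len(design))  # 2 x 2 x 2 = 8 test points
print(design[0])    # first test point: baseline threat, day, novice operator
```

When the full factorial exceeds available test resources, fractional-factorial or optimal designs trade coverage for run count while preserving statistical defensibility, which is precisely the tradeoff analysis the text attributes to DOE.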
A program applying DOE should start early in the acquisition process and assemble a
team of subject matter experts who can identify operational and environmental
conditions (the driving factors in the successful performance of the system and the
consideration of levels of each factor). The team should include representation for all
testing (contractor testing, Government DT and OT). The developed TEMP should
include the resources needed, the plan for early tests (including component tests), and
use of the results of early tests to plan further testing.
A well planned and executed DT&E program supports the technology development and
acquisition strategies as well as the systems engineering process; providing the
information necessary for informed decision-making throughout the development
process and at each acquisition milestone. DT&E provides the verification and
validation (V&V) of the systems engineering process as well as confidence that the
system design solution satisfies the desired capabilities. The strategy for T&E should
remain consistent with and complementary to the SEP and acquisition strategy. The
T&E WIPT, working closely with the PM and the system design team, facilitates this
process. Rigorous component and sub-system DT&E enables early performance and
reliability assessments for utilization in system design. DT&E and integrated testing
events should advance to rigorous, system-level and system-of-systems (SoS) level
T&E, ensuring the system matures to a point where it can enter production, and
ultimately meet operational employment requirements.
DT&E reduces technical risk and increases the probability of a successful program.
During early DT&E, the prime contractor focuses contractor testing on technical contract
specifications. Government testers observe the critical contractor testing, conduct
additional T&E, and, when practical, facilitate early user involvement. The PM's contract
with industry must support open communication between government and contractor
testers. The OSD document, "Incorporating Test and Evaluation into Department of
Defense Acquisition Contracts," dated October 2011, provides additional guidance on
contract-related issues for the successful solicitation, award, and execution of T&E
related aspects of acquisition contracts. Items such as commercial-off-the-shelf, non-
developmental items, and Government-off-the-shelf products, regardless of the manner
of procurement, must undergo DT&E to verify readiness to enter IOT&E, for proper
evaluation of operational effectiveness, operational suitability, and survivability or
operational security for the intended military application. Programs should not enter
IOT&E until the DoD Components indicate confidence that the production
representative system will successfully demonstrate the effectiveness, suitability, and survivability
criteria established in the capability production document (CPD). In addition, the
government will report DT&E results at each program milestone, providing knowledge to
reduce the risk in those acquisition decisions.
DoDI 5000.02 Enclosure 6 lists mandatory elements of OT&E planning and execution.
Other considerations include:
         inclusion/exclusion of test data for use during OAs and determine the
         requirement for any additional operational testing needed for evaluation of
         operational effectiveness, operational suitability and mission capability.
    •    OT&E uses threat or threat representative forces, targets, and threat
         countermeasures, validated by the DIA or the DoD Component intelligence
         agency, as appropriate, and approved by DOT&E during the operational test plan
         approval process. DOT&E oversees threat target, threat simulator, and threat
         simulation acquisitions and validation to meet OT&E and LFT&E needs.
    •    PMs and OTAs assess the reliability growth required for the system to achieve its
         reliability threshold during IOT&E and report the results of that assessment to the
         MDA at Milestone C.
An integral element of the Defense Acquisition System ( DoDI 5000.02 ), T&E has a role
across the entire lifecycle as depicted in the following Figure 9.5.3.F1. The Integrated
Defense Acquisition, Technology, and Logistics Life Cycle Management System Chart
(v5.3.4, 15 Jun 2009) outlines the key activities in the systems acquisition processes
that must work in concert to deliver the capabilities required by the warfighters: the
requirements process (JCIDS); the acquisition process (Defense Acquisition System);
and program and budget development (Planning, Programming, Budgeting, and
Execution (PPBE) process).
Figure 9.5.3.F1: Key T&E Processes across the Lifecycle T&E Perspective
Key sources of T&E information, used during the formulation of a Materiel Solution,
include the capabilities-based assessment (CBA), Analysis of Alternatives (AOA),
JCIDS documents, etc. Items of particular interest to the T&E community include:
         performance attributes and measures.
JCIDS processes are currently undergoing a significant revision, with the expectation of
releasing the new policy in late FY 2011. The current JCIDS process has evolved from
a joint mission-based process, focused on evaluating MOEs and MOPs in a mission
context to deliver a capability, to an operational-environments-based process focused on
evaluating system performance attributes to deliver a required capability, as seen in the
excerpt from the current JCIDS policy below:
    •    If the system does not meet all of the threshold levels for the KPPs, the Joint
         Requirements Oversight Council (JROC) will assess whether or not the system
         remains operationally acceptable.
    •    The CDD and CPD identify the attributes contributing most significantly to the
         desired operational capability in threshold-objective format. Whenever possible,
         state attributes in terms reflecting the range of military operations the capabilities
         must support and the joint operational environment intended for the system
         (family of systems (FoS) or SoS).
    •    Other compatibility and interoperability attributes (e.g., databases, fuel,
         transportability, and ammunition) might need identification to ensure a
         capability’s effectiveness.
The CJCSI 3170.01H, Joint Capabilities Integration and Development System, dated
January 10, 2012, complements the JCIDS instruction. Additionally:
    •    DOT&E's role with respect to the ICD is included in the JCIDS Manual: DOT&E
         will advise on the testability of chosen capability attributes and metrics so that the
         system's performance measured in operational testing can be linked to the CBA.
    •    The JCIDS Manual further states: "The ICD will include a description of the
         capability, capability gap, threat, expected joint operational environments,
         shortcomings of existing systems, the capability attributes and metrics, joint
         Doctrine, Organization, Training, Materiel, Leadership and Education, Personnel
         and Facilities (DOTMLPF), and policy impact and constraints for the capabilities."
Note: the JCIDS policy no longer requires or discusses MOPs and MOEs; however, the
JCIDS derives and documents performance attributes from analysis that supported the
CBA and the AOA. Additionally, the CBA, AOA, and MOPs and MOEs remain essential
metrics needed for evaluation of those performance attributes.
    •    Measure of Effectiveness (MOE): The data used to measure the military effect
         (mission accomplishment) that comes from the use of the system in its expected
         environment. That environment includes the system under test and all
         interrelated systems, that is, the planned or expected environment in terms of
         weapons, sensors, command and control, and platforms, as appropriate, needed
         to accomplish an end-to-end mission in combat.
    •    Measures of Performance (MOPs): System-particular performance parameters
         such as speed, payload, range, time-on-station, frequency, or other distinctly
         quantifiable performance features. Several MOPs may be related to the
         achievement of a particular MOE.
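The distinction drawn above can be sketched in a few lines; the measures, values, and system type here are hypothetical, chosen only to illustrate that an MOE is measured from mission outcomes rather than derived from system-particular MOPs:

```python
# Hypothetical MOPs for a notional strike system: system-particular
# performance parameters of the kind the program is contracted to deliver.
mops = {
    "speed_knots": 480,
    "range_nm": 1200,
    "time_on_station_hr": 2.5,
}

# Hypothetical MOE: mission accomplishment observed in operational trials.
# It is measured from end-to-end mission outcomes in the expected
# environment, not computed from the MOPs, even though several MOPs
# may relate to achieving it.
missions_attempted = 20
missions_accomplished = 17
moe_mission_success_rate = missions_accomplished / missions_attempted

print(moe_mission_success_rate)  # 0.85
```

Meeting every MOP threshold in the dictionary above would not, by itself, establish operational effectiveness; only the mission-level measure does, which is the point of the DOT&E reporting policy quoted below.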
Further, the OTAs and DOT&E have a requirement to address effectiveness in their
evaluations. In the memorandum Reporting of Operational Test and Evaluation (OT&E)
Results , dated January 6, 2010, DOT&E states:
    •    The data used for evaluation are appropriately called measures of effectiveness,
         because they measure the military effect (mission accomplishment) that comes
         from the use of the system in its expected environment. This statement of policy
         precludes measuring operational effectiveness and suitability solely on the basis
         of system-particular performance parameters.
    •    . . . “performance attributes (sic) are often what the program manager is
         required to deliver; they are not the military effect or measure of operational
         effectiveness required for achieving the primary purpose of a mission capability.”
    •    It is therefore unacceptable in evaluating and reporting operational effectiveness
         and suitability to parse requirements and narrow the definition of mission
         accomplishment so that MOP are confused with MOE.
The JCIDS process begins with the CBA, which provides the basis for JCIDS to
articulate the system performance attributes required by the warfighters. Any DoD
organization may initiate a CBA. See the Manual for the Operation of the Joint
Capabilities Integration and Development System , dated July 31, 2009 for CBA
information.
For potential and designated ACAT I and IA programs, the Director, Cost Assessment
and Program Evaluation (CAPE) should draft, for MDA approval, AoA study guidance
for review at the Materiel Development Decision. Following approval, the guidance
should be issued to the DoD Component designated by the MDA, or for ACAT IA
programs, to the office of the Principal Staff Assistant responsible for the mission area.
According to DoDI 5000.02, Enclosure 7 , dated December 8, 2008, the DoD
Component or the Principal Staff Assistant shall designate responsibility for completion
of the study plan and the AoA; neither of which may be assigned to the PM. The study
plan shall be coordinated with the MDA and approved by the CAPE prior to the start of
the AoA. The final AoA shall be provided to the CAPE not later than 60 days prior to the
DAB or Information Technology Acquisition Board milestone reviews. The CAPE shall
evaluate the AoA and provide an assessment to the Head of the DoD Component or
Principal Staff Assistant and to the MDA. In this evaluation, the CAPE, in collaboration
with the OSD and Joint Staff, shall assess the extent to which the AoA:
e) Calculated costs.
       end items that create a demand for energy, consistent with mission requirements
       and cost effectiveness.
    3. Appropriate system training to ensure that effective and efficient training is
       provided with the system.
CTPs should focus on critical design features or risk areas (e.g., technical maturity,
reliability, availability, and maintainability (RAM) issues, physical characteristics or
measures) that if not achieved or resolved during development will preclude delivery of
required operational capabilities. CTPs will likely evolve/change as the system matures
during EMD. Resolve existing CTPs and identify new CTPs as the system progresses
during development. Identify any CTPs not resolved prior to entering LRIP and establish
an action plan to resolve them prior to the FRP Decision Review.
The Program T&E Lead has responsibility for coordinating the CTP process with the
Program's Chief or Lead Systems Engineer, with assistance from the appropriate test
organization subject matter experts and lead OTA. The evaluation of CTPs is
important in projecting the maturity of the system and informing the PM whether the
system is on (or behind) the planned development schedule and whether it will likely (or
not likely) achieve an operational capability; however, CTPs are not sufficient for
projecting mission capability. The projection of mission capability requires an evaluation
of the interoperability of systems and sub-systems in the mission context, when used by
a typical operator. CTPs associated with the systems/sub-systems provide a basis for
selecting entry or exit criteria demonstrated for the major developmental test phases.
9.5.4. Test and Evaluation Strategy (Milestone A)
9.5.4.1. Description
The TES describes the concept for tests and evaluations throughout the program life
cycle, starting with Technology Development and continuing through EMD into
Production and Deployment. The TES evolves into the TEMP at Milestone B.
Development of a TES requires early involvement of testers, evaluators, and others as
a program conducts pre-system acquisition activities. These personnel provide the
necessary technical, operational, and programmatic expertise to ensure nothing is
overlooked in laying out a complete strategy. The TES approval process is explained in
9.5.4.3.
The TES must remain consistent with the Technology Development Strategy (TDS) and
Initial Capabilities Document (ICD) . The TES should address the identification and
management of technology risk, the evaluation of system design concepts against the
preliminary mission and sustainment requirements resulting from the analysis of
alternatives, competitive prototyping, early demonstration of technologies in
operationally relevant environments, and the development of an integrated test
approach. The TES also satisfies the TDS test plan to ensure the completion of goals
and exit criteria for the technology demonstrations in a relevant environment in
accordance with section 2359a of title 10 USC . It also provides a road map for
evaluations, integrated test plans, and resource requirements necessary to accomplish
the TD phase objectives.
The TES begins by focusing on TD phase activities, and describes the demonstration of
component technologies under development in an operationally relevant environment to
support the program's transition into the EMD Phase. It contains hardware and software
maturity success criteria used to assess key technology maturity for entry into EMD. For
programs following an evolutionary acquisition strategy with more than one
developmental increment, the TES describes the application of T&E and M&S to each
planned increment to provide the required operational effectiveness, suitability, and
survivability or operational security, as would be required of a program containing only
one increment. TES development supports the initial Milestone A decision. The TEMP
subsumes the TES for all increments thereafter, unless a follow-on increment requires a
new Milestone A decision. TES development establishes an early consensus among
T&E WIPT member organizations on the program's scope for testing and evaluation,
with particular consideration given to the resources needed to support PPB&E process
activities. The TES requires the inclusion of cost estimates beginning with program
initiation and continuing through development and production, including nonrecurring
and recurring research and development (R&D) costs for prototypes, engineering
development equipment and/or test hardware (and major components thereof).
Additionally, the TES fully identifies and estimates contractor T&E and Government
support to the test program. Estimate any support, such as support equipment, training,
data, and military construction. Include the cost of all related R&D (such as redesign
and test efforts necessary to install equipment or software into existing platforms). See
DoD 5000.4-M, "Cost Analysis Guidance and Procedures," Table C2.T2, "Defense
Acquisition Program Life-Cycle Cost Categories Research and Development," for a
more specific list of R&D costs. The basis for the T&E resources required in the Cost
Analysis Requirements Description comes from the TES cost information.
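The cost roll-up the TES feeds into the Cost Analysis Requirements Description can be sketched as a simple subtotal over the categories discussed above (nonrecurring and recurring R&D, Government support, contractor T&E). The category names and dollar figures below are illustrative assumptions, not values or categories taken from DoD 5000.4-M.

```python
# Illustrative T&E cost roll-up for a TES ($M); all figures are hypothetical placeholders.
te_costs = {
    "nonrecurring_rd": {"prototypes": 12.0, "engineering_dev_equipment": 4.5},
    "recurring_rd": {"test_hardware": 3.2, "redesign_and_retest": 1.8},
    "government_support": {"support_equipment": 0.9, "training": 0.6, "data": 0.3},
    "contractor_te": {"contractor_test_program": 5.5},
}

def rollup(costs: dict) -> dict:
    """Subtotal each cost category and compute the total T&E estimate."""
    subtotals = {cat: sum(items.values()) for cat, items in costs.items()}
    subtotals["total"] = sum(subtotals.values())
    return subtotals

print(rollup(te_costs))
```

The subtotaled categories are the kind of figures that would be carried into the Cost Analysis Requirements Description as the basis for T&E resource requirements.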
The following content and format provide all necessary information for a TES and
assist in the transition to a TEMP at Milestone B.
PART I INTRODUCTION
1.1. Purpose. State the purpose of the TES. Reference the documentation initiating the
TES (i.e., ICD, AoA, CONOPS).
1.2. Mission Description. Briefly summarize the mission need described in the capability
requirements documents in terms of the capability it will provide to the Joint Forces
Commander. Briefly summarize the CONOPS, and include a high level operational
concept graphic ( OV-1) or similar diagram.
1.3. System Description. Describe the system or prototype configurations. Identify key
features, technologies, and components, both hardware and software for the planned
Technology Development phase.
1.3.2. Program Background. Briefly discuss any background information. Reference the
AoA, the materiel development decision, and any previous tests or evaluations that
have an effect on the T&E strategy.
1.3.3. Key Capabilities. Identify the system attributes that support key capabilities from
the ICD. Identify the T&E-related TD Phase exit criteria.
PART II TEST PROGRAM MANAGEMENT AND SCHEDULE
2.1. T&E Management. Discuss the test and evaluation role of participating
organizations. Describe the role of contractor and governmental personnel. Provide
organizational construct that includes organizations such as the T&E WIPT or Service
equivalent.
2.2. T&E Data Strategy. Describe the strategy and methods for collecting, validating,
and sharing data as it becomes available from the contractors, DT&E, and oversight
organizations.
2.3. Integrated Test Program Schedule. Provide the overall time sequencing of the
major events with an emphasis on the TD phase. Include event dates such as major
decision points, preliminary design reviews, prototypes and test article availability, and
phases of DT&E.
PART III TEST AND EVALUATION STRATEGY
3.1. T&E Strategy Introduction. This section should summarize an effective and efficient
approach to the T&E program.
3.2. Evaluation Framework. Describe the overall concept of the T&E program with an
emphasis on decisions in the Technology Development phase and information required
to draft the CDD. Specific areas of evaluation should include Technology Readiness
Level (TRL) and prototype testing. Include a Top-Level Evaluation Framework matrix
that shows the correlation between decisions, the primary capabilities, critical
technologies, critical technical parameters, and other key test measures.
3.3.1. Developmental Test Objectives. Summarize the planned objectives and state the
methodology to test the technology attributes defined by the TDS.
3.3.2. Modeling & Simulation. Describe the key models and simulations and their
intended use. Identify who will perform M&S verification, validation, and accreditation.
3.3.3. Test Limitations. Discuss any test limitations that may significantly affect the
evaluator's ability to draw conclusions about the TRL and capabilities.
3.4. Operational Evaluation Approach. Discuss the approach during the TD phase to
providing operational insights from the user perspective, including resolution of the ICD
issues. Include reliability growth testing, if appropriate.
3.4.2. Operational Test Objectives. Summarize the planned objectives and state the
methodology to test the technology attributes defined by the TDS.
3.4.3. M&S. Describe the key models and simulations and their intended use. Identify
who will perform M&S verification, validation, and accreditation.
3.4.4. Test Limitations. Discuss any test limitations that may significantly affect the
evaluator's ability to draw conclusions about the TRL and capabilities.
3.5. Future Test and Evaluation. Summarize all remaining significant T&E that has not
been discussed yet, extending through the acquisition life cycle. Test events after
Milestone B will be described in detail in the Milestone B TEMP update.
PART IV RESOURCE SUMMARY
4.1. Introduction. Testing will be planned and conducted to take full advantage of
existing DoD investment in ranges, facilities, and other resources wherever practical.
Describe all key test and evaluation resources, both government and contractor, that
will be used during the course of the TD phase. Include long-lead items for the next
phase, if known.
4.1.2. Test Sites and Instrumentation. Identify the test ranges and facilities to be used
for testing.
4.1.3. Test Support Equipment. Identify test support, analysis equipment, and personnel
required to conduct testing.
4.1.5. Test Targets and Expendables. Specify the type, number, availability, and
schedule for test targets and expendables (e.g., targets, weapons, flares, chaff,
sonobuoys, countermeasures).
4.1.6. Operational Force Test Support. Specify the type and timing of aircraft flying
hours, ship steaming days, and on-orbit satellite contacts/coverage, and other
operational force support.
4.1.7. Simulations, Models and Testbeds. Specify the models and simulations to be
used. Identify opportunities to simulate any of the required support. Identify the
resources required to validate and accredit their usage, responsible agency, and
timeframe.
4.1.8. Joint Mission Environment. Describe the live, virtual, or constructive components
or assets necessary to create an acceptable environment to evaluate TRLs and mission
capabilities.
4.2. Test and Evaluation Funding Summary. Provide initial estimates of DT&E, OT&E,
and LFT&E costs.
9.5.4.3. TES Approval
For programs under OSD T&E oversight, the PM or leader of the concept development
team, with the T&E WIPT providing support, submits the DoD Component/Defense
Agency-approved TES to OSD for staffing and approval before Milestone A. The PM
should submit the TES at least 45 days prior to Milestone A to support the decision. The
DOT&E and the DASD(DT&E) approve the TES for all programs on the OSD T&E
Oversight List. For programs not on the OSD T&E Oversight List, the CAE, or
designated representative, approves the TES.
9.5.5. Test and Evaluation Master Plan (Milestone B)
The TEMP serves as the overarching document for managing a T&E program. PMs
should develop a draft TEMP for the pre-EMD review and a formal TEMP for Milestone
B, based on the AT&L memo, "Improving Milestone Process Effectiveness," dated June
23, 2011. Prior to each subsequent Defense Acquisition System milestone, the PM
must submit an updated TEMP. The TEMP should include sufficient detail to support
development of other test related documents.
PMs develop a TEMP and subsequent updates meeting the following objectives:
         hardware-in-the-loop simulation, and installed-system test facilities prior to
         conducting full-up, system-level and end-to-end testing in open-air realistic
         environments. Programs normally limit DT&E of military medical devices to
         airworthiness certification and environmental testing to ensure the device does
         not fail due to the austere or harsh environments imposed by the operational
         environment or interfere with the aircraft's operational environment. This can
         often be integrated into, or performed alongside, the requisite OT.
    •    Perform verification and validation (V&V) of the M&S used in the test program
         and in the systems engineering process.
    •    Stress the system under test to at least the limits of the Operational Mode
         Summary/Mission Profile, and for some systems, beyond the normal operating
         limits to ensure the robustness of the design. This testing will reduce risk for
         performance in the expected operational environments.
    •    Provide safety releases (to include formal Environment, Safety, and Occupational
         Health (ESOH) risk acceptance), in concert with the user and the T&E
         community, to the developmental and operational testers prior to any test using
         personnel.
    •    Demonstrate the maturity of the production process through Production
         Qualification Testing (PQT) of low-rate initial production (LRIP) assets prior to
         full-rate production (FRP). The focus of this testing is on the contractor's ability to
         produce a quality product, since the design testing should have been completed.
    •    Provide data and analytic support to the Milestone C decision to enter LRIP.
    •    For weapons systems, use the System Threat Assessment (STA) or System
         Threat Assessment Report (STAR) as a basis for scoping a realistic test
         environment.
    •    For IT & NSS, use DIA, North American Industry Classification System (NAICS),
         or other applicable standard as a basis for scoping a realistic test environment.
    •    Conduct Information Assurance (IA) testing on any system that collects, stores,
         transmits, and processes unclassified or classified information. The extent of IA
         testing depends upon the assigned Mission Assurance Category and
         Confidentiality Level. DoDI 8500.2 , "Information Assurance (IA) Implementation,"
         dated February 6, 2003, mandates specific IA Control Measures a system should
         implement as part of the development process.
    •    In the case of IT systems, including NSS , support the DoD Information
         Assurance Certification and Accreditation Process and Joint Interoperability
         Certification process.
    •    Discover, evaluate, and mitigate potentially adverse electromagnetic
         environmental effects (E3) .
    •    Support joint interoperability assessments required to certify system-of-systems
         interoperability.
    •    For business systems, the TEMP identifies certification requirements needed to
         support the compliance factors established by the Office of the Under Secretary
         of Defense (Comptroller) (USD(C)) for financial management, enterprise
         resource planning, and mixed financial management systems.
    •    Demonstrate performance against threats and their countermeasures as
         identified in the Defense Intelligence Agency (DIA) or component-validated threat
         document. Any impact on technical performance by these threats should be
This document is an accurate representation of the content posted on the DAG website for this Chapter, as of the date of
                                                                                                                      771
production listed on the cover. Please refer to the DAG website for the most up to date guidance at https://dag.dau.mil
         identified early in technical testing, rather than in operational testing where their
         presence might have serious repercussions.
    •    Assess SoS Command, Control, Communications, Computers, Intelligence,
         Surveillance, and Reconnaissance (C4ISR) systems prior to OT&E to ensure
         interoperability under loaded conditions that represent stressed OT&E scenarios.
PMs should structure a T&E program strategy to provide knowledge to reduce risk in
acquisition and operational decisions. That knowledge is developed through evaluations
of all available and relevant data and information from contractor and government
sources.
evaluation should focus on providing essential information to decision makers,
specifically with regard to attainment of technical performance attributes and an
assessment of the system's operational effectiveness, operational suitability,
and survivability or operational security. The evaluation framework supports estimates
for test resource requirements and provides a basis for determining test program
adequacy and assessing risk margins within the T&E plans and events.
In other words, the evaluation should describe the links between key program and user
decisions and the developmental and operational tests that require evaluation to
support those decisions. It correlates the knowledge required concerning KPPs/KSAs,
CTPs, key test measures (i.e., MOEs and Measures of Suitability (MOSs)), and the
planned test methods, key test resources, facility, or infrastructure needs. The
framework discussion should also identify major risks or limitations to completing the
evaluations. The TEMP should clearly reflect what key questions the evaluations will
answer for the program and user, and at what key decision points. This layout and
discussion provides a rationale for the major test objectives and the resulting major
resource requirements shown in the Resources portion of the TEMP.
The evaluation should also discuss the intended maturation of key technologies within
the overall system, the evaluation of capabilities in a mission context, and evaluations
needed to support required certifications or to comply with statute(s). Separate
evaluation plans should provide details for the PM's overall evaluation strategy (e.g.,
System Evaluation Plan (Army), Operational Test and Evaluation plan, LFT&E plan).
The DT&E section describes the evaluation of the maturation of a system or capability,
and should address the overall approach to evaluate the development of system
capabilities, in operationally relevant environments. The approach should cover CTPs,
key system risks, and any certifications required (weapon safety, interoperability, etc.).
The evaluation of technology maturity should support the TDS.
The evaluation of system maturity should support the acquisition strategy. The amount
of development in the acquisition strategy drives the extent of the discussion. For
example, if the system is a non-developmental item (i.e., Commercial-Off-The-Shelf
(COTS) or Government-Off-The-Shelf (GOTS)), there may be little, if any, system
maturation required. If the program is a new technology effort, pushing the state of the
art or significantly improving capabilities over what is currently achieved in the
operational environment, then maturing or developing the system or its support
system may require significant effort, and therefore more decisions requiring
knowledge from evaluations. In assessing the level of evaluations necessary, give
equal consideration to the maturity of the technologies used, the degree to which the
system design (hardware and software) has stabilized, and the operational
environment for the employment of the system. Using COTS items in a new
environment can result in significant capability changes, potentially eliminating a true
COTS item from a system maturity perspective.
The system maturation discussions should also cover evaluations for production
qualification, production acceptance, and sustainment of the system. The Defense
Contract Management Agency (DCMA) representatives and procedures may cover the
production evaluations at the contractor's manufacturing plant, or may require the T&E
effort to establish and mature the processes. Therefore, the appropriate level of
evaluation could range from none, for normal DCMA practices, to minimal for first article
qualification checks, to more extensive evaluations based upon PQT results for new or
unique manufacturing techniques, especially with new technologies. The sustainment
evaluation discussions should address key risks or issues in sustaining or assessing the
system capability in operational use. The sustainment evaluation discussion should
address the overall T&E logistics effort, maintenance (both corrective and preventative),
servicing, calibration, and support aspects.
COIs are also relevant to this discussion. COIs are the key operational effectiveness
and operational suitability issues that must be examined in OT&E to determine the
system's capability to perform its mission. COIs must be relevant to the required
capabilities, be of key importance to the system's operational effectiveness,
operational suitability, and survivability, and represent a significant risk if not
satisfactorily resolved.
The strategy for T&E must include those evaluations required by statute, specifically
IOT&E, survivability or operational security, and lethality. The IOT&E discussion should
describe the approach to conduct the independent evaluation of the system, including
official resolution of COIs. The discussion of the approach to evaluate the survivability
or operational security /lethality of the system should show how it will influence the
development and maturation of the system design. The discussion should include a
description of the overall live fire evaluation strategy for the system (as defined in
section 2366 of title 10 USC ); critical live fire evaluation issues; and any major
evaluation limitations.
The Evaluation Framework Matrix describes, in table format, the most important links
and relationships between the types of testing conducted to support the entire
acquisition program. It also shows the linkages between the KPPs/KSAs, CTPs, key
test measures (i.e., MOEs, MOSs), planned test methods, key test resources (i.e.,
facility and infrastructure), and the decisions supported. Table 9.5.5.2.T1., Top-Level
Evaluation Framework Matrix, from the TEMP format annex (shown below) depicts a
notional Evaluation Framework Matrix. Programs may also use equivalent Service-
specific formats identifying the same relationships and information. Note: the
Evaluation Framework Matrix provides a tabular summary of the evaluation strategy.
[Table 9.5.5.2.T1. excerpt (notional): MOE 1.4, reliability based on growth curve,
evaluated through component-level stress testing and sample performance on the
growth curve, supporting the PDR, CDR, and MS-C decisions.]
The Evaluation Framework Matrix acts as a key tool used to capture all major parts of a
complete T&E program, identify gaps in coverage, and ensure more efficient integrated
testing. Programs must include it in Part III of the TEMP and base it on the strategy for
T&E (aka evaluation strategy) developed at Milestone A. The Evaluation Framework
Matrix should succinctly enumerate the top-level, key values and information for all
types of T&E. Updates should occur as the system matures and as source documents
(e.g., CDD/CPD, AS, STAR, SEP, ISP) are updated. Include demonstrated values for
measures and parameters as the acquisition program advances from milestone to
milestone and as the TEMP is updated.
Three major sections comprise the Evaluation Framework Matrix: Key Requirements
and T&E Measures; Test Methodologies/Key Resources; and Decisions Supported.
When filled in, readers can scan the matrix horizontally and see all linkages from the
beginning of a program (i.e., from the requirement document) to the decision supported.
Each requirement should be associated with one or more T&E issues and
measures. However, T&E measures can exist without an associated key requirement or
COI/ COI Criteria (COIC). Hence, some cells in Table 9.5.5.2.T1. may be void.
Key Requirements and T&E Measures. These include KPPs and KSAs and the top-level
T&E issues and measures for evaluation. The top-level T&E issues would typically
include COIs and COIC, CTPs, and key MOEs/MOSs, as well as SoS issues. Each
measure should be associated with one or more key requirements. A simple test to
determine whether this section of the matrix is minimally adequate is to confirm that
each decision supported has at least one T&E measure associated with it, and that
each key requirement also has at least one T&E measure associated with it. Outside of
that, only include the T&E issues and measures that drive the size or scope of the T&E
program.
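The "simple test" of minimal adequacy described above — every decision supported and every key requirement has at least one associated T&E measure — can be sketched as a coverage check over the matrix. The data model below is an illustrative assumption, not the TEMP's prescribed format; the row contents are notional.

```python
# Each row links one T&E measure to the requirements and decisions it supports.
matrix = [
    {"measure": "MOE 1.4 reliability vs. growth curve",
     "requirements": ["RAM KPP"], "decisions": ["PDR", "CDR", "MS-C"]},
    {"measure": "CTP: detection range",
     "requirements": ["Detection KPP"], "decisions": ["CDR"]},
]
requirements = {"RAM KPP", "Detection KPP"}
decisions = {"PDR", "CDR", "MS-C"}

def coverage_gaps(matrix, requirements, decisions):
    """Return the requirements and decisions with no associated T&E measure."""
    covered_reqs = {r for row in matrix for r in row["requirements"]}
    covered_decs = {d for row in matrix for d in row["decisions"]}
    return requirements - covered_reqs, decisions - covered_decs

req_gaps, dec_gaps = coverage_gaps(matrix, requirements, decisions)
print(req_gaps, dec_gaps)
# prints: set() set()  (no gaps: this matrix is minimally adequate)
```

Adding a requirement with no supporting measure (e.g., a new KPP) would surface immediately in `req_gaps`, which is exactly how a full matrix makes missing T&E coverage evident.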
If portions of any T&E activity are missing, those become immediately evident. For
example, if a KPP for reliability, availability, and maintainability (RAM) is listed, then
there must be a supporting COI (or criterion in the set of COIC), along with CTPs and
MOSs, to show that RAM will be fully evaluated in DT&E and OT&E. Specifically in the
case of RAM measures, many acquisition programs included little to no RAM testing in
DT&E and subsequently failed Suitability in OT&E (i.e., were rated "Not Suitable" by
DOT&E). Had the TEMPs for those programs contained a full Evaluation Framework
Matrix, the weak or missing RAM areas may have been identified early and properly
tested before systems reached OT&E. Increasing the visibility of all key measures will
help ensure these areas are developed and properly tested in DT&E and are ready for
OT&E.
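To illustrate the kind of reliability-growth tracking that ties a RAM KPP to DT&E, the sketch below projects instantaneous MTBF along a Crow-AMSAA (NHPP) growth curve, one common reliability growth model. The model choice and the parameter values are illustrative assumptions, not a method prescribed by this guidebook; in practice the parameters would be fit to observed DT&E failure data.

```python
def crow_amsaa_mtbf(t_hours: float, lam: float, beta: float) -> float:
    """Instantaneous MTBF under a Crow-AMSAA growth model.

    Expected cumulative failures N(t) = lam * t**beta, so the instantaneous
    failure intensity is lam * beta * t**(beta - 1) and MTBF is its inverse.
    beta < 1 indicates positive reliability growth.
    """
    return 1.0 / (lam * beta * t_hours ** (beta - 1))

# Hypothetical parameters for illustration only.
lam, beta = 0.5, 0.7
for t in (100, 1000, 5000):
    print(f"after {t:>5} test hours: projected MTBF ~= {crow_amsaa_mtbf(t, lam, beta):.1f} h")
```

Plotting demonstrated MTBF against such a planned curve is what lets evaluators judge whether RAM growth is on track before the system reaches OT&E.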
The Evaluation Framework Matrix also aids integrated testing and systems engineering
by providing a broad outline of the linkages and corresponding areas for each kind of
T&E activity. Mutual support between tests can be planned based on these linkages.
For example, DT&E can augment the high visibility areas in OT&E, and OT&E can
"right-size" their T&E concept based on what they can use in DT&E. More synergy is
possible where DT and OT measures are the same or similar, or where the same T&E
resources (test articles and/or facilities) are used. Data sharing protocols can be
developed early to aid test planning. DoD Information Assurance Certification and
Accreditation Process (DIACAP) Certification and Accreditation (C&A) requirements
can be folded in early. Redundancy and gaps can be spotted and eliminated. Greater
visibility and transparency between T&E activities will generate countless ways to
enhance integration. The discussion of the evaluation strategy can fill in all the details.
TEMP content expected at Milestone B (with Milestone C updates noted):
    •    Plan for the conduct of dedicated Government DT&E or integrated test (led by
         Government personnel) to provide confidence that the system design solution is
         on track to satisfy the desired capabilities. Identify the lead Government DT&E
         organization. Plan for one full-up, system-level Government DT&E event and at
         least one OA with intended operational users. (Milestone C: a listing of all test
         events within the dedicated IOT&E.)
    •    Reliability Growth Curve(s) (RGCs) reflecting the reliability growth plans at the
         appropriate level of analysis for the program. (Milestone C: updated RGCs.)
    •    Listing of all commercial and NDIs.
    •    Provide a tabulation of factors.
    •    Determination of critical interfaces and information security.
    •    The TEMP should describe the T&E program in sufficient detail for decision
         makers to determine whether the planned activities are adequate to achieve the
         T&E objectives for the program.
    •    Identify each test event as Contractor or Government DT&E.
    •    Identify M&S to be used and the VV&A process; annotate supporting usage (i.e.,
         DT&E or OT&E).
    •    T&E support of the Reliability Growth Plan.
    •    Plan for data collection.
    •    The TEMP should identify entrance and exit criteria and their associated test
         events or test periods.
    •    The TEMP should consider the potential impacts on the environment and on
         personnel.
Part IV, Resource Summary:
    •    The TEMP should describe the resources required in sufficient detail, aligned
         with Part III of the TEMP.
    •    Programs should maximize the use of DoD Government T&E capabilities and
         invest in Government T&E infrastructure unless an exception can be justified as
         cost-effective to the Government.
9.5.5.3. TEMP Format
TEST AND EVALUATION MASTER PLAN
FOR
PROGRAM NAME (ACRONYM)
ACAT Level
Program Elements Xxxxx
************************************************************************
SUBMITTED BY
____________________________________________________ ____________
CONCURRENCE
____________________________________________________ ____________
____________________________________________________ ____________
____________________________________________________ ____________
____________________________________________________ ____________
____________________________________________________ ____________
Note: For Joint/Multi-Service or Agency programs, each Service or Defense Agency
should provide a signature page for parallel staffing through its CAE or Director, and a
separate page should be provided for OSD approval.
************************************************************************
OSD APPROVAL
____________________________________________________ ____________
DASD(DT&E) DATE
____________________________________________________ ____________
D,OT&E DATE
TABLE OF CONTENTS
1.1 PURPOSE
PART II TEST PROGRAM MANAGEMENT AND SCHEDULE
3.6 Operational Evaluation Approach
4.1 Introduction
APPENDIX A BIBLIOGRAPHY
APPENDIX B ACRONYMS
1. PART I - INTRODUCTION
1.1. Purpose.
    •    State the purpose of the Test and Evaluation Master Plan (TEMP).
    •    Identify if this is an initial or updated TEMP.
    •    State the Milestone (or other) decision the TEMP supports.
    •    Reference and provide hyperlinks to the documentation initiating the TEMP (i.e.,
         Initial Capability Document (ICD), Capability Development Document (CDD),
         Capability Production Document (CPD), Acquisition Program Baseline (APB),
         Acquisition Strategy Report (ASR), Concept of Operations (CONOPS)).
    •    State the Acquisition Category (ACAT) level, operating command(s), and
         whether the program is listed on the OSD T&E Oversight List (actual or
         projected).
    •    Describe the operational environment in which the system will operate.
    •    Reference the appropriate DIA or component-validated threat documents for the
         system.
    •    Reference the Analysis of Alternatives (AoA), the APB and the materiel
         development decision to provide background information on the proposed
         system.
    •    Briefly describe the overarching Acquisition Strategy (for space systems, the
         Integrated Program Summary (IPS)), and the Technology Development Strategy
         (TDS).
    •    Address whether the system will be procured using an incremental development
         strategy or a single step to full capability.
    •    If it is an evolutionary acquisition strategy, briefly discuss planned upgrades,
         additional features and expanded capabilities of follow-on increments.
               o The main focus must be on the current increment with brief descriptions of
                   the previous and follow-on increments to establish continuity between
                   known increments.
    •    Discuss the results of any previous tests that apply to, or have an effect on, the
         test strategy.
    •    Identify the Key Performance Parameters (KPPs) and Key System Attributes
         (KSAs) for the system.
            o For each listed parameter, provide the threshold and objective values from
                 the CDD/CPD and reference the paragraph.
               o    security test and evaluation and Information Assurance (IA)
                    Certification and Accreditation (C&A);
               o    post-deployment software support;
               o    resistance to chemical, biological, nuclear, and radiological effects;
               o    resistance to countermeasures;
               o    resistance to reverse engineering/exploitation efforts (Anti-Tamper);
               o    development of new threat simulation, simulators, or targets.
    •    Reference all SE-based information that will be used to provide additional system
         evaluation targets driving system development.
            o Examples could include hardware reliability growth and software maturity
               growth strategies.
            o The SEP should be referenced in this section and aligned to the TEMP
               with respect to SE Processes, methods, and tools identified for use during
               T&E.
• Describe the requirements for and methods of collecting, validating, and sharing
         data as it becomes available from the contractor, Developmental Test (DT),
         Operational Test (OT), and oversight organizations, as well as supporting related
         activities that contribute to or use test data (e.g., information assurance C&A,
         interoperability certification).
    •    Describe how the pedigree of the data will be established and maintained. The
         pedigree of the data refers to understanding, for each piece of data, the
         configuration of the test asset and the actual test conditions under which the
         data were obtained.
    •    State who will be responsible for maintaining this data.
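One minimal way to sketch such a pedigree record is shown below; the field names and values are illustrative assumptions, not a prescribed DoD schema.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class DataPedigree:
    """Illustrative pedigree record: ties each measurement to the
    configuration of the test asset and the conditions of collection."""
    test_asset_config: str   # hardware/software baseline identifier (hypothetical)
    test_event: str          # e.g., "IT-B1"
    conditions: dict         # actual test conditions (environment, temperature, ...)
    collected_by: str        # organization responsible for maintaining the data


@dataclass
class Measurement:
    name: str
    value: float
    pedigree: DataPedigree


# A measurement retains its pedigree as it moves between contractor DT,
# Government DT, OT, and oversight organizations.
m = Measurement(
    name="reliability point estimate",
    value=0.92,
    pedigree=DataPedigree(
        test_asset_config="EDM-2, SW v0.7",
        test_event="IT-B1",
        conditions={"environment": "open air range", "temp_C": 21},
        collected_by="Contractor DT&E",
    ),
)
print(m.pedigree.test_event)
```

Because the pedigree travels with the value, a later evaluator can judge whether data collected under one configuration remain valid for a different decision.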
    •    Display (see Figure 2.1) the overall time sequencing of the major acquisition
         phases and milestones (as necessary, use the NSS-03-01 time sequencing).
            o Include the test and evaluation major decision points, related activities,
               and planned cumulative funding expenditures by appropriation by year.
            o Include event dates such as
                    Major decision points as defined in DoD Instruction 5000.02, e.g.,
                      operational assessments,
                    Preliminary and critical design reviews,
                    Test article availability; software version releases;
                    Appropriate phases of DT&E; LFT&E; Joint Interoperability Test
                      Command (JITC) interoperability testing and certification date to
                      support the MS-C and Full-Rate Production (FRP) Decision Review
                      (DR).
                    Include significant Information Assurance certification and
                      accreditation event sequencing, such as Interim Authorization to
                          Test (IATT), Interim Authorization to Operate (IATO) and
                          Authorization to Operate (ATO).
                       Also include operational test and evaluation;
                       Low-Rate Initial Production (LRIP) deliveries;
                       Initial Operational Capability (IOC); Full Operational Capability
                          (FOC);
                       Statutorily required reports such as the Live-Fire T&E Report and
                          Beyond Low-Rate Initial Production (B-LRIP) Report.
              o    Provide a single schedule for multi-DoD Component or Joint and
                   Capstone TEMPs showing all related DoD Component system event
                   dates.
    •    Introduce the program T&E strategy by briefly describing how it supports the
         acquisition strategy as described in Section 1.3.2. This section should summarize
         an effective and efficient approach to the test program.
    •    The developmental and operational test objectives are discussed separately
         below; however, this section must also address how the test objectives will be
         integrated to support the acquisition strategy by evaluating the capabilities to be
         delivered to the user without compromising the goals of each major test type.
    •    Where possible, the discussions should focus on the testing for capabilities, and
         address testing of subsystems or components where they represent a significant
         risk to achieving a necessary capability.
    •    As the system matures and production representative test articles are available,
         the strategy should address the conditions for integrating DT and OT tests.
    •    Evaluations shall include a comparison with current mission capabilities using
         existing data, so that measurable improvements can be determined.
             o If such an evaluation is considered costly relative to the benefits gained,
                 the PM shall propose an alternative evaluation strategy.
             o Describe the strategy for achieving this comparison and for ensuring data
                 are retained and managed for future comparison results of evolutionary
                 increments or future replacement capabilities.
    •    To present the program's T&E strategy, briefly describe the relative emphasis on
         methodologies (e.g., Modeling and Simulation (M&S), Measurement Facility
         (MF), Systems Integration Laboratory (SIL), Hardware-In-the-Loop Test (HILT),
         Installed System Test Facility (ISTF), Open Air Range (OAR)).
    •    Describe the overall evaluation approach focusing on key decisions in the system
         lifecycle and addressing key system risks, program unique Critical Operational
         Issues (COIs) or Critical Operational Issue Criteria (COIC), and Critical Technical
         Parameters (CTPs).
    •    Specific areas of evaluation to address include:
(1) Development of the system and processes (including maturation of the system design)
    •    Describe any related systems that will be included as part of the evaluation
         approach for the system under test (e.g., data transfer, information exchange
         requirements, interoperability requirements, and documentation systems).
• Also identify any configuration differences between the current system and the
         system to be fielded.
             o Include mission impacts of the differences and the extent of integration
                with other systems with which it must be interoperable or compatible.
    •    Describe how the system will be evaluated and the sources of the data for that
         evaluation.
             o The discussion should address the key elements for the evaluations,
                including major risks or limitations for a complete evaluation of the
                increment undergoing testing.
             o The reader should be left with an understanding of the value-added of
                these evaluations in addressing both programmatic and warfighter
                decisions or concerns.
             o This discussion provides rationale for the major test objectives and the
                resulting major resource requirements shown in Part IV - Resources.
    •    Include a Top-Level Evaluation Framework matrix that shows the correlation
         between the KPPs/KSAs, CTPs, key test measures (i.e., Measures of
         Effectiveness (MOEs) and Measures of Suitability (MOSs)), planned test
         methods, and key test resources, facility or infrastructure needs.
             o When structured this way, the matrix should describe the most important
                relationships between the types of testing that will be conducted to
                evaluate the Joint Capabilities Integration and Development System
                (JCIDS)-identified KPPs/KSAs, and the program's CTPs.
             o Figure 3.1 shows how the Evaluation Framework could be organized.
                Equivalent Service-specific formats that identify the same relationships
                and information may also be used.
             o The matrix may be inserted in Part III if short (less than one page), or as
                an annex.
             o The evaluation framework matrix should mature as the system matures.
                Demonstrated values for measures should be included as the acquisition
                program advances from milestone to milestone and as the TEMP is
                updated.
    •    Key requirements & T&E measures: These are the KPPs and KSAs and the top-
         level T&E issues and measures for evaluation. The top-level T&E issues would
         typically include COIs/COICs, CTPs, and key MOEs/MOSs. System-of-systems
         and technical review issues should also be included, either in the COI column or
         inserted as a new column. Each T&E issue and measure should be associated
         with one or more key requirements. However, there could be T&E measures
         without an associated key requirement or COI/COIC; hence, some cells in
         Figure 3.1 may be empty.
    •    Overview of test methodologies and key resources: These identify test
         methodologies or key resources necessary to generate data for evaluating the
         COIs/COICs, key requirements, and T&E measures. The content of this column
         should indicate the methodologies/resources that will be required, with short
         notes or pointers to indicate major T&E phases or resource names. M&S should be
         identified with the specific name or acronym.
    •    Decisions Supported: These are the major design, developmental,
         manufacturing, programmatic, acquisition, or employment decisions most
         affected by the knowledge obtained through T&E.
[Figure 3.1. Top-Level Evaluation Framework matrix (illustrative excerpt). Columns:
Key Requirement (KPP/KSA), COI, T&E Measure (MOE/MOS), Test Methodology/Key
Resource, and Decision Supported. Example rows: MOE 1.4 (reliability based on
growth curve), evaluated by component-level stress testing and sample performance
on the growth curve with M&S augmentation, supporting SRR, PDR, CDR, MS-C, and
FRP; KPP #3 / COI #4 (training), MOE 1.2, evaluated by observation and survey,
supporting MS-C and FRP; KSA #3.a / COI #5 (documentation), MOS 2.5, supporting
MS-C and FRP.]
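The framework columns described above can be sketched as a simple data structure with a completeness check; all program identifiers below are hypothetical examples, not taken from a real program.

```python
# Illustrative sketch of a Top-Level Evaluation Framework matrix: each row
# correlates a key requirement, COI, T&E measure, test methodologies/resources,
# and the decisions the resulting knowledge supports.
framework = [
    {"requirement": "KPP #1", "coi": "COI #1", "measure": "MOE 1.4",
     "methods": ["Component-level stress testing", "M&S augmentation"],
     "decisions": ["PDR", "CDR", "MS-C", "FRP"]},
    # A measure may have no associated key requirement or COI/COIC,
    # so those cells may be empty (None).
    {"requirement": None, "coi": None, "measure": "CTP-7",
     "methods": ["SIL"], "decisions": ["CDR"]},
]

# As the TEMP matures, every measure should trace to at least one test
# methodology and one supported decision, even when requirement cells are empty.
for row in framework:
    assert row["methods"], f'{row["measure"]} has no planned data source'
    assert row["decisions"], f'{row["measure"]} supports no decision'

print(len(framework), "framework rows checked")
```

A check like this makes the traceability rule in the text mechanical: empty requirement cells are tolerated, but dangling measures are flagged.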
    •    Describe the top-level approach to evaluate system and process maturity, as
         well as system capabilities and limitations expected at acquisition milestones
         and decision review points.
1) rationale for CTPs (see below for a description of how to derive CTPs),
4) any technology or subsystem that has not demonstrated the expected level of
technology maturity (Technology Readiness Level 6 or higher) or system performance,
or has not achieved the desired mission capabilities for this phase of development,
6) key issues and the scope for logistics and sustainment evaluations, and
7) reliability thresholds when the testing is supporting the system's reliability growth
curve.
    •    Limit the list of CTPs to those that support the COIs. Using the system
         specification as a reference, the chief engineer on the program should derive the
         CTPs to be assessed during development.
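Item 7 above ties reliability thresholds to the system's reliability growth curve. A minimal sketch of such a projection is shown below, assuming the Duane power-law growth model; the TEMP format does not prescribe a particular model, and all planning numbers here are hypothetical.

```python
def duane_cumulative_mtbf(t_hours: float, m1: float, t1: float, alpha: float) -> float:
    """Cumulative MTBF under the Duane power-law growth model:
    M(t) = M1 * (t / t1) ** alpha, where M1 is the cumulative MTBF
    observed at reference test time t1 and alpha is the growth rate."""
    return m1 * (t_hours / t1) ** alpha


# Hypothetical planning numbers: 50 h cumulative MTBF after the first 500 test
# hours, growth rate 0.35. Project the curve at an interim threshold point to
# see whether a demonstrated value is on, above, or below the planned curve.
projected = duane_cumulative_mtbf(t_hours=4000, m1=50.0, t1=500.0, alpha=0.35)
print(f"Projected cumulative MTBF at 4000 h: {projected:.1f} h")
```

In a TEMP, the thresholds in item 7 would be the planned curve values at specific test events, against which demonstrated reliability is compared.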
• A mission context focuses on how the system will be employed. Describe the
         rationale for the COIs or COICs.
    •    Summarize the planned objectives and state the methodology to test the system
         attributes defined by the applicable capability requirements document (CDD,
         CPD, CONOPS) and the CTPs that will be addressed during each phase of DT,
         as shown in the Figure 3.1 Top-Level Evaluation Framework matrix and the
         Systems Engineering Plan.
    •    Describe the key models and simulations and their intended use.
    •    Include the developmental test objectives to be addressed using M&S to include
         any approved operational test objectives.
    •    Identify data needed and the planned accreditation effort.
    •    Identify how the developmental test scenarios will be supplemented with M&S,
         including how M&S will be used to predict the Sustainment KPP and other
         sustainment considerations.
    •    Identify who will perform M&S verification, validation, and accreditation. Identify
         developmental M&S resource requirements in Part IV.
    •    Discuss any developmental test limitations that may significantly affect the
         evaluator's ability to draw conclusions about the maturity, capabilities, limitations,
         or readiness for dedicated operational testing.
             o Also address the impact of these limitations, and resolution approaches.
    •    State the key live fire test objectives for realistic survivability or lethality testing of
         the system.
    •    Include a matrix that identifies all tests within the LFT&E strategy, their
         schedules, the issues they will address, and which planning documents will be
         submitted for DOT&E approval and which will be submitted for information and
         review only.
    •    Quantify the testing sufficiently (e.g., number of test hours, test articles, test
         events, test firings) to allow a valid cost estimate to be created.
    •    Describe the key models and simulations and their intended use.
    •    Include the LFT&E test objectives to be addressed using M&S to include
         operational test objectives. Identify data needed and the planned accreditation
         effort.
    •    Identify how the test scenarios will be supplemented with M&S.
    •    Identify who will perform M&S verification, validation, and accreditation. Identify
         M&S resource requirements in Part IV.
3.4.3. Test Limitations.
    •    Discuss any test limitations that may significantly affect the ability to assess the
         system's vulnerability and survivability.
            o Also address the impact of these limitations, and resolution approaches.
    •    Explain how and when the system will be certified safe and ready for IOT&E.
    •    Explain who is responsible for certification and which decision reviews will be
         supported using the lead Service's certification of safety and system materiel
         readiness process.
    •    List the DT&E information (i.e., reports, briefings, or summaries) that provides
         predictive analyses of expected system performance against specific COIs and
         the key system attributes - MOEs/MOSs.
    •    Discuss the entry criteria for IOT&E and how the DT&E program will address
         those criteria.
    •    Ensure the operational tests can be identified in a way that allows efficient
         DOT&E approval of the overall OT&E effort in accordance with Title 10 U.S.C.
         139(d).
    •    Describe the scope of the operational test by identifying the test mission
         scenarios and the resources that will be used to conduct the test.
    •    Summarize the operational test events, key threat simulators and/or simulation(s)
         and targets to be employed, and the type of representative personnel who will
         operate and maintain the system.
    •    Identify planned sources of information (e.g., developmental testing, testing of
         related systems, modeling, simulation) that may be used to supplement
         operational test and evaluation.
    •    Quantify the testing sufficiently (e.g., number of test hours, test articles, test
         events, test firings) to allow a valid cost estimate to be created.
    •    Describe the key models and simulations and their intended use.
    •    Include the operational test objectives to be addressed using M&S. Identify data
         needed and the planned accreditation effort.
    •    Identify how the operational test scenarios will be supplemented with M&S.
    •    Identify who will perform the M&S verification, validation, and accreditation.
    •    Identify operational M&S resource requirements in Part IV.
3.8. Reliability Growth.
    •    Summarize all remaining significant T&E that has not yet been discussed,
         extending through the system life cycle.
            o Significant T&E is that T&E requiring procurement of test assets or other
               unique test resources that need to be captured in the Resource section.
            o Significant T&E can also be any additional questions or issues that need
               to be resolved for future decisions.
            o Do not include any T&E in this section that has been previously discussed
               in this part of the TEMP.
4.1. Introduction.
    •    In this section, specify the resources necessary to accomplish the T&E program.
    •    Testing will be planned and conducted to take full advantage of existing DoD
         investment in ranges, facilities, and other resources wherever practical.
    •    Provide a list in table format (see Table 4.1), including schedule (ensure the
         list is consistent with the Figure 2.1 schedule), of all key test and evaluation
         resources, both government and contractor, that will be used during the course
         of the current increment. Include long-lead items for the next increment if known.
    •    Specifically, identify the following test resources, any shortfalls, the impact on
         planned testing, and the plan to resolve shortfalls.
    •    Identify the actual number of and timing requirements for all test articles,
         including key support equipment and technical information required for testing in
         each phase of DT&E, LFT&E, and OT&E.
             o If key subsystems (components, assemblies, subassemblies or software
                 modules) are to be tested individually, before being tested in the final
                 system configuration, identify each subsystem in the TEMP and the
                 quantity required.
    •    Specifically identify when prototype, engineering development, or production
         models will be used.
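The consistency expected between the Table 4.1 resource list and the Figure 2.1 schedule can be sketched as a simple cross-check; the event, resource, and fiscal-year names below are hypothetical.

```python
# Illustrative consistency check: every fiscal year in which a test resource is
# required should have a planned test event on the schedule. A year with a
# required resource but no event is flagged as a potential shortfall to resolve.
schedule = {            # fiscal year -> planned test events (hypothetical)
    "FY08": ["IT-B1"],
    "FY09": ["IT-B2"],
    "FY10": ["IT-C1", "OT-C1"],
}

resources = {           # resource -> fiscal years in which it is needed
    "Integration Lab": ["FY08", "FY09", "FY10"],
    "Threat simulator": ["FY10", "FY11"],   # FY11 has no scheduled event
}

shortfalls = {
    name: [fy for fy in years if fy not in schedule]
    for name, years in resources.items()
}
shortfalls = {name: fys for name, fys in shortfalls.items() if fys}
print(shortfalls)  # {'Threat simulator': ['FY11']}
```

The same idea scales to the real artifacts: the resource table and the master schedule are two views of one plan, and mismatches between them are exactly the shortfalls the TEMP asks programs to identify and resolve.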
4.1.2. Test Sites and Instrumentation.
    •    Identify the specific test ranges/facilities and schedule to be used for each type of
         testing.
    •    Compare the requirements for test ranges/facilities dictated by the scope and
         content of planned testing with existing and programmed test range/facility
         capability.
    •    Identify instrumentation that must be acquired specifically to conduct the planned
         test program.
    •    Identify test support equipment and schedule specifically required to conduct the
         test program.
    •    Anticipate all test locations that will require some form of test support equipment.
         This may include test measurement and diagnostic equipment, calibration
         equipment, frequency monitoring devices, software test drivers, emulators, or
         other test support devices that are not included under the instrumentation
         requirements.
    •    Identify the type, number, availability, fidelity requirements, and schedule for all
         representations of the threat (to include threat targets) to be used in testing.
    •    Include the quantities and types of units and systems required for each of the
         test phases. Appropriate threat command and control elements may be required
         and utilized in both live and virtual environments.
    •    The scope of the T&E event will determine final threat inventory.
    •    Specify the type, number, availability, and schedule for all test targets and
         expendables (e.g., targets, weapons, flares, chaff, sonobuoys, smoke
         generators, countermeasures) required for each phase of testing.
    •    Identify known shortfalls and associated evaluation risks.
    •    Include threat targets for LFT&E lethality testing and threat munitions for
         vulnerability testing.
    •    For each test and evaluation phase, specify the type and timing of aircraft flying
         hours, ship steaming days, and on-orbit satellite contacts/coverage, and other
         operational force support required.
    •    Include supported/supporting systems that the system under test must
         interoperate with if testing a system-of-systems or family-of-systems.
    •    Include the size, location, and type of unit required.
    •    For each test and evaluation phase, specify the models and simulations to be
         used, including computer-driven simulation models and hardware/software-in-
         the-loop testbeds.
    •    Identify opportunities to simulate any of the required support.
    •    Identify the resources required to validate and accredit their usage, responsible
         agency and timeframe.
    •    All T&E efforts must comply with federal, state, and local environmental
         regulations.
    •    Maintain current permits and appropriate agency notifications for all test
         efforts.
    •    Specify any National Environmental Policy Act documentation needed to address
         specific test activities that must be completed prior to testing and include any
         known issues that require mitigations to address significant environmental
         impacts.
    •    Describe how environmental compliance requirements will be met.
4.3. Manpower/Personnel and Training.
  Fiscal Year                                  06     07      08            09     10     11     12     TBD
  TEST EVENT                                   IT-B1  IT-B2   IT-B2/IT-C1   IT-C1  IT-C1  IT-C2  OT-C1  OT-D1
  TEST RESOURCE
  Integration Lab                              X      X       X             X      X      X
  Radar Integration Lab                        X      X       X             X      X      X
  Loads (flights)
  Operating Area #1 (flights)                         X (1)   X (1)                              X (1)  X (2)
  Operating Area #2 (flights)                         50 (1)  132 (1)       60     100    140    X (1)  X (2)
  Northeast CONUS Overland (flights)                  10                                         X (1)  X (2)
  SOCAL Operating Areas (flights)                                           X             X
  Shielded Hangar (hours)                                     160                         160
  Electromagnetic Radiation Facility (hours)                  40                          40
  Arresting Gear (Mk 7 Mod 3) (events)                                      10            10
  NAS Fallon                                                                5      5      A/R    X (1)  X (2)
  Link-16 Lab, Eglin AFB                                                                         X
  NAWCAD WD, China Lake Range                                                                    X
  Eglin AFB ESM Range                                                                            X
1. Explanations as required.
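A resource matrix like the one above lends itself to a simple machine-readable form, which makes schedule queries easy to automate. The following is an illustrative sketch only; the resource names and quantities are hypothetical examples, not drawn from any program.

```python
# Hypothetical sketch: capture a test resource matrix as a mapping from
# resource name to the fiscal years in which it is required.
resource_schedule = {
    "Integration Lab": {"06": "X", "07": "X", "08": "X", "09": "X", "10": "X", "11": "X"},
    "Shielded Hangar (hours)": {"08": 160, "11": 160},
    "NAS Fallon": {"09": 5, "10": 5, "11": "A/R"},
}

def resources_needed_in(fy: str) -> list[str]:
    """Return the test resources with a requirement in the given fiscal year."""
    return [name for name, years in resource_schedule.items() if fy in years]

print(resources_needed_in("11"))
```

A planner could extend the same structure with quantities and footnote markers to flag, for example, every resource a slip from FY11 to FY12 would affect.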
9.5.5.4. Other Milestone TEMPs and Updates
An updated TEMP is required as part of the entry criteria for each acquisition phase,
and at any time a major programmatic change occurs. For example, an updated TEMP
may be required because of a configuration change resulting from a CDR, a change to
the acquisition strategy, or changes to capability requirements.
9.5.6. Contractual
All contract preparation documents (RFP, statement of work) and contract documents
(contract, Contract Data Requirements List (CDRL)) are to identify contractor
requirements for conducting DT&E, and supporting government DT&E, OT&E, and
LFT&E events. At a minimum, contract documents should provide for data rights to
contractor performed DT&E, identification of M&S to be used, and the V&V
methodology to be used. For more information, see the OSD guide "Incorporating Test
and Evaluation into Department of Defense Acquisition Contracts," dated October 2011.
    •    PEO/Deputy PEO
    •    PM (ACAT I, IA and II)
    •    Deputy PM (DPM) (ACAT I)
    •    Senior Contracting Official
    •    MDAP/MAIS positions (ACAT I and IA) when the function is required based on
         the phase or type of acquisition program:
            o Program Lead SE
            o Program Lead Cost Estimator
            o Program Lead Contracting Officer
            o Program Lead Logistician (Product Support Manager)
            o Program Lead Business Financial Manager
            o Program Lead T&E
            o Program Lead Production, Quality, and Manufacturing
            o Program Lead IT
In general, the Service/Defense Agency should fill the "program lead" positions with
military members at the lieutenant colonel/colonel or commander/Navy captain levels or
by the civilian equivalent. Program leads advise the PM/DPM and may be matrixed to
the program office. Although program leads may report to a higher-level functional (i.e.,
command/center functional lead or his or her direct report), these positions must be
designated as KLPs. Program lead KLPs must be designated in the position category
associated with the lead function. For example, "lead logistician" positions must be
designated as positions in the "Life Cycle Logistics" position category.
For programs on the OSD T&E Oversight List for OT&E, the DoD CAE is required to
evaluate and determine materiel system readiness for IOT&E. The intent of this
requirement is to ensure systems do not enter IOT&E before they are sufficiently
mature. Scarce resources are wasted when an IOT&E is halted or terminated early
because of technical problems with the System Under Test (SUT); problems that should
have been resolved prior to the start of IOT&E.
The readiness determination should be based on capabilities demonstrated in DT&E
and earlier OAs. As outlined in DoDI
5000.02, Enclosure 6, paragraphs 4.b and 4.c, a DT&E report of results and the
progress assessment shall be provided to the DASD(DT&E) and the DOT&E prior to the
AOTR. That report can be a written document or a briefing to the DASD(DT&E) and
DOT&E representatives, and should include the following: an analysis of the system's
progress in achieving CTPs, satisfaction of approved IOT&E entrance criteria, a
technical risk assessment, the level of software maturity and status of software trouble
reports, and predicted IOT&E results, including the impacts of any shortcomings on the
system's expected performance during IOT&E. Provide the report at least 20 days prior
to the CAE's determination of system readiness. This will allow OSD time to formulate
and provide its recommendation to the CAE. All appropriate developmental and
operational T&E organizations should be invited to the IOT&E readiness review.
The goal of the AOTR is to assess the risk associated with the system’s ability to meet
operational suitability and effectiveness goals, identify system and subsystem maturity
levels, assess programmatic and technical risk, and provide risk mitigation
recommendations. The results of the AOTR will be provided to the USD(AT&L),
DOT&E, and CAE. As outlined in DoD Instruction 5000.02, Enclosure 6, paragraphs 4.b
and 4.c, the CAE shall consider the results of the AOTR prior to making a determination
of materiel readiness for IOT&E.
Programs on the OSD T&E Oversight List report to the appropriate OSD oversight
organization(s) on a periodic or event-driven basis. Reports are required from the
program office, the proposed lead DT&E Organization, and the lead OTA to assist OSD
in preparation for the Milestone Decision Authority (MDA) review of system
development and operational progress and risk, and for congressionally mandated
annual reports.
The risk associated with a Milestone B decision should be based on reports to the
DASD(DT&E) and the DOT&E to permit assessments from the TD Phase of: (1)
technology maturity, (2) performance of Critical Technology Elements (CTEs) in meeting
CTPs or other performance parameter thresholds, and (3) adequacy of execution of the
test plan submitted for the TD Phase. The assessment (of TRLs for all CTEs) will be
based on objective evidence gathered during events such as tests, demonstrations,
pilots, or physics-based simulations. Based on the requirements, identified capabilities,
system architecture, software architecture, CONOPS, and/or the concept of
employment, the IRT (Integrated Requirements Team) will define operationally relevant
environments and determine which TRL is supported by the objective evidence. This
metric would evaluate the adequacy of the test/demonstration approach used for
determining the CTPs for each CTE; i.e., the confidence the DASD(DT&E) has that the
CTE was appropriately stressed and the TRL was accurately assessed. This confidence
will be based on a number of factors assessed by comparing test and/or evaluation
reports with the approved TEMPs. Some of those factors may include adequacy of:
Development of an OSD position on the risk of a Milestone C approval for initiating the
Production and Deployment (P&D) Phase should be based on: (1) the DT&E results
from the preceding EMD phase, including consideration of how thoroughly the system
was stressed during EMD (mission-oriented context and operationally realistic
environments); and (2) adequacy of the DT&E planning for the remaining P&D phase.
EMD phase DT results and evaluations extracted from DT&E reports, OA results if the
OTA conducted one, and action officer observations from monitoring EMD phase DT&E
and participating in Program Support Reviews (PSRs), WIPT meetings, test readiness
reviews, and data analysis working group meetings provide the basis for assessing
whether Milestone C entrance criteria were met. Reporting should permit OSD to
determine the adequacy of the TEMP the PM submits for Milestone C, demonstrate
knowledge of the mission and operating environment requirements and of both T&E
infrastructure capabilities (including threat surrogates) and the projected threat at the
time of program IOC, and provide the basis for assessing the adequacy of P&D phase
DT&E planning. The assessment based on DT&E results should speak directly to the
maturity of the system being developed and its readiness to advance to the P&D phase;
the assessment based on P&D Phase DT&E planning speaks directly to the adequacy
of the planned DT&E to deliver a system that will succeed in IOT&E, and for assessing
and articulating the risk associated with an acquisition program proceeding into LRIP
and the P&D phase.
Reporting should demonstrate, based on the DT&E and OA results of EMD, the degree
of compliance for:
    •    Acceptable performance in DT&E and OA
    •    Mature software capability
    •    Acceptable interoperability
    •    Acceptable operational supportability
    •    IA certification and acceptance
The shift away from point-to-point system interfaces to net-centric interfaces brings
implications for the T&E community. The challenge to the test community will be to
represent the integrated architecture in the intended operational environment for test.
Furthermore, the shift to net-centric capabilities will evolve gradually, no doubt with
legacy point-to-point interfaces included in the architectures. PMs, with PEO support,
are strongly encouraged to work with the operating forces to integrate operational
testing with training exercises, thereby bringing more resources to bear for the mutual
benefit of both communities. It is imperative that the T&E community engage the user
community to ensure that test strategies reflect the intended operational and
sustainment/support architectures and interfaces within which the intended capabilities
are to be tested and evaluated.
For T&E, the appropriate application of M&S is an essential tool in achieving both an
effective and efficient T&E program. T&E is conducted in a continuum of Live, Virtual,
Constructive (LVC) environments. DoD Components have guidelines for use of M&S in
acquisition, especially T&E. These guidelines are intended to supplement other
resources. The PM should have an M&S subgroup to the T&E WIPT that develops the
program's M&S strategy, which should be documented in the program's SEP and the
TES / TEMP. Some DoD components require planning for M&S to be documented in a
separate M&S Support Plan. This M&S strategy will be the basis for program
investments in M&S. M&S should be planned for utility across the program's life cycle,
and modified and updated as required to ensure utility and applicability to all
increments of an evolutionary acquisition strategy. A program's T&E strategy should
leverage the advantages of M&S. M&S planning should address which of many possible
uses of M&S the program plans to execute in support of T&E. M&S can be used in
planning to identify high-payoff areas in which to apply scarce test resources.
Rehearsals using M&S can help identify cost effective test scenarios and reduce risk of
failure. During the conduct of tests, M&S might provide adequate surrogates for
stimulation when it is impractical or too costly to use real-world assets. This is
particularly likely for capability testing, testing a system that is part of a
system-of-systems, hazardous tests, tests in extreme environments, or testing the
system's supportability. M&S can be used in post-test analysis to help
provide insight and for interpolation or extrapolation of results to untested conditions.
To address the adequacy and use of M&S in support of the testing process the program
should involve the relevant OTA in planning M&S to ensure support for both DT and OT
objectives. This involvement should begin early in the program's planning stages.
An initial goal for the T&E WIPT is to assist in developing the program's M&S strategy
by helping integrate the program's M&S with the overall T&E strategy; plan to employ M&S
tools in early designs; use M&S to demonstrate system integration risks; supplement
live testing with M&S stressing the system; and use M&S to assist in planning the scope
of live tests and in data analysis.
Another goal for the T&E WIPT is to develop a T&E strategy identifying ways to
leverage program M&S which could include how M&S will predict system performance,
identify technology and performance risk areas, and support in determining system
effectiveness and suitability. For example, M&S should be used to predict sustainability
or KSA drivers. The T&E WIPT should encourage collaboration and integration of
various stakeholders to enhance suitability (see section 5.2.3 ).
A philosophy for the interaction of T&E and M&S is the model-test-fix-model approach: use
M&S to provide predictions of system performance, operational effectiveness,
operational suitability, and survivability or operational security and, based on those
predictions, use tests to provide empirical data to confirm system performance and to
refine and further validate the M&S. This iterative process can be a cost-effective
method for overcoming limitations and constraints upon T&E. M&S may enable a
comprehensive evaluation, support adequate test realism, and enable economical,
timely, and focused tests.
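The model-test-fix-model cycle above can be sketched as a simple iteration: the model predicts performance, a test supplies an empirical value, and the model is recalibrated until prediction and observation agree within tolerance. This is an illustrative sketch only; the model, the test result, and all numbers are hypothetical stand-ins.

```python
# Hypothetical sketch of the model-test-fix-model iteration.
def model_predict(bias: float) -> float:
    true_performance = 0.82          # notional performance value
    return true_performance + bias   # prediction error shrinks as bias is corrected

def live_test() -> float:
    return 0.82                      # notional empirical result from a test event

def model_test_fix_model(tolerance: float = 0.01, max_cycles: int = 10) -> int:
    """Return the number of model-test-fix cycles needed to converge."""
    bias = 0.15                      # initial model error
    for cycle in range(1, max_cycles + 1):
        predicted = model_predict(bias)
        observed = live_test()
        if abs(predicted - observed) <= tolerance:
            return cycle             # model agrees with test data within tolerance
        bias *= 0.5                  # "fix": recalibrate the model with test data
    return max_cycles

cycles = model_test_fix_model()
```

In practice the "fix" step is a full recalibration and revalidation of the M&S against the new empirical data, but the control flow of the iterative process is the same.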
All M&S used in T&E must be accredited by the intended user (PM or OTA).
Accreditation can only be achieved through a rigorous VV&A process as well as an
acknowledged willingness by the user to accept the subject M&S for their application
requirements. Therefore, the intended use of M&S should be identified early so
resources can be made available to support development and VV&A of these tools. The
OTA should be involved early in this process to gain confidence in the use of M&S and
possibly use them in support of OT. DoDI 5000.61 , "DoD Modeling and Simulation
(M&S) Verification, Validation, and Accreditation (VV&A)," dated December 9, 2009,
provides further guidance on VV&A.
The following is provided to help the M&S subgroup to the T&E WIPT think through the
planning process to best incorporate M&S into the testing process. Additional guidance
for M&S is available in section 4.5.8 .
              o  Synthetic countermeasures, Testbeds, environments, and battlespaces;
              o  M&S whether embedded in weapon systems, implemented as stand-alone
                 systems, or integrated with other distributed simulations; and
             o Test assets, test planning aids, and post-test analysis tools that address
                 other than real time characteristics.
    •    Infrastructure needed to conduct the test(s), to include networks, integration
         software, and data collection tools:
             o Provide descriptive information for each M&S resource:
                     Title, acronym, version, date;
                     Proponent (the organization with primary responsibility for the
                        model or simulation);
                     Assumptions, capabilities, limitations, risks, and impacts of the
                        model or simulation;
                     Availability for use to support T&E; and
                     Schedule for obtaining.
    •    Identify the M&S data needed to support T&E:
             o Describe the input data the M&S needs to accept;
             o Describe the output data the M&S should generate;
             o Describe the data needed to verify and validate the M&S; and
             o Provide descriptive information for each data resource:
                     Data title, acronym, version, date;
                     Data producer (organization responsible for establishing the
                        authority of the data);
                     Identify when, where, and how data was or will be collected;
                     Known assumptions, capabilities, limitations, risks, and impacts;
                     Availability for use to support T&E; and
                     Schedule for obtaining.
    •    For each M&S and its data, describe the planned accreditation effort based on
         the assessment of the risk of using the model and simulation results for decisions
         being made:
             o Explain the methodology for establishing confidence in the results of M&S;
             o Document historical source(s) of VV&A in accordance with DoDI 5000.61 ;
                 and
             o Provide the schedule for accreditation prior to their use in support of T&E.
    •    Describe the standards (both government and commercial) with which the M&S
         and associated data must comply; for example:
             o IT standards identified in the DoD IT Standards Registry (DISR);
             o Standards identified in the DoD Architecture Framework Technical
                 Standards Profile (TV-1) and Technical Standards Forecast (TV-2);
             o M&S Standards and Methodologies;
             o Data standards; and
             o VV&A standards:
                      IEEE Std 1516.4-2007, IEEE Recommended Practice for VV&A
                         of a Federation - An Overlay to the High Level Architecture
                         Federation Development and Execution Process;
                      IEEE Std 1278.4-1997 (R2002), IEEE Recommended Practice
                             for Distributed Interactive Simulation - VV&A;
                            MIL-STD-3022 , DoD Standard Practice for Model & Simulation
                             VV&A Documentation Templates, dated January 28, 2008.
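The descriptive information the bullets above call for on each M&S resource can be tracked as a simple record, so the M&S subgroup can audit completeness before VV&A begins. This is an illustrative sketch; the record type, field names, and example values are hypothetical, not prescribed by the DAG.

```python
# Hypothetical sketch: a record of the descriptive information to gather
# for each M&S resource, with a completeness check.
from dataclasses import dataclass, fields

@dataclass
class MSResourceRecord:
    title: str
    acronym: str
    version: str
    date: str
    proponent: str               # organization with primary responsibility
    limitations: str             # assumptions, capabilities, limitations, risks
    availability: str            # availability for use to support T&E
    schedule_for_obtaining: str

    def missing_fields(self) -> list[str]:
        """Names of descriptive fields that are still blank."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]

rec = MSResourceRecord(
    title="Joint Mission Testbed", acronym="JMT", version="2.1",
    date="2013-09-16", proponent="", limitations="fixed-wing only",
    availability="FY12", schedule_for_obtaining="",
)
print(rec.missing_fields())  # fields still needing data before accreditation planning
```

The same pattern extends naturally to the data-resource bullets (data title, producer, collection provenance) by adding a second record type.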
M&S is an essential tool for achieving both an effective and efficient T&E program. T&E
should be conducted in a continuum of LVC environments throughout a system's
acquisition process. DoD Components have guidelines for the use of M&S in
acquisition, especially T&E. The PM should have an M&S subgroup to the T&E WIPT
that develops the program's M&S strategy, which should be documented in the
program's SEP and the TES / TEMP or in a separate M&S Support Plan.
M&S can be used in test planning to identify high-payoff areas in which to apply scarce
test resources, and in dry-running a test to assess the sensitivity of test variables to the
response variable being used, and to evaluate system operational effectiveness,
operational suitability, or survivability or operational security. During the conduct of tests,
M&S can provide surrogates for stimulation when it is impractical or too costly to use
real-world assets. This is particularly likely for capability testing, testing a system that is
part of a system-of-systems, hazardous tests, tests in extreme environments, or testing
the system's supportability. M&S can be used in post-test analysis to help provide
insight, and for
interpolation or extrapolation of results to untested conditions.
To ensure test adequacy, OT should only incorporate validated and accredited threat
representations unless coordinated with DOT&E.
    •    A key to the success of the IPT is to involve DOT&E and its Threat Systems Office (TSO) early and
         continuously throughout the validation process. DoD Component organizations
         responsible for conducting threat representation validation should notify DOT&E
         of their intent to use an IPPD process and request DOT&E/TSO representation at
         meetings and reviews, as appropriate. The DOT&E representative will be
         empowered to provide formal concurrence or non-concurrence with these
         validation efforts as they are accomplished. After the IPPD process, DOT&E will
         issue an approval memorandum, concurring with the threat representation
         assessment.
    •    When a WIPT is not used, draft threat representation validation reports should be
         forwarded to the TSO for review. The TSO will provide recommendations for
         corrections, when necessary. Final reports are then submitted by the TSO for
         DOT&E approval.
    •    DOT&E approval confirms that an adequate comparison to the threat has been
         completed. It does not imply acceptance of the threat test asset for use in any
         specific test. It is the responsibility of the OTA to accredit the test resource for a
         specific test and for DOT&E to determine if the threat test resource proves
         adequate.
Testing in a mission-oriented context will also allow the OTA to participate earlier in the
development cycle and use the results of integrated tests to make operational
assessments. Integrated planning of tests is a key element in this process. This allows
the data to be used by the developmental community to better predict system
performance and allows the OTA to potentially reduce the scope of IOT&E while still
providing an adequate evaluation of the COIs .
9.7.5. Testing in a Joint Operational Environment
The phrase testing in a joint environment originated in the U.S. Department of Defense
2006-2011 Strategic Planning Guidance for Joint Testing in Force Transformation. It
refers to testing military systems as participating elements in overarching joint SoS. This
testing in a joint operational environment initiative supports the department's long-term
strategy to test as it fights. Joint operations have become the mainstay of warfighting.
Force transformation will require the T&E community to place a greater emphasis on
testing joint war fighting capabilities developed in response to the JCIDS process.
Future T&E must ensure combatant commanders can rely on equipment to operate
together effectively without introducing problems to warfighters. For a detailed
discussion of the changes needed to bring about this vision of T&E, see the DepSecDef's
Testing in a Joint Environment Roadmap, dated November 12, 2004. The proposals in
this roadmap provide important enablers both for acquiring new systems created as joint
from the outset and for testing legacy equipment and systems that are made joint.
The Joint Mission Environment (JME) is defined as, "a subset of the joint operational
environment composed of force and non-force entities; conditions, circumstances and
influences within which forces employ capabilities to execute joint tasks to meet a
specific mission objective". It describes the expected operating environment of the
system (or system of systems) under test, and includes all of the elements that
influence the required performance the new capability must demonstrate. These include
the particular mission requirements in which the system is being employed; physical
factors such as the blue and opposing force structures; geographic and demographic
aspects of the joint operating area, etc., as well as the interactions between these
elements.
To be successful, testing in the JME cannot be a new step added at the end of
operational T&E, nor can it replace current DT or OT. It does, however, represent a
departure from the way DoD acquisition professionals plan and execute systems
engineering, DT&E, and OT&E, and indeed the entire acquisition process. Testing in a JME
involves the appropriate combination of representative systems, forces, threats and
environmental conditions to support evaluations. These representations can be LVC, or
distributed combinations thereof.
Testing in a JME applies throughout the life cycle of the system. Identification of a joint
issue/problem early in a system's life (including as early as the conceptual phase) will
reduce costs and issues. This applies to evaluating system performance, or how well
the system does what it is designed to do, as well as the system's contribution to the
joint mission, or how DoD employs the system to achieve the mission. A system's
interaction with the JME is evaluated along an evaluation continuum using constructive
and virtual representations and live systems in various combinations.
The JME and associated joint capability requirements will be defined in the ICD, CDD,
and the CPD. The evaluation plans for assessing these requirements will be articulated
in the SEP and the TES at Milestone A. At the pre-EMD Review, evaluation plans for
assessing these requirements will be articulated in the Pre-EMD draft documents (SEP,
TEMP, and ISP). At Milestones B and C, they will be articulated in the SEP, TEMP, and
ISP. For each case, the selection of LVC systems that will be used to recreate the JME
to support testing will depend on the purpose of the assessment and on the interactions
the SUT will have with other elements in the JME.
This section also briefly addresses some additional areas as outlined in the Testing in a
Joint Environment Methods and Processes (M&P) Implementation Plan originally
produced by the M&P Working Group that was formed during the summer of 2004 to
address testing in a joint environment. The areas of concern outlined below are: (1)
Description of Joint Mission Environments, (2) How to use the Joint Mission
Environment, (3) Testing in a Joint Mission Environment Program Management Office
Support, and (4) Important Acquisition Program Responsibilities.
JCIDS will create requirements for effects and capabilities at the joint mission level.
This means JCIDS will identify desired mission level effects that are shortfalls. Shortfalls
are addressed by materiel and non-materiel solutions. Materiel or possible system (for a
new/modified system or SoS) KPPs are then proposed to provide the desired mission
level effect(s). Because of this, systems development should not begin and testing
cannot occur without definition(s) of the JME and a defined joint mission associated with
a shortfall to be addressed by a system or systems.
With respect to obtaining information for selected joint missions, users of the joint
environment can start with the universal joint planning process to break down missions,
but it is a process that starts at the Universal Joint Task List (UJTL) level and extends
down to the COCOM level to plan joint task force operations and/or training events.
However, this level of "fidelity" may not be available at the JCIDS ICD/CDD/CPD level
because it is mission specific at the COCOM or Joint Task Force level.
The joint mission descriptions should set the stage for evaluation of a system (or
systems) within a joint mission area and provide testers what they need to plan the test.
Certain elements of the joint mission description are essential to plan, execute, and
analyze assessments and T&E throughout a system's acquisition process.
Additionally, users of the joint environment determine and obtain representations for the
threat, threat composition and disposition, and threat scheme of maneuver appropriate
for the selected joint mission/task. The currently approved Guidance for the
Development of the Force (GDF) scenarios and/or the maturing Defense Planning
Scenarios will provide the source of this information. There is also a Threat Scenarios
Group from the U.S. Army Test & Evaluation Office working threat scenarios. In
addition, coordination with the Service intelligence agencies and the DIA is critical. The
threat must be system specific (specific to the platform under examination) and also
mission specific (specific to the joint mission examined). The next step (after
identification of the threat scenarios) is to determine what should be used to represent
the threat, which can be an LVC representation.
Different Services should be consulted depending on the type of model needed for test,
as the Services have generally focused their modeling efforts on their usual areas of
operations. The Army and/or the National Geospatial-Intelligence Agency are the
best sources for all terrain models. The Navy is the best source for all oceanographic
(surface and subsurface) models, and the Air Force is the best source for air and space
models. DoD M&S responsibilities are delineated in DoDD 5000.59 , DoD Modeling and
Simulation (M&S) Management, dated August 8, 2007, and there are M&S Executive
Agents with responsibilities defined by the DMSO. There should also be a standard set
of environment/background models established for the JME.
Systems engineering and testing will require insertion of concepts and systems into the
JME as a standard part of the acquisition process. Since this is a change of scope for
previous assessments and tests, a process for how to use the joint mission environment
needs to be established.
The ultimate goal for systems engineering and testing in a joint environment is the
ability to insert any system into the applicable JME at any time during the life of a
system. Two basic items will be examined through insertion into the JME. The first item
is to ensure the systems to be acquired are interoperable with other systems. This
includes not only how they interact and communicate as expected and required, but
also understanding SoS dependencies. The second item goes beyond the system
interaction and communications to examine what value the systems add to joint military
capabilities. In other words, the second item is to assess the contribution of the system
to mission success.
Interoperability and contribution should be examined each time a system is inserted into
the JME, including times when substantive changes or upgrades are made to an
individual system. Users can determine which joint mission/task(s) to test for a system
with a role in multiple missions.
Selection of the most stressing mission(s) and/or the mission(s) with the most
interactions appears to be the most defensible approach. Test authorities must ensure
that if another required mission involves a system interaction not included in the "most
stressing" mission, the interaction is tested separately. Examining different joint
missions as the system progresses through the acquisition process is also a good
approach especially if there appear to be multiple stressing missions. Another option is
to consult with the intended joint users (COCOM & Service Combatant) and have them
define representative mission tasks.
Scheduling all of the assets in the JME, especially live assets participating in exercises,
will prove a complex undertaking. A management and scheduling capability must exist,
and it is assumed the PM will establish a JME PMO (or equivalent) for this purpose. The
JME PMO will coordinate all LVC assets and the script of events, which is the plan for
the specific JME missions incorporating acquisition systems under test in accordance
with their schedules. Note that acquisition systems tend to have fixed decision points
where unplanned delays could severely impact production. Finally, with a complex
facsimile of a mission environment in place and acquisition systems scheduled to
perform missions within it, additional programs may ask to "join in" the scheduled
events, for testing, training exercises, or other special events. This is encouraged, but
the testing needs of the sponsoring program must of course take precedence over the
needs of other participants, and their participation should not interfere with the core
purpose of the JME events.
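The scheduling problem described above amounts to detecting double-bookings of shared LVC assets. The sketch below is a minimal, hypothetical illustration of that check (the data model and asset names are assumptions for illustration, not drawn from any actual JME PMO scheduling tool):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Reservation:
    asset: str   # LVC asset, e.g., a live aircraft sortie or a virtual simulator
    start: int   # event start, in whole hours from the schedule epoch
    end: int     # event end (exclusive)


def conflicts(reservations):
    """Return pairs of reservations that book the same asset in overlapping windows."""
    found = []
    by_asset = {}
    for r in reservations:
        by_asset.setdefault(r.asset, []).append(r)
    for asset, rs in by_asset.items():
        rs.sort(key=lambda r: r.start)
        # After sorting by start time, rs[j] overlaps rs[i] (i < j)
        # exactly when rs[j] begins before rs[i] ends.
        for i in range(len(rs)):
            for j in range(i + 1, len(rs)):
                if rs[j].start < rs[i].end:
                    found.append((rs[i], rs[j]))
    return found
```

A real JME PMO schedule would also have to honor the fixed decision points noted above, but even this simple overlap check captures the core coordination task.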
9.8. Best Practices
The T&E of IA requirements is an integral part of the overall T&E process. DoDI
5000.02 , Operation of the Defense Acquisition System, dated December 8, 2008,
directs that IA T&E be conducted during both DT&E and OT&E. To ensure IA testing
adequately addresses system IA requirements, the PM must consider IA requirements
that protect and defend information and information systems by ensuring their
availability, integrity, authentication, confidentiality, and non-repudiation. This includes
providing for restoration of information systems by incorporating protection, detection,
and reaction capabilities. DoDI 8500.02 , Information Assurance (IA) Implementation,
dated February 6, 2003, specifies baseline IA controls for DoD systems. PMs should
ensure adequate testing of all applicable IA controls prior to testing in an operational
environment or with live data, except for those programs requiring testing in an
operational environment. In consultation with the PM or Systems Manager, the
Designated Approving Authority (DAA) determines which programs require testing of IA
controls in an operational environment. In addition to baseline IA controls, some
capabilities documents (e.g., ICD, CDD, and CPD) may also specify unique IA
requirements, such as a specific level of system availability. PMs may also identify
additional IA requirements as a result of the risk management process, or as directed by
the DoD Components. They should also consider the impact of the DoD Information
Assurance Certification and Accreditation Process (DIACAP) on the system's overall
T&E cost and schedule.
Significant C&A activities and events should be visible on the integrated test schedule to
ensure appropriate coordination of events. The DoD Component IA program regularly
and systematically assesses the IA posture of DoD Component-level information
systems, and DoD Component-wide IA services and supporting infrastructures, through
combinations of self-assessments, independent assessments and audits, formal testing
and certification activities, host and network vulnerability or penetration testing, and IA
program reviews. The planning, scheduling, conducting, and independent validation of
conformance testing should include periodic, unannounced in-depth monitoring and
provide for specific penetration testing to ensure compliance with all vulnerability
mitigation procedures, such as the DoD information assurance and vulnerability
assessment or other DoD IA practices. Testing ensures the system's IA capabilities
provide adequate assurance against constantly evolving threats and vulnerabilities.
PMs should consider the re-use and sharing of information to reduce rework and cycle
time. DoD memorandum for establishing DoD Information System Certification and
Accreditation Reciprocity , dated June 11, 2009, mandated a mutual agreement among
participating enterprises to accept each other’s security assessments in an effort to
reuse IS resources and/or accept each other’s assessed security posture for the timely
deployment of IS critical to attaining the Department's strategic vision of Net-Centricity.
Additionally, DOT&E memorandum, Procedures for Operational Test and Evaluation
(OT&E) of Information Assurance in Acquisition Programs , dated January 21, 2009,
contains the OT&E strategy for IA assessment, addressing the test process,
identification of required IA test resources and funding, and a reference to the
appropriate threat documentation. For more information, see DAG Section 7.5 .
All IT & NSS must undergo joint interoperability testing and evaluation for certification
prior to fielding, in accordance with section 2223 of Title 10 USC , DoDI 5000.02 , DoDD
4630.05 , Interoperability and Supportability of Information Technology (IT) and National
Security Systems (NSS), dated April 23, 2007, DoDI 4630.8 , Procedures for
Interoperability and Supportability of Information Technology (IT) and National Security
Systems (NSS), dated June 30, 2004, CJCSI 3170.01H , and CJCSI 6212.01F ,
Interoperability and Supportability of Information Technology and National Security
Systems, dated March 21, 2012. This includes IT & NSS compliance with technical
standards, Net-Ready Key Performance Parameters (NR-KPP), solution architectures,
and spectrum supportability requirements. Interoperability compliance with joint
interoperability test certification requirements remains a continuous process throughout
a system's life cycle. JITC bases a joint interoperability test certification on test and
evaluation results from operationally realistic test configurations as well as joint and
coalition environments. It then provides input to the MDA and PM for a fielding decision.
The PM must plan, program, budget, execute and provide resources according to
agreed-to costs, schedules, and test plans. Interoperability requirements impact a
program's schedule and costs, so PMs must provide adequate time and funding for
Interoperability and Supportability (I&S), NR-KPP, test certification, and Spectrum
Supportability Risk Assessments (SSRA). Additional information can be found in
Chapter 7.6.4.
Early identification and resolution of interoperability issues minimizes negative impact to
the joint, multi-national, interagency, and Warfighter community. Interoperability testing
of all IT & NSS follows the NR-KPP development process. Net-ready attributes
determine specific, measurable, and testable criteria for interoperability and operationally
effective end-to-end information exchanges. The NR-KPP identifies operational, net-
centric requirements with threshold and objective values that determine its measure of
effectiveness (MOE) and measure of performance (MOP). Architectures provide a
foundation to effectively evaluate the probability of interoperability and net-centricity.
The NR-KPP covers all communication, computing, and electromagnetic spectrum
requirements involving information elements among producer, sender, receiver, and
consumer. Information elements include the information, product, and service
exchanges. These exchanges enable successful completion of the Warfighter mission
or joint business processes. The NR-KPP is a mandatory KPP for all program
increments.
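Because NR-KPP attributes carry threshold and objective values, scoring a measured MOP or MOE against them reduces to a comparison. The sketch below is a hypothetical illustration of that logic (the function name and parameters are assumptions for illustration, not part of the NR-KPP definition); the `higher_is_better` flag covers measures such as latency, where lower values are better:

```python
def rate_measure(measured, threshold, objective, higher_is_better=True):
    """Classify a measured MOP/MOE value against its threshold and objective values."""
    if not higher_is_better:
        # Negate all values so the "larger is better" comparisons below apply.
        measured, threshold, objective = -measured, -threshold, -objective
    if measured >= objective:
        return "meets objective"
    if measured >= threshold:
        return "meets threshold"
    return "below threshold"
```

For example, a measured message-completion rate of 0.98 against a threshold of 0.95 and an objective of 0.99 would rate "meets threshold."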
JITC acts as the DoD organization responsible for joint interoperability testing and net-
readiness certifications. Statute requires JITC to provide a system Net-Ready
certification evaluation memorandum to the Director, Joint Staff J-8, throughout the
system life cycle and regardless of acquisition category. Based on net-readiness
evaluations and other pertinent factors, the Joint Staff J-8 issues a Net-Ready System
Certification memorandum to the respective DoD Components as well as
developmental and operational test organizations in support of the FRP Decision
Review. JITC collaborates with the PM and lead DT&E organization during development
of the TEMP, recommending interoperability T&E measures to ensure I&S testing
satisfies all requirements during DT&E, OT&E, or IA T&E events. PMs should include
JITC as a member of the T&E WIPT and ensure they participate in TEMP development.
JITC's philosophy leverages test results from planned test events or exercises to
generate the necessary data for joint test and net-ready certifications; combining
valuable resources, eliminating redundancy, and ultimately ensuring one test. JITC
evaluates the operational effectiveness of information exchanges using joint mission
threads in an operational environment. JITC establishes processes to ensure
operational tests include operationally mission-oriented interoperability assessments
and evaluations using common outcome-based assessment methodologies to test,
assess, and report on the impact interoperability and information exchanges have on a
system's effectiveness and mission accomplishment for all acquisitions, regardless of
ACAT level.
    •    The T&E strategy should address evaluation of the highest-risk technologies in
         system design and areas of complexity in the system software architecture. The
         strategy should identify and describe:
             o Required schedule, materiel and expertise,
             o Software evaluation metrics for Resource Management, Technical
                Requirements and Product Quality, including Reliability,
             o Types and methods of software testing to support evaluation in unit,
                integration and system test phases across the life cycle,
             o Data and configuration management methods and tools,
             o Models and simulations supporting software T&E including accreditation
                status.
    •    A defined T&E process consistent with and complementing the software and
         system development, maintenance and system engineering processes,
         committed to continuous process improvement and aligned to support project
         phases and reviews, including an organizational and information flow hierarchy.
    •    Software test planning and test design initiated in the early stages of functional
         baseline definition and iteratively refined with T&E execution throughout allocated
         baseline development, product baseline component construction and integration,
         system qualification and in-service maintenance.
    •    Software T&E embedded with and complementary to software code production
         as essential activities in actual software component construction, not planned
         and executed as follow-on actions after software unit completion.
    •    Formal planning when considering reuse of COTS or GOTS, databases, test
         procedures and associated test data that includes a defined process for
         component assessment and selection, and T&E of component integration and
         functionality with newly constructed system elements.
    •    The following link provides additional information:
             o The Handbook of Software Reliability Engineering , published by IEEE
                Computer Society Press and McGraw-Hill Book Company (specifically,
                Chapter 13 ).
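The reliability metrics cited above can be illustrated with a minimal sketch assuming an exponential (constant failure rate) model, one of the simplest models in software reliability engineering; the function names and values are illustrative, not drawn from the handbook:

```python
import math


def mtbf(operating_hours, failure_count):
    """Point estimate of mean time between failures from observed test data."""
    if failure_count == 0:
        raise ValueError("no failures observed; use a confidence-bound method instead")
    return operating_hours / failure_count


def reliability(mission_hours, mtbf_estimate):
    """Probability of completing a mission of the given length without failure,
    under an exponential (constant failure rate) model: R(t) = exp(-t / MTBF)."""
    return math.exp(-mission_hours / mtbf_estimate)
```

For instance, 4 failures over 1,000 test hours give an MTBF estimate of 250 hours, and the chance of completing a 25-hour mission without failure under this model is exp(-25/250), roughly 0.90.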
Medical devices and systems must comply with the SEP in terms of Health Insurance
Portability and Accountability Act (HIPAA) and DIACAP information protection
procedures and measures. These procedures and measures ensure the software
complies with the security standards specified in the Health Insurance Portability and
Accountability Act of 1996 (Public Law 104-191) as well as Subtitle D of the Health
Information Technology for Economic and Clinical Health (HITECH) Act, Title VIII of
Division A and Title IV of Division B of the American Recovery and Reinvestment Act of
2009 (Public Law 111-5). Most medical devices will require IM/IT testing and validation
of information security protocols. Given that requirement, programs should start test
planning as early as possible. Programs must also validate FDA clearance prior to any
medical software implementation.
9.7.9. Post Implementation Review (PIR)
Subtitle III of Title 40 of the United States Code (formerly known as Division E of the
Clinger-Cohen Act) requires that Federal Agencies ensure that outcome-based
performance measurements are prescribed, measured, and reported for IT (including
NSS) programs. DoDI 5000.02 requires that PIRs be conducted for MAIS and MDAP
programs in order to collect and report outcome-based performance information. The
T&E community will participate in the planning, execution, analysis, and reporting of
PIRs, whose results will be used to confirm the performance of the deployed systems
and possibly to improve the test planning and execution for follow-on increments or
similar systems. For further information, refer to the Acquisition Community Connection
or Chapter 7 .
SoS testing can result in unexpected interactions and unintended consequences. T&E
of SoS must not only assess performance to desired capability objectives, but must also
characterize the additional capabilities or limitations due to unexpected interactions. The
SoS concept should include the system in the broadest sense, from mission planning to
sustainment. SoS is a new and evolving area for development, acquisition, and T&E.
For further information refer to the Systems Engineering Guide for Systems of Systems
, dated August 2008.
Operational Test and Evaluation adequacy encompasses both test planning and test
execution. Considerations include the following:
    •    Realistic combat-like conditions
    •    Equipment and personnel under realistic stress and operations tempo
    •    Threat representative forces
    •    End-to-end mission testing
    •    Realistic combat tactics for friendly and enemy
    •    Operationally realistic environment, targets, countermeasures
    •    Interfacing systems
    •    Articles off production line preferred
    •    Production representative materials and process
    •    Representative hardware and software
    •    Representative logistics, maintenance manuals
    •    Sample size
    •    Size of test unit
    •    Threat portrayal
    •    Properly trained personnel, crews, unit
    •    Supported by typical support personnel and support package
    •    Missions given to units (friendly and hostile)
    •    Production representative system for IOT&E
    •    Adequate resources
    •    Representative typical users
The acquisition and management of medical materiel must ensure quality, availability,
and economy in meeting the clinical requirements of the Military Health System (MHS).
Medical programs, by nature, consist almost exclusively of GOTS, COTS, and NDI
(non-developmental item) items and, with the participation of other government
agencies (e.g., the FDA), follow an acquisition strategy similar to other T&E programs. PMs
must not disregard T&E of COTS, NDI, and GFE. The operational effectiveness,
operational suitability, and operational capabilities of these items and any military-
unique applications must be tested and evaluated before a FRP or fielding decision.
The ITT will plan to take maximum advantage of pre-existing T&E data to reduce the
scope and cost of government testing.
PMs, Joint and Service procurement agencies, Service/Defense Agency T&E activities,
and other governmental organizations assist with development of operational testing
and performance evaluation criteria for medical materiel evaluation; for both
developmental and non-developmental programs, as stipulated in DoDI 6430.02 ,
Defense Medical Materiel Program, dated August 17, 2011. Testing of medical devices,
due to their reliance on COTS items, may not involve the rigorous DT&E imposed on
other systems. Unless a device was developed for military use, PMs normally limit
DT&E to airworthiness and environmental testing to ensure the device does not fail
under the austere or harsh conditions imposed by operational environments or interfere
with the aircraft's operating environment. Programs can integrate this testing with, or
perform it alongside, operational testing events to determine the operational effectiveness and
operational suitability of the device. Often, this usability question can distinguish among
various devices of like construction or capability.
Lead DT&E test organizations can perform medical item testing, as delineated by the
individual Service/Defense Agency, and may not require the approval or input of the
Service/Defense Agency OTA. Defer to Service/Defense Agency guidelines for these
processes.
Based on the FY 2012 NDAA , Section 835, a Chief Developmental Tester will be
designated for MDAP and MAIS programs. PMs for MDAP programs shall designate a
government test agency as the Lead DT&E organization. All of these designations shall
be made as soon as practical after the Materiel Development Decision (MDD). They
shall be maintained until the program is removed from OSD T&E oversight or as
agreed.
The Lead DT&E organization shall be separate from the program office. The Lead
DT&E organization shall be responsible for providing technical expertise on T&E issues
to the Chief Developmental Tester; conducting DT&E activities as directed by the Chief
Developmental Tester; assisting the Chief Developmental Tester in providing oversight
of contractors; and assisting the PM and Chief Developmental Tester in reaching technically
informed, objective judgments about contractor and Government T&E planning and
results.
Best practices as derived from lessons learned are available and continuously updated
at the DAU Best Practices Clearinghouse .
Programs shall use DoD Government T&E capabilities and invest in Government T&E
infrastructure unless an exception can be justified as cost-effective to the Government.
PMs shall conduct a cost-benefit analysis (CBA) for exceptions to this policy and document the
assumptions and results of the CBA in an approved TEMP before proceeding.
DEFENSE ACQUISITION GUIDEBOOK
Chapter 10 - Decisions, Assessments, and Periodic Reporting
10.0. Overview
10.0.1. Purpose
10.0.2. Contents
10.0.1. Purpose
10.0.2. Contents
The chapter starts with overviews of the major decision points and executive-level
review forums associated with a program. It also discusses Integrated Product Teams
(IPTs) . Other topics include Exit Criteria , Independent Assessments , Information
Sharing and Department of Defense (DoD) Oversight , Management Control , Program
Plans , and Periodic Reports for Major Acquisition Programs and Major Automated
Information Systems programs. The chapter also includes an overview of the Defense
Acquisition Management Information Retrieval System and a discussion of Special
Interest Programs . The chapter closes with discussions of Should-Cost and Acquisition
Program Transition Workshops .
There are two types of decision points for Major Defense Acquisition Programs and
Major Automated Information Systems: milestone decisions and other decision review
points. Each such point results in a decision to initiate, continue, advance, change
direction in, or terminate a project or program work effort or phase. The type and
number of decision points may be tailored to program needs. The Milestone Decision
Authority approves the program structure, including the type and number of decision
points, as part of the program (technology development or acquisition) strategy .
Major decision points (including milestone decisions) authorize entry into the major
acquisition process phases:
Additionally, the principles of BCL can be applied at the increment and at the release
level. (There may be multiple releases within an increment.) Multiple increments may
also be approved concurrently if they have well defined and approved requirements, are
fully funded, and have appropriate entrance and exit criteria. For Increment two (2) and
beyond, the Milestone Decision Authority must grant Authorization to Proceed (ATP)
and document it in an Acquisition Decision Memorandum (ADM). ATP serves as the
initiation of the 5-year period for time-certain delivery of capability to ensure compliance
with section 2445(c) of title 10, United States Code .
Decision reviews assess progress and authorize (or halt) further program activity. The
review process associated with each decision point typically addresses the program
affordability and cost effectiveness; program progress, risk, and trade-offs; strategy,
including maintaining competition and the business arrangement (contract type and
incentive structure), program funding, and the development of exit criteria for the next
phase or effort.
The regulatory information required to support both milestone decision points and other
decision reviews should be tailored to support the review, but must be consistent with
the requirements specified in DoD Instruction 5000.02 .
The Milestone Decision Authority for an MDAP signs a certification memorandum for
record prior to Milestone A and Milestone B as specified in sections 2366a and 2366b of
title 10, United States Code.
A major defense acquisition program may not receive Milestone A approval until the
Milestone Decision Authority certifies, after consultation with the Joint Requirements
Oversight Council on matters related to program requirements and military needs, to the
following, without modification, from 10 USC 2366a , as amended by Public Law 111-23,
"Weapon Systems Acquisition Reform Act of 2009" , and the FY 2012 NDAA :
 Figure 10.1.2.1.F1. Sample Required Statement for Milestone Decision Authority
            Certification Memorandum Prior to Milestone A Approval .
As required by section 2366a of title 10, United States Code, I have consulted with the
Joint Requirements Oversight Council (JROC) on matters related to program
requirements and military needs for the ( name of program ) and certify that:
(2) the program is being executed by an entity with a relevant function as identified by
the Secretary of Defense;
(4) an analysis of alternatives has been performed consistent with the study guidance
developed by the Director, Cost Assessment and Program Evaluation (DCAPE);
(5) a cost estimate for the program has been submitted, with the concurrence of the
DCAPE, and the level of resources required to develop, procure, and sustain the
program is consistent with the priority level assigned by the JROC; and
(6) [only include if the system duplicates a capability already provided by an existing
system] the duplication provided by this system and ( name of existing system )
program is necessary and appropriate.
A major defense acquisition program may not receive Milestone B approval until the
Milestone Decision Authority certifies, without modification, pursuant to section 2366b of
title 10, United States Code, as amended by Public Law 111-23, the "Weapon Systems
Acquisition Reform Act of 2009," and the FY 2012 NDAA, that:
    1. I have received a business case analysis and certify on the basis of the analysis
       that:
           1. the program is affordable when considering the ability of the Department
              of Defense to accomplish the program's mission using alternative
              systems;
           2. appropriate tradeoffs among cost, schedule, and performance objectives
              have been made to ensure that the program is affordable when
               considering the per unit cost and total acquisition cost in the context of the
               total resources available during the period covered by the future-years
               defense program submitted during the fiscal year in which the certification
               is made;
           3. reasonable cost and schedule estimates have been developed to execute,
               with the concurrence of the Director, Cost Assessment and Program
               Evaluation, the product development and production plan under the
               program;
           4. funding is available to execute the product development and production
               plan under the program, through the period covered by the future-years
               defense program submitted during the fiscal year in which the certification
               is made, consistent with the estimates described in subparagraph (C) for
               the program; and
    2. I have received the results of the preliminary design review and conducted a
       formal post-preliminary design review assessment, and certify on the basis of
       such assessment that the program demonstrates a high likelihood of
       accomplishing its intended mission; and
    3. I further certify that:
           1. appropriate market research has been conducted prior to technology
               development to reduce duplication of existing technology and products;
           2. the Department of Defense has completed an analysis of alternatives with
               respect to the program;
            3. the Joint Requirements Oversight Council has accomplished its duties
                with respect to the program pursuant to section 181(b) of title 10,
                United States Code, including an analysis of the operational
                requirements for the program;
           4. the technology in the program has been demonstrated in a relevant
               environment as determined by the Milestone Decision Authority on the
               basis of an independent review and assessment by the Assistant
               Secretary of Defense, Research and Engineering;
           5. life-cycle sustainment planning, including corrosion prevention and
               mitigation planning, has identified and evaluated relevant sustainment
               costs, throughout development, production, operation, sustainment, and
               disposal of the program, and any alternatives, and that such costs are
               reasonable and have been accurately estimated;
           6. an estimate has been made of the requirements for core depot-level
               maintenance and repair capabilities, as well as the associated logistics
               capabilities and the associated sustaining workloads required to support
               such requirements; and
           7. the program complies with all relevant policies, regulations, and directives
               of the Department of Defense.
 Figure 10.1.2.2 F1. Sample Required Statement for Milestone Decision Authority
             Certification Memorandum Prior to Milestone B Approval
MEMORANDUM FOR THE RECORD
    1. I have received a business case analysis and certify on the basis of the analysis
       that:
(A) the program is affordable when considering the ability of the Department of Defense
to accomplish the program's mission using alternative systems;
(B) appropriate tradeoffs among cost, schedule, and performance objectives have been
made to ensure that the program is affordable when considering the per unit cost and
total acquisition cost in the context of the total resources available during the period
covered by the future-years defense program submitted during the fiscal year in which
the certification is made;
(C) reasonable cost and schedule estimates have been developed to execute, with the
concurrence of the Director, Cost Assessment and Program Evaluation, the product
development and production plan under the program;
(D) funding is available to execute the product development and production plan under
the program, through the period covered by the future-years defense program submitted
during the fiscal year in which the certification is made, consistent with the estimates
described in subparagraph (C) for the program; and
    2. I have received the results of the preliminary design review and conducted a
       formal post-preliminary design review assessment, and certify on the basis of
       such assessment that the program demonstrates a high likelihood of
       accomplishing its intended mission; and
    3. I further certify that:
           1. appropriate market research has been conducted prior to technology
               development to reduce duplication of existing technology and products;
           2. the Department of Defense has completed an analysis of alternatives with
              respect to the program;
            3. the Joint Requirements Oversight Council has accomplished its duties
               with respect to the program pursuant to section 181(b) of title 10,
               United States Code, including an analysis of the operational
               requirements for the program;
           4. the technology in the program has been demonstrated in a relevant
              environment as determined by the Milestone Decision Authority on the
              basis of an independent review and assessment by the Assistant
              Secretary of Defense, Research and Engineering;
               5. life-cycle sustainment planning, including corrosion prevention and
                  mitigation planning, has identified and evaluated relevant sustainment
                  costs, throughout development, production, operation, sustainment, and
                  disposal of the program, and any alternatives, and that such costs are
                  reasonable and have been accurately estimated;
              6. an estimate has been made of the requirements for core depot-level
                 maintenance and repair capabilities, as well as the associated logistics
                 capabilities and the associated sustaining workloads required to support
                 such requirements; and
              7. the program complies with all relevant policies, regulations, and directives
                 of the Department of Defense.
The DAB is the Department's senior-level review forum for critical acquisition decisions
concerning Acquisition Category (ACAT) ID programs. The DAB is also the principal
review forum enabling the Under Secretary of Defense for Acquisition, Technology, and
Logistics (USD(AT&L)) to fulfill the responsibilities of Chapter 144A of title 10, United
States Code, concerning ACAT IAM Major Automated Information System programs.
The use of any other forum for USD(AT&L) review of ACAT ID or IAM programs is
discouraged.
The Under Secretary of Defense for Acquisition, Technology, and Logistics (USD
(AT&L)) is the Milestone Decision Authority (MDA) for Acquisition Category (ACAT) ID
programs (and ACAT IAM programs that have not been delegated). The USD(AT&L)
chairs the DAB.
DAB members are the following executives: the Vice Chairman of the Joint Chiefs of
Staff; the Secretaries of the Military Departments; the Under Secretary of Defense
(Policy); the Under Secretary of Defense (Comptroller); the Under Secretary of Defense
(Personnel & Readiness); the Under Secretary of Defense (Intelligence); the DoD Chief
Information Officer; the Director, Operational Test & Evaluation; the Director, Cost
Assessment and Program Evaluation; the Deputy Chief Management Officer (for
Defense Business Systems only); and the Director, Acquisition Resources & Analysis
(as the DAB Executive Secretary).
DAB Reviews are conducted for ACAT ID and IAM programs at major decision points,
including: the Materiel Development Decision, the Technology Development decision,
the pre-Engineering and Manufacturing Development (EMD) review, the EMD decision,
the Production decision, the Full-Rate Production Decision Review/Full Deployment
Decision Review, at Interim Program Reviews, and at other times as necessary.
Whenever possible, these reviews should take place in the context of the existing
Integrated Product Team and acquisition milestone decision review processes. An
Acquisition Decision Memorandum (ADM) signed by the USD(AT&L) or other delegated
decision authority documents the decision(s) and program direction resulting from the
review. Any memorandum the USD(AT&L) signs concerning ACAT ID or IAM programs
is referred to as an ADM and must be staffed by the DAB Executive Secretary (Director,
Acquisition Resources and Analysis).
The USD(AT&L) is the Defense Acquisition Executive (DAE) and generally chairs the
DAB unless he has otherwise delegated the chair for a particular program or event.
However, ACAT ID and IAM decision and program reviews should be referred to as
"DAB Reviews" or "DAB Meetings" and not "DAE Reviews."
The OIPT Leader is expected to shape the DAB briefing to ensure that it captures and
objectively represents the unresolved issues still requiring discussion, the data to
support such discussion, and all other critical information necessary to conduct a
successful DAB review; above all, information pertaining to the affordability and cost
effectiveness of the program. At the beginning of each DAB, the OIPT Leader will state
the decision sought (or other purpose for the review) and immediately tee up the
unresolved issues. The OIPT Leader will ensure that evidentiary arguments (pro and
con) and supporting data are presented by the appropriate principal DAB member or
advisor. Following the discussion of the issues and of the affordability and cost
effectiveness of the program, the remaining mandatory information charts will be
presented and reviewed.
A notional set of DAB Milestone Decision briefing charts is available for use. It is
expected that, except for the limited number labeled mandatory, these charts will be
used as a guide only and will be appropriately tailored for the specific program and
decision under consideration. A set of information checklists is also available to aid in
functional reviews of required information during the DAB preparation process.
The decisions and direction resulting from each milestone and other major decision
point review must be documented in an ADM. All ACAT ID and ACAT IAM ADMs are
written by the office of the Director, Acquisition Resources and Analysis (ARA) and the
pertinent Overarching Integrated Product Team (OIPT) Leader. ARA staffs all ADMs for
coordination. Prior to release for formal staffing, ARA submits each ADM to the Principal
Deputy Under Secretary of Defense (Acquisition, Technology, and Logistics)
(PDUSD(AT&L)) or the Under Secretary of Defense (Acquisition, Technology, and
Logistics) for initial review.
All ADM-directed actions are tracked and monitored by the OIPT leaders and reported
for closure, compilation, and summation in the recently established Defense Acquisition
Executive (DAE) Action Tracker (DAT) automated system
(https://ebiz.acq.osd.mil/DAT). ARA maintains the DAT system and will periodically
review the status of overdue ADM actions with the PDUSD(AT&L), the Component
Acquisition Executives, the Assistant Secretary of Defense (Acquisition), and the OIPT
Leaders.
Programs must be adequately reviewed far enough ahead of a DAB meeting so that all
issues associated with the desired decision can be identified and, optimally, resolved
prior to the DAB review. Any issues that cannot be resolved prior to the DAB review
should be well defined and presented with the relevant data needed to decide on a
course of action among the available alternatives. Resolving any remaining issues
should be the focus of the DAB meeting itself.
Early in the DAB preparation process, the Assistant Secretary of Defense (Acquisition)
(ASD(A)) will conduct a DAB Planning Meeting (DPM) with the Overarching Integrated
Product Team (OIPT) Leader and a service or agency representative to discuss the
pending decision and any open issues anticipated to exist at the time of the DAB.
In order to ensure DAB reviews focus on issues and the data that affect issue
resolution, the Principal Deputy Under Secretary of Defense (Acquisition, Technology,
and Logistics) (PDUSD(AT&L)) or the Under Secretary of Defense (Acquisition,
Technology, and Logistics) (USD(AT&L)) will hold a DAB Readiness Meeting (DRM) as
soon as possible after the final pre-DAB OIPT meeting, approximately one work week
before each scheduled DAB. The DRM will focus on the purpose of the DAB, discuss
and consider any outstanding issues on the specific program(s), and determine the
readiness of the program(s) to proceed to a DAB for a discussion/decision.
Based upon the results of the DRM, the PDUSD(AT&L) or the USD(AT&L) will
determine whether to proceed as scheduled, to postpone the DAB while additional
information is obtained, or to make and document the decision in an Acquisition
Decision Memorandum without convening a formal DAB meeting (a "paper DAB"). If
there are no issues associated with the requested decision, then a formal meeting
should not be necessary.
The nominal timeline (in business days) to support the DPM, DRM, and DAB is listed
below:
-5   DRM
-3   DAB read-ahead submitted
 0   DAB
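As an illustrative aside, the offsets above count business days rather than calendar days, so the DRM and read-ahead deadlines skip weekends when backed out from the DAB date. The sketch below shows that arithmetic; the helper function and the sample DAB date are hypothetical and are not part of the Guidebook.

```python
from datetime import date, timedelta

def business_days_before(anchor: date, days: int) -> date:
    """Step backward from `anchor`, counting only weekdays (Mon-Fri)."""
    current = anchor
    remaining = days
    while remaining > 0:
        current -= timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 ... Friday=4
            remaining -= 1
    return current

# Hypothetical DAB date, used only for illustration (a Monday).
dab = date(2013, 9, 16)
drm = business_days_before(dab, 5)         # -5: DAB Readiness Meeting
read_ahead = business_days_before(dab, 3)  # -3: DAB read-ahead submitted
```

Note that when the DAB falls early in the week, the weekend pushes the DRM into the prior calendar week, which is why "-5 business days" can be a full calendar week earlier.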
The OIPT Chair will conduct meetings and form working groups as needed to support
the DAB preparation process.
The DPM is a short informal meeting conducted by the Assistant Secretary of Defense
(Acquisition) (ASD(A)) approximately two months before the scheduled DAB review.
The DPM serves as a heads-up for the upcoming review and provides an opportunity to
ensure that the Overarching Integrated Product Team (OIPT) Lead and the Component
Acquisition Executive (CAE) staff are prepared to adequately cover any concerns that
the Under Secretary of Defense (Acquisition, Technology, and Logistics) may have at
the DAB review.
The purpose is to give the CAE and the OIPT Lead time to examine such potential
issues and any actions needed to deal with major concerns that have already been
raised. Content for the DPM will be at the discretion of the OIPT Chair and service (or
agency) presenting the program for DAB review.
The OIPT Chair, in coordination with the relevant service or agency, will schedule this
meeting, which will nominally be held at least two to three months before the DAB is
scheduled.
Attendance at the DPM is limited to the OIPT Lead plus one staff member, two or three
people representing the pertinent CAE(s), and the DAB Executive Secretary plus one
staff member, unless otherwise directed or approved by the ASD(A).
The DRM is a small, informal meeting conducted by the Principal Deputy Under
Secretary of Defense (Acquisition, Technology, and Logistics) (PDUSD(AT&L)) or the
Under Secretary of Defense (Acquisition, Technology, and Logistics) (USD(AT&L))
approximately two weeks before the DAB review and after the Overarching Integrated
Product Team (OIPT) meeting. The purpose of the DRM is for the PDUSD(AT&L) or the
USD(AT&L) to review the OIPT results, to understand any remaining open issues that
the DAB would have to consider, and to review the proposed DAB presentation,
including the materials and data necessary to resolve any issues that would be
presented to the DAB to support the decision.
Content for the DRM will be specific to the decision sought for the particular program
and will be issue-focused. The actual briefing material and backup material for the DAB
itself should be ready for review, with the presentation in final form. The proposed DAB
brief and the OIPT Leader's report should be included in the DRM read-ahead.
Attendance at the DRM is limited to the OIPT Lead plus one staff member, two or three
people representing the pertinent CAE(s), and the DAB Executive Secretary plus one
staff member, unless otherwise directed or approved by the Assistant Secretary of
Defense (Acquisition). (On an as-required basis, other OSD representatives may also
be requested to attend to discuss unresolved issues planned to be addressed at the
DAB review.)
The DRM is not intended to be a decision meeting; however, in some cases, it may lead
to a recommendation or decision to conduct a "paper DAB" review.
The Joint Requirements Oversight Council (JROC) reviews and approves capabilities
documents designated as JROC interest and supports the acquisition review process.
The JROC is composed of the Vice Chairman of the Joint Chiefs of Staff, who is the
Chairman of the Council; the Service Vice Chiefs and the Assistant Commandant of the
Marine Corps; and the Combatant Commanders (or their Deputies) when matters
related to the area of responsibility or functions of that command will be under
consideration by the Council.
In accordance with the CJCS Instruction 3170.01 , the Joint Staff reviews all Joint
Capabilities Integration and Development System (JCIDS) documents and assigns a
Joint Potential Designator. The JROC validates capability needs. The JROC also
validates the key performance parameters when it approves the associated capabilities
document. The JROC charters Functional Capabilities Boards (FCBs). The boards are
chaired by a JROC-designated chair and, for appropriate topics, co-chaired by a
representative of the Milestone Decision Authority.
Functional Capabilities Boards are the lead coordinating bodies to ensure that the joint
force is best served throughout the JCIDS and acquisition processes. The JCIDS
process encourages early and continuous collaboration with the warfighter and
acquisition communities to ensure that new capabilities are conceived and developed in
the joint warfighting context. The JROC, at its discretion, may review any JCIDS issues
which may have joint interest or impact. The JROC will also review programs at the
request of, and make recommendations as appropriate to, the Secretary of Defense,
Deputy Secretary of Defense, and the Under Secretary of Defense (Acquisition,
Technology, and Logistics).
The DBSMC was established by the Secretary of Defense under authority delegated
pursuant to section 186 of title 10, United States Code, and in accordance with DoDI
5105.18.
The DBSMC advises the DBSMC Chair who is responsible for approving Certification
Authority (CA) certification of funds associated with Defense Business System
modernization efforts.
The IRBs are responsible for advising the Milestone Decision Authority. Required
acquisition decision documentation is submitted to the IRB membership no later than 30
calendar days prior to the IRB. IRBs review:
The DoD Components are required to establish or employ decision bodies with similar
responsibilities for DBS that do not meet the Major Automated Information System
threshold.
The OSD-level decision review processes discussed in this section of the Guidebook
deal specifically with ACAT ID and ACAT IAM programs, selected Pre-Major Defense
Acquisition Programs/Pre-Major Automated Information System Programs, and Under
Secretary of Defense (Acquisition, Technology, and Logistics) Special Interest
Programs. DoD Component Acquisition Executives will develop tailored procedures that
meet statutory intent for programs under their cognizance.
    1. Each CSB must meet at least annually to review all requirements changes and
       any significant technical configuration changes for ACAT I and IA programs in
       development that have the potential to result in cost and schedule impacts to the
       program. Such changes will generally be rejected, deferring them to future blocks
       or increments. Changes shall not be approved unless funds are identified and
       schedule impacts mitigated.
    2. Each Program Manager, in consultation with the cognizant PEO, must, on a
       roughly annual basis, identify and propose a set of descoping options, with
       supporting rationale addressing operational implications, to the CSB that reduce
       program cost or moderate requirements. If the program is an ACAT ID or IAM
       program, the CSB chair must recommend to the Milestone Decision Authority
       which of these options should be implemented. Final decisions on descoping
       option implementation shall be coordinated with the Joint Staff and military
       department requirements approval officials.
10.3.2.2. Overarching Integrated Product Team (OIPT) Leaders
IPPD facilitates meeting cost and performance objectives from product concept through
production, including field support. The 10 tenets of IPPD can be summarized into the
following 5 principles:
    •    Customer Focus
    •    Concurrent Development of Products and Processes
    •    Early and Continuous Life-Cycle Planning
    •    Proactive Identification and Management of Risk
    •    Maximum Flexibility for Optimization and Use of Contractor Approaches
Defense acquisition works best when all of the DoD Components work together.
Cooperation and empowerment are essential. Per Department of Defense Directive
5000.01 , the Department's acquisition community shall implement the concepts of
Integrated Product and Process Development (IPPD) and IPTs as extensively as
possible.
IPTs are an integral part of the Defense acquisition oversight and review process. For
Acquisition Category (ACAT) ID and IAM programs, there are generally two levels of
IPTs: the Working-Level Integrated Product Team (WIPT) and the Overarching
Integrated Product Team (OIPT) . Each program should have an OIPT and at least one
WIPT. WIPTs should focus on a particular topic such as cost/performance, program
baseline, acquisition strategy, test and evaluation, or contracting. An Integrating
Integrated Product Team (IIPT), which is itself a WIPT, should coordinate WIPT efforts
and cover all program topics, including those not otherwise assigned to another IPT.
IPT participation is the primary way for any organization to participate in the acquisition
program. IIPTs are essential for ACAT ID and IAM programs, in that they facilitate OSD
Staff-level program insight into MDAPs and MAIS programs at the program level and
provide the requisite input to the OIPT.
Normally, all Acquisition Category (ACAT) ID and IAM programs will have an OIPT to
provide assistance, oversight, and review as the program proceeds through its
acquisition life cycle.
First and foremost, Office of the Secretary of Defense (OSD) OIPTs are teams expected
to collectively assist the Defense Acquisition Executive (DAE) in making sound
investment decisions for the Department and to ensure programs are structured and
resourced to succeed. Success is defined as affordable, executable programs that
provide the most value achievable for the resources invested by the Department.
OSD OIPTs are not decision bodies and their respective leaders do not supplant the
authority and responsibilities of the Program Manager, Program Executive Officer,
Component Acquisition Executive, or DAE. The acquisition chain of command is
expected to thoroughly prepare programs for decisions and to execute those decisions.
OSD OIPTs bring independent judgment and perspectives from various staff offices and
provide a measure of due diligence in support of DAE decisions. They often bring
different perspectives than the Components and should be concerned not only with the
programmatic, technical, and business aspects of a program but also with critically
examining and considering the program in the broader context to include joint portfolios,
design and performance trade-space, overall risk (technology, integration/engineering,
schedule, and cost), affordability, competitive opportunities, industrial base implications,
and the nature of the business decision under consideration.
OSD OIPTs also have a key role in helping programs complete the requirements of the
statutory and regulatory acquisition framework, much of which involves documentation
the team members review in support of the decision process. Typically, these
documents have been reviewed within a Service and at working levels of the OSD staff
and Service staffs to ensure they reflect sound planning and assessments before they
are submitted for final review. These documents should generally not be prepared
solely for staff review and approval, but be intended primarily for use within the program
as planning and management tools that are highly specific to the program and tailored
to meet program needs. They should be prepared and reviewed with this goal in mind.
OSD OIPT meetings should be the culmination of the staffing process and lead to well-
staffed and objectively presented decision options on any open issues for discussion at
the Defense Acquisition Board review and subsequent acquisition decisions. To work
effectively, all OIPT members should attempt to resolve issues at the lowest possible
level.
To perform their work, OSD OIPTs and their members should have access to all the
data necessary to do their jobs effectively. Program offices and Component staffs are
expected to provide data needed to resolve issues and to support DAE decisions in a
timely manner.
For those programs where the Under Secretary of Defense for Acquisition, Technology,
and Logistics (USD(AT&L)) is the Milestone Decision Authority, OIPTs are a well-
established and integral part of the defense acquisition oversight and milestone decision
review process. While OIPTs are not decision-making bodies, they provide a
mechanism to coordinate and conduct staff preparation for USD(AT&L) program
decisions and to help execute those decisions.
There are currently five OIPT leaders in the Office of the Secretary of Defense who are
responsible for broadly defined portfolios of programs and capabilities. Programs with
the Under Secretary of Defense (Acquisition, Technology, and Logistics) (USD(AT&L))
as the Milestone Decision Authority are normally assigned to one of these OIPT leaders
as the lead staff element with broad responsibility for the program.
OSD OIPT leaders form and lead OIPTs to review the programs coming forward to the
Defense Acquisition Board (DAB) for a Defense Acquisition Executive (DAE) decision.
OIPT leaders also prepare content for discussions at DAB Planning Meetings and DAB
Readiness Meetings in collaboration with the responsible Component, the DAB
Executive Secretary, and any OIPT members with outstanding issues. OIPT Leaders
are responsible for coordinating staff inputs, facilitating the resolution of issues at lower
levels when possible, and for ensuring that objective and complete data is presented to
the DAE in support of DAE decisions, including milestone decisions.
OSD OIPT leaders are expected, with the assistance of the OIPT members, to maintain
good situational awareness of program execution status and, with the Component
Acquisition Executives (CAEs), to keep the DAE informed of any program issues. The
Defense Acquisition Executive Summary (DAES) process serves as one mechanism to
monitor programs and elevate issues. DAES meetings are forums for sharing and
learning across the senior levels of the acquisition community. However, OIPT leaders
and OIPT members should not delay surfacing problems while awaiting a DAES cycle.
Bad news does not get better with age; the earlier issues are addressed, the greater the
opportunity to remediate them. Similarly, good outcomes and best practices should also
be reported and widely shared. Monitoring program execution should not generate
unnecessary meetings, but rather, the evolving tools, data, and monitoring mechanisms
that the Components and the Office of the Secretary of Defense have in place should
accomplish this function. In general, and consistent with their responsibilities, OIPT
leaders (and all staff members) should work to minimize the overhead burden placed on
Program Managers. The OIPT leaders are also expected to track and monitor to
successful completion all Acquisition Decision Memorandum-directed actions and notify
the DAE of issues or events that would affect their completion.
Members should raise issues at the earliest possible opportunity and work to resolve
those issues expeditiously. It is a disservice to the programs and process for issues to
remain hidden or for issues to arise unexpectedly at senior-level decision meetings such
as the DAB. If an OIPT member feels an issue is not resolved satisfactorily, the DAE
should be informed. OIPT members with differing views will be part of any discussion
and afforded the opportunity to express their views with supporting information directly if
desired. Any issue raised should be logically presented with appropriately detailed
technical or other relevant data to allow for an informed decision.
Table 10.3.2.3.T1 below lists the nominal organizational members of a typical OSD
OIPT. Membership can be adjusted as appropriate by OIPT leaders.
• Vice Chairman of the Joint Chiefs of Staff/J-8
• Office of the Under Secretary of Defense for Policy
• Office of the Under Secretary of Defense (Comptroller)
• Office of the Under Secretary of Defense for Personnel and Readiness
• Office of the Under Secretary of Defense for Intelligence
• Office of the Director, Operational Test and Evaluation
• Office of the Director, Cost Assessment and Program Evaluation
• Office of the Director, Acquisition Resources and Analysis
• Office of the Director, Defense Pricing
• Office of the Director, Defense Procurement and Acquisition Policy
• Office of the Director, Performance Assessment and Root Cause Analyses
• Office of the Director, International Cooperation
• Office of the Chief Information Officer
• Office of the Deputy Assistant Secretary of Defense for Developmental Test and Evaluation
• Office of the Director for Chemical and Material Risk Management
• Office of the Deputy Assistant Secretary of Defense (Manufacturing and Industrial Base Policy)
• Office of the Assistant Secretary of Defense for Logistics and Materiel Readiness
• Office of the Assistant Secretary of Defense for Operational Energy Plans and Programs
• Office of the Deputy Assistant Secretary of Defense for Research
• Office of the Deputy Assistant Secretary of Defense for Systems Engineering
• Cognizant Program Executive Officer(s)
• Cognizant Program Manager
• Office of the Army Acquisition Executive
• Office of the Navy Acquisition Executive
• Office of the Air Force Acquisition Executive
The cognizant OIPT leader will provide a written report to the Defense Acquisition
Executive not more than 10 business days after the OIPT meeting and not less than 15
business days prior to a scheduled Defense Acquisition Board (DAB) date (i.e., well
before the DAB Readiness Meeting). The OIPT Report will document an integrated
program assessment that takes OIPT members' independent assessments into
consideration. It will also provide a recommendation for the decision(s) to be made and
include a discussion of all unresolved issues. OIPT leaders will ensure all OIPT member
perspectives and concerns (including dissenting views) are accurately represented.
OIPT members, at their discretion, may provide attachments to the OIPT report
reflecting their individual perspectives and recommendations and providing the basis for
those views.
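The two timing constraints above (no more than 10 business days after the OIPT meeting, no less than 15 business days before the scheduled DAB date) lend themselves to a quick date check. The sketch below assumes a plain Monday-through-Friday business-day calendar with no federal holidays, and the function names are illustrative, not drawn from any DoD system:

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Step forward (or backward, for negative days) counting Mon-Fri only.
    Assumes a plain Monday-Friday week; a real calendar would also skip
    federal holidays."""
    step = 1 if days >= 0 else -1
    current, remaining = start, abs(days)
    while remaining:
        current += timedelta(days=step)
        if current.weekday() < 5:  # Monday=0 .. Friday=4
            remaining -= 1
    return current

def oipt_report_window(oipt_meeting: date, dab_date: date):
    """Latest submission dates implied by the two rules in the text:
    no more than 10 business days after the OIPT meeting, and no less
    than 15 business days before the scheduled DAB."""
    latest_after_oipt = add_business_days(oipt_meeting, 10)
    latest_before_dab = add_business_days(dab_date, -15)
    return latest_after_oipt, latest_before_dab

# Example: OIPT on a Monday, DAB eight weeks later; the report is due
# by whichever of the two computed dates comes first.
a, b = oipt_report_window(date(2013, 9, 2), date(2013, 10, 28))
due = min(a, b)
```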
The OIPT leader will assist the Program Manager and Program Executive Officer in
preparing program decision materials for the DAB. DAB briefings and supporting
material should contain all the data necessary to support the pending decisions
presented in a logical, straightforward manner, using the DAB templates as a starting
point.
The Program Manager (PM), or designee, should form Integrating Integrated Product
Teams (IIPTs) and Working-level Integrated Product Teams (WIPTs) as necessary, in
collaboration with the OSD staff specialists from the offices of the OIPT Leader and
other key stakeholders for the assigned program. IIPTs and WIPTs
should meet only as required to help the program manager plan program structure and
documentation and resolve issues. While there is no one-size-fits-all WIPT approach,
the following basic tenets should apply:
         inappropriate for them to hear.
    •    Support contractors may participate in WIPTs and IIPTs, but unless specifically
         authorized by the organization they represent, they may not commit the staff
         organization they support to a specific position. The organizations they support
         are responsible for ensuring the support contractors are employed in ways that
         do not create the potential for a conflict of interest. Contractors supporting staff
         organizations may participate in Overarching Integrated Product Team (OIPT)
         discussions; however, they will not be permitted to represent the position of the
         supported organization and they may be asked to sign non-disclosure statements
         prior to deliberations.
Given the sensitive nature of OIPT discussions, industry representatives and support
contractors may not be permitted to participate in certain OIPT discussions. However,
the OIPT leader may permit contractors to make presentations to the OIPT, when such
views will better inform the OIPT and will not involve the contractors directly in
Government decision making.
Each Milestone Decision Authority (MDA) should use exit criteria for ACAT I and ACAT
IA programs during an acquisition phase. Prior to each milestone decision point and at
other decision reviews, the Program Manager will develop and propose exit criteria
appropriate to the next phase or effort of the program. The Overarching Integrated
Product Team will review the proposed exit criteria and make a recommendation to the
MDA. Exit criteria approved by the MDA will be published in the Acquisition Decision
Memorandum.
Section 2334 of title 10, United States Code, requires the Director, Cost Assessment
and Program Evaluation (DCAPE) to conduct independent cost estimates (ICEs) on
Major Defense Acquisition Programs (MDAPs) and Major Automated Information
Systems (MAIS) programs for which the Under Secretary of Defense (Acquisition,
Technology, and Logistics) is the Milestone Decision Authority. The statute also
requires DCAPE to review Department of Defense (DoD) Component cost estimates
and cost analyses conducted in connection with MDAPs and MAIS programs.
Further, the statute gives DCAPE the authority to prescribe the policies and procedures
for the conduct of all cost estimates for DoD acquisition programs and issue guidance
relating to the full consideration of life-cycle management and sustainability costs.
The Director, Cost Assessment and Program Evaluation (DCAPE) conducts ICEs and
cost analyses for MDAPs for which the Under Secretary of Defense for Acquisition,
Technology, and Logistics (USD(AT&L)) is the Milestone Decision Authority in advance
of:
(1) Any decision to enter low-rate initial production or full-rate production.
(2) Any certification pursuant to sections 2366a, 2366b, or 2433a of title 10, United
States Code.
(3) Any other time considered appropriate by the DCAPE or upon the request of the
USD(AT&L).
The Director, Cost Assessment and Program Evaluation (DCAPE), conducts ICEs and
cost analyses for MAIS programs for which the Under Secretary of Defense
(Acquisition, Technology and Logistics) (USD(AT&L)) is the Milestone Decision
Authority in advance of:
(1) Any report pursuant to section 2445c(f) of title 10, United States Code.
(2) Any other time considered appropriate by the DCAPE or upon the request of the
USD(AT&L).
The Director, Cost Assessment and Program Evaluation (DCAPE) participates in the
discussion of any discrepancies related to cost estimates for Major Defense Acquisition
Programs (MDAPs) and Major Automated Information System (MAIS) programs;
comments on deficiencies in the methodology or execution of the estimates; concurs
with the choice of the cost estimate used to support the Acquisition Program Baseline or
any of the cost estimates identified in paragraphs 10.5.1.1. and 10.5.1.2.; and
participates in the consideration of any decision to request authorization of a multi-year
procurement contract for an MDAP.
The Director, Cost Assessment and Program Evaluation (DCAPE) and the Secretary of
the Military Department concerned or the head of the Defense Agency concerned (as
applicable) state the confidence level used in establishing the cost estimate for Major
Defense Acquisition Programs (MDAPs) and Major Automated Information System
(MAIS) programs, ensure that the confidence level provides a high degree of confidence
that the program can be completed without the need for significant adjustment to
program budgets, and provide the rationale for selecting the confidence level. The
confidence level statement shall be included in the Acquisition Decision Memorandum
approving the Acquisition Program Baseline, and in any documentation of cost
estimates for MDAPs or MAIS programs prepared in association with the events
identified in paragraphs 10.5.1.1. and 10.5.1.2. The confidence level statement shall
also be included in the next Selected Acquisition Report prepared in compliance with
section 2432 of title 10, United States Code, or in the next quarterly report prepared in
compliance with section 2445c of title 10, United States Code.
A Technology Readiness Assessment (TRA) focuses on the program's critical technologies (i.e., those that may pose major
technological risk during development, particularly during the Engineering and
Manufacturing Development (EMD) phase of acquisition). Technology Readiness
Levels (TRLs) can serve as a helpful knowledge-based standard and shorthand for
evaluating technology maturity, but they must be supplemented with expert professional
judgment.
The program manager should identify critical technologies, using tools such as the Work
Breakdown Structure. In order to provide useful technology maturity information to the
acquisition review process, technology readiness assessments of critical technologies
and identification of critical program information (CPI) must be completed prior to
Milestone Decision points B and C.
The TRA final report for MDAPs must be submitted to the Assistant Secretary of
Defense for Research and Engineering (ASD(R&E)) for review, to support the
requirement that ASD(R&E) provide an independent assessment to the Milestone
Decision Authority.
10.5.2.2. Technology Readiness Levels (TRLs)
7. System prototype demonstration in an operational environment: Prototype near, or
at, planned operational system. Represents a major step up from TRL 6, requiring
demonstration of an actual system prototype in an operational environment such as an
aircraft, vehicle, or space. Examples include testing the prototype in a test bed aircraft.
8. Actual system completed and qualified through test and demonstration: Technology
has been proven to work in its final form and under expected conditions. In almost all
cases, this TRL represents the end of true system development. Examples include
developmental test and evaluation of the system in its intended weapon system to
determine if it meets design specifications.
9. Actual system proven through successful mission operations: Actual application of
the technology in its final form and under mission conditions, such as those
encountered in operational test and evaluation. Examples include using the system
under operational mission conditions.
The use of TRLs enables consistent, uniform discussions of technical maturity across
different types of technologies. Decision authorities will consider the recommended
TRLs (or some equivalent assessment methodology, e.g., Willoughby templates) when
assessing program risk. TRLs are a measure of technical maturity; they do not address
the probability of occurrence (i.e., the likelihood of attaining required maturity) or the
impact of not achieving technology maturity.
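For quick reference, the TRL short titles excerpted in the table above can be kept in a simple lookup. This is a sketch covering only TRLs 7 through 9 shown here (the full DoD scale runs from 1 to 9), and the names are illustrative:

```python
# Short titles for the TRL entries excerpted in the table above.
# Sketch only: the full DoD scale runs from TRL 1 through TRL 9.
TRL_NAMES = {
    7: "System prototype demonstration in an operational environment",
    8: "Actual system completed and qualified through test and demonstration",
    9: "Actual system proven through successful mission operations",
}

def trl_name(level: int) -> str:
    """Look up the short title for a TRL covered by this excerpt."""
    try:
        return TRL_NAMES[level]
    except KeyError:
        raise ValueError(f"TRL {level} not in this excerpt (7-9 only)")
```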
10.5.8. Enterprise Risk Assessment Methodology (ERAM)
P.L. 111-23, the Weapon Systems Acquisition Reform Act of 2009, established the
conduct of a Preliminary Design Review (PDR) before Milestone (MS) B as a
mandatory requirement for all MDAPs. The Program Manager (PM) shall plan the PDR;
PDR planning shall be reflected in the Technology Development Strategy (TDS), with
details provided in the Systems Engineering Plan (SEP), and the PDR shall be
conducted consistent with the policies specified in DoD Instruction 5000.02. The plan
for PDR will be reflected in the TDS to be approved
by the MDA at MS A. Post-PDR assessments will be conducted in association with MS
B preparations and will be formally considered by the Milestone Decision Authority
(MDA) at the MS B 2366b certification review.
PDRs before MS B for other than MDAPs will be approved by the MDA when consistent
with TDS or Acquisition Strategy objectives. When the PDR is conducted before MS B,
a post-PDR assessment will be conducted in association with the MS B review and
formally considered by the MDA at the MS B review. If the PDR is conducted after MS
B, the MDA will conduct a post-PDR assessment at a time reflected in the approved
acquisition strategy.
If a PDR has not been conducted prior to Milestone B (non-MDAPs), the PM shall plan
for a PDR as soon as feasible after program initiation. PDR planning shall be reflected
in the Acquisition Strategy and conducted consistent with the policies specified in
paragraph 5.d.(6) of DoD Instruction 5000.02.
       to full detail design; and
    5. A recommendation from the PDR as to the approval of the program's system
       allocated baseline to support detail design.
The PDR Report shall be provided to the MDA prior to Milestone B and include
recommended technical requirements trades based upon an assessment of cost,
schedule, and performance risk.
When the system-level PDR is conducted after Milestone B (for non-MDAPs only), the
Program Manager (PM) shall plan and the Milestone Decision Authority (MDA) shall
conduct a formal Post-PDR Assessment Decision Review. The MDA shall conduct a
formal program assessment and consider the results of the PDR and the PM's
assessment in the PDR Report, and determine whether remedial action is necessary to
achieve Acquisition Program Baseline objectives. The results of the MDA's Post-PDR
Assessment shall be documented in an Acquisition Decision Memorandum. The Post-
PDR assessment shall reflect any requirements trades based upon the PM's
assessment of cost, schedule, and performance risk.
The Milestone Decision Authority (MDA) may assess the program's design maturity and
technical risks following the system-level Critical Design Review (CDR).
                     performance.
    2. All PMs shall continue to document CDRs in accordance with Component best
       practices.
The CDR risk assessment checklist is designed as a technical review preparation tool,
and should be used as the primary guide for assessing risk during the review. This
checklist is available on the Systems Engineering Community of Practice.
The MDA shall review the Post-CDR Report (or Assessment for an MDAP) and the
PM's resolution/ mitigation plans and determine whether additional action is necessary
to satisfy EMD Phase exit criteria and to achieve the program outcomes specified in the
APB. The results of the MDA's Post-CDR Assessment Decision Review shall be
documented in an ADM staffed by the DAB Executive Secretary.
For space programs, an Independent Program Assessment (IPA) must be provided to
support each milestone, at the Post-System Design Review Assessment, and at any
other time as directed by the MDA.
IPAs may also be used to assess other types of programs.
The Director, PARCA (D, PARCA) was established by the Weapon Systems Acquisition
Reform Act of 2009 (section 103 of P.L. 111-23) to conduct and oversee performance
assessments and root cause analyses for Major Defense Acquisition Programs
(MDAPs). (Note: D, PARCA has no program execution responsibility.)
Per section 103 of P.L. 111-23, the Director, Performance Assessments and Root Cause
Analyses (D, PARCA) is required to conduct assessments and analyses periodically or
when requested by senior Department officials. At a minimum, the D, PARCA must also
advise acquisition officials on performance issues regarding an MDAP that may arise.
Also, per section 205 of P.L. 111-23, in the case of a program that receives a Nunn-
McCurdy certification, the D, PARCA must also assess the program not less often than
semi-annually, in the year following a new milestone approval.
Per section 103 of P.L. 111-23, the Director, Performance Assessments and Root Cause
Analyses (D, PARCA) is required to conduct Root Cause Analyses (RCAs) for MDAPs
to determine the underlying cause or causes for shortcomings in cost, schedule, and
performance including the role of unrealistic performance expectations, unrealistic
baseline estimates for cost and schedule, immature technologies, unanticipated
requirements changes, quantity changes, poor program management, funding
instability, or any other matters. The RCAs are used to inform senior Departmental
leadership of issues and are included as one-pagers in the Nunn-McCurdy certification
packages sent to Congress.
The Business Capability Lifecycle (BCL) model for Defense Business Systems (DBS)
uses an independent risk assessment, known as the Enterprise Risk Assessment
Methodology (ERAM), as a mandatory input to MS A and B decisions for Major
Automated Information Systems (MAIS) DBS.
10.6.3. Classification and Management of Sensitive Information
The Program Manager, the DoD Component, or the Office of the Secretary of Defense
(OSD) staff prepares most program information. Some information requires approval by
an acquisition executive or other senior decision authority. Other information is for
consideration only. In most cases, information content and availability are more
important than format.
Unless otherwise specified, all plans, waivers, certifications and reports of findings
referred to in this Guidebook are exempt from licensing under one or more exemption
provisions of DoD 8910.1-M.
Program Managers (PMs) will comply with recordkeeping responsibilities under the
Federal Records Act for the information collected and retained in the form of electronic
records (see DoD Directive 5015.2). Electronic record-keeping systems should
preserve the information submitted, as required by section 3101 of title 44, United
States Code and implementing regulations. Electronic record-keeping systems should
also provide, wherever appropriate, for the electronic acknowledgment of electronic
filings that are successfully submitted. PMs must consider the record-keeping
functionality of any systems that store electronic documents and electronic signatures to
ensure users have appropriate access to the information and can meet the Agency's
record-keeping needs.
Program Managers (PMs) must review their programs to identify and document critical
program information (CPI) requiring protection (DoD Instruction 5200.39). PMs must
also review their programs to identify controlled unclassified information (CUI). CUI
includes "FOUO" information as defined in DoD Directive 5230.24 and information with
other approved markings requiring dissemination controls that are exempt from
mandatory disclosure under the Freedom of Information Act (e.g., DoD 5400.7-R, DoD
Directive 5230.25, and the Export Control Act).
When necessary, PMs develop Security Classification Guides in accordance with DoD
5200.1-R.
10.9.1.1. Trade-Offs
Program plans describe the detailed activities of the acquisition program. Except as
specified by DoD Instruction 5000.02 , the Program Manager (in coordination with the
Milestone Decision Authority and Program Executive Officer) should determine the type
and number of program plans needed to manage program execution.
A separate APB is required for each increment of an MDAP or MAIS program, and each
sub-program of an MDAP. Increments can be used to plan concurrent or sequential
efforts to deliver capability more quickly and in line with the technological maturity of
each increment. (When an MDAP requires the delivery of two or more categories of end
items that differ significantly in form and function, subprograms may be established.)
Program goals consist of an objective value and a threshold value for each Key
Performance Parameter and Key System Attribute parameter. Cost, schedule, and
performance are intrinsically linked and the objective and threshold values of all
program goals should be developed with these relationships in mind. The PM is
responsible for managing the trade space between program objectives and thresholds
within the bounds of cost, schedule, and performance.
Objective values represent the desired operational goal associated with a performance
attribute beyond which any gain in utility does not warrant additional expenditure.
Generally, the objective value is an operationally significant increment above the
threshold. An objective value may be the same as the threshold when an operationally
significant increment above the threshold is not useful.
Thresholds represent the minimum acceptable operational values below which the utility
of the system becomes questionable. For performance, a threshold represents either a
minimum or maximum acceptable value, while for schedule and cost, thresholds would
normally represent maximum allowable values. The failure to attain program thresholds
may degrade system performance, delay the program (possibly impacting related
programs or systems), or make the program too costly. The failure to attain program
thresholds, therefore, places the overall affordability of the program and/or the capability
provided by the system into question.
As noted above, each APB parameter must have both an objective and a threshold. For
each performance parameter, if no objective is specified, the threshold value will serve
as the objective value, and if no threshold is specified, the objective value will serve as
the threshold value. For schedule and cost parameters, there are specified default
threshold values. The default threshold for schedule is the objective value plus 6
months; the default threshold for cost is the objective value plus 10 percent of the
objective value. Despite these guidelines, the PM may propose (with justification) an
appropriate threshold value to optimize program trade space, subject to MDA and user
approval.
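The default-threshold rules above reduce to simple arithmetic. The sketch below is illustrative only: the function names are invented, and the month arithmetic assumes the objective's day of the month exists in the target month:

```python
from datetime import date

def default_schedule_threshold(objective: date) -> date:
    """Default schedule threshold: the objective date plus 6 months.
    Month arithmetic is done by hand to stay stdlib-only; assumes the
    objective's day of month exists in the target month."""
    month = objective.month + 6
    year = objective.year + (month - 1) // 12
    month = (month - 1) % 12 + 1
    return objective.replace(year=year, month=month)

def default_cost_threshold(objective_cost: float) -> float:
    """Default cost threshold: the objective value plus 10 percent."""
    return objective_cost * 1.10

def performance_pair(objective=None, threshold=None):
    """For a performance parameter, if only one value is specified,
    it serves as both the objective and the threshold."""
    if objective is None:
        objective = threshold
    if threshold is None:
        threshold = objective
    return objective, threshold
```

A PM may still propose a different threshold (with justification), so these defaults are a fallback, not a ceiling.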
The PM derives the APB from the users' performance requirements, schedule planning
and requirements, and best estimates of total program cost consistent with projected
funding. The sponsor of a capability needs document (i.e., Capability Development
Document or Capability Production Document) provides an objective and a threshold
for each attribute that describes an aspect of a system or capability to be developed or
acquired. The PM will use this information to develop an optimal product within the
available trade space. APB parameter values should represent the program as it is
expected to be developed, produced and/or deployed, sustained and funded.
Per section 2435 of title 10, United States Code, the Department of Defense may not
obligate funds for Major Defense Acquisition Programs after entry into Engineering and
Manufacturing Development without an MDA-approved APB unless the Under
Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L))
specifically approves the obligation. DoD Instruction 5000.02 extends this policy to
Major Automated Information System (MAIS) programs.
The Milestone Decision Authority (MDA) is the approval authority for the APB. The APB
requires the concurrence of the Program Executive Officer for all Acquisition Category
(ACAT) programs, and the concurrence of the DoD Component Acquisition Executive
for ACAT ID and IAM programs.
The Program Manager (PM), in coordination with the user/sponsor, prepares the APB
for program initiation. The PM can propose a revision of the APB for approval at each
major milestone review and as the program enters full rate production/deployment.
The PM may also propose, for consideration by the Milestone Decision Authority
(MDA), a revision of the APB that reflects the result of a major program restructure that
occurs between milestone events and is fully funded. The MDA will decide whether or
not to approve such a proposal.
All ACAT ID and IAM program APBs and Joint Requirements Oversight Council Interest
program APBs must be submitted to the office of the Under Secretary of Defense for
Acquisition, Technology and Logistics (USD(AT&L)), specifically the office of the
Director, Acquisition Resources and Analysis (ARA), for action. ARA will coordinate
ACAT ID and IAM APBs with the appropriate Department stakeholders, minimally
including Defense Acquisition Board principals and advisors, prior to forwarding for
MDA approval.
10.9.1.1. Trade-Offs
The best time to reduce total ownership cost and program schedule is early in the
acquisition process. Continuous cost/schedule/performance trade-off analyses can help
attain cost and schedule reductions.
Cost, schedule, and performance may be traded within the "trade space" between the
objective and the threshold without obtaining Milestone Decision Authority (MDA)
approval. Making trade-offs outside the trade space (i.e., decisions that result in
acquisition program parameter changes) requires approval of both the MDA and the
capability needs approval authority. Validated Key Performance Parameters may not be
traded off without approval by the validation authority. The PM and the user should work
together on all trade-off decisions.
Configuration Steering Boards (CSBs) are a core part of managing the cost, schedule,
and performance trade space for acquisition programs.
The Program Manager (PM) should immediately notify the Milestone Decision Authority
(MDA) via a Program Deviation Report when the PM's current estimate exceeds one or
more APB threshold values for cost, schedule, and/or performance.
Only the MDA can approve a revision to the APB. Before undertaking revisions to an
APB for a Major Defense Acquisition Program (MDAP), consultation with the office of the
Under Secretary of Defense for Acquisition, Technology and Logistics (USD(AT&L)),
specifically the office of Acquisition Resources and Analysis (ARA), and with the
Overarching Integrated Product Team leader is recommended.
For MDAPs, both "original" and current APBs are maintained. The original APB cost
estimate may be revised only if a breach occurs that exceeds the critical unit cost
threshold for the program. The "critical" unit cost threshold, as it relates to the
original APB, is defined as an increase of at least 50 percent over the original
Program Acquisition Unit Cost (PAUC) or the original Average Procurement Unit Cost
(APUC) for the program. The "critical" unit cost threshold, as it relates to the current
APB, is defined as an increase of at least 25 percent over the current PAUC or
current APUC for the program.
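These two percentage tests can be sketched in code; the unit-cost figures below are purely illustrative and the function names are not DoD terminology:

```python
def pct_growth(current: float, baseline: float) -> float:
    """Percent growth of the current unit-cost estimate over a baseline value."""
    return (current - baseline) / baseline * 100.0

def is_critical_breach(current_estimate: float,
                       original_apb_unit_cost: float,
                       current_apb_unit_cost: float) -> bool:
    """Critical unit-cost breach per the thresholds in the text:
    growth of at least 50 percent over the original APB PAUC/APUC, or
    growth of at least 25 percent over the current APB PAUC/APUC."""
    return (pct_growth(current_estimate, original_apb_unit_cost) >= 50.0 or
            pct_growth(current_estimate, current_apb_unit_cost) >= 25.0)

# Illustrative PAUC figures in base-year $M (not from any real program)
print(is_critical_breach(130.0, 100.0, 110.0))  # 30% / ~18% growth -> False
print(is_critical_breach(150.0, 100.0, 110.0))  # 50% over original -> True
```

Note that either condition alone is sufficient; growth is measured against each baseline independently.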
For MAIS programs, only a current APB is maintained, but the Original Estimate
reported in the MAIS Annual Report (MAR) serves a purpose similar to that of an
original APB. (Unlike the APB, the MAR Original Estimate can be revised only after a
Critical Change Report has been submitted to Congress. The MAIS Critical Change
thresholds are: growth in a cost parameter (Total Acquisition Cost or Total Life-cycle
Cost) of 25 percent or greater, a slip in a schedule parameter of 12 months or greater,
or failure to meet a key performance threshold.)
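A minimal sketch of that three-part Critical Change test (the parameter names are illustrative, not DoD terminology):

```python
def mais_critical_change(cost_growth_pct: float,
                         schedule_slip_months: int,
                         kpp_threshold_missed: bool) -> bool:
    """MAIS Critical Change per the thresholds in the text: growth in a cost
    parameter (Total Acquisition Cost or Total Life-cycle Cost) of 25 percent
    or more, a schedule slip of 12 months or more, or failure to meet a key
    performance threshold."""
    return (cost_growth_pct >= 25.0
            or schedule_slip_months >= 12
            or kpp_threshold_missed)

# Illustrative: 18% cost growth with a 13-month slip still triggers a Critical Change
print(mais_critical_change(18.0, 13, False))  # True
```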
For both MDAP and MAIS programs, the current APB shall be revised at major
milestone decisions, and at the full-rate production decision (full deployment decisions
for MAIS). On other occasions, a revision to the current APB may be considered
only at the discretion of the MDA, and only if the revision results from a major
program restructure that is fully funded and approved by the MDA, or from a program
deviation (breach) that is primarily the result of external causes beyond the control of
the PM. A revision to the current APB shall not be authorized if it
is proposed merely to avoid a reportable breach. The determination of whether to revise
the APB will be made by the MDA.
For MDAPs, a "critical" unit cost breach triggers the section 2433a of title 10, United
States Code (a.k.a "Nunn-McCurdy") certification process. In that case, both the current
and original APBs shall be revised to reflect the same new APB values, assuming the
program is certified. For MAIS programs, a Critical Change triggers the similar process
implementing section 2445c of title 10, United States Code.
The APB is a key management document which establishes the approved program's
objective and threshold boundaries, and links cost, schedule and performance
parameters. The Program Manager (PM) manages the program within that trade space.
Cost figures should reflect realistic cost estimates of the total program and/or increment.
Budgeted amounts should equal the total cost objectives in the APB. As the program
progresses, the PM can refine procurement costs based on contractor actual (return)
costs from Technology Development, Engineering and Manufacturing Development,
and Low-Rate Initial Production.
The cost parameters of Acquisition Category (ACAT) IA programs are the same as
those for ACAT I programs as noted in the next paragraph with the addition of Defense
Working Capital Funds and Other Funding.
The APB should contain cost parameters (objectives and thresholds) for major elements
of program life-cycle costs (or total ownership costs), as defined in Chapter 3 .
These elements include:
The objective parameters for cost are presented in both base-year and then-year
dollars. The threshold parameters for cost are presented only in base-year dollars.
Schedule parameters should include, as a minimum, the projected dates for major
decision points (such as Milestone A, Milestone B, Milestone C, Full Rate Production,
and the system-level Preliminary Design Review and Critical Design Review), major
testing events, and Initial Operational Capability. To be consistent with Chapter 144A of
title 10, United States Code , the schedule parameters for Major Automated Information
System programs should include: the dates of the Milestone A decision (or MDA
approval of the preferred alternative if there was no Milestone A), the objective and
threshold dates for Milestone B, Milestone C, Full Deployment Decision, and Full
Deployment. If Milestones A, B and/or C are tailored out, the APB shall state the
rationale for the tailoring. Full Deployment dates should be identified as TBD until the
Full Deployment Decision ADM is signed.
The Full Deployment Decision ADM shall establish the Full Deployment objective and
threshold dates, define an identifiable Full Deployment, and designate the acquisition
official who will declare Full Deployment in writing. When Full Deployment is declared,
the PM shall notify the MDA.
The PM may propose, and the MDA may approve, other specific, critical system events.
10.9.3.3. Acquisition Program Baseline (APB) Performance
APB performance parameters should include the key performance parameters identified
in the capability needs document(s) (i.e., Capability Development Document (CDD) and
Capability Production Document (CPD)), and the values and meanings of objectives
and thresholds should be consistent between the APB and the capability document.
(See also CJCS Instruction 3170.01H.) The number and specificity of performance
parameters may change over the lifecycle of the acquisition, primarily at major
milestones. At Milestone B (Engineering & Manufacturing Development decision), the
APB should reflect the defined, operational-level measures of effectiveness or
measures of performance to describe needed capabilities, minimally reflecting the CDD.
As a program matures, system-level requirements may become better defined.
Approaching the MS C decision, the APB should reflect the CPD. The MDA may also
add performance parameters to the APB other than the Joint Requirements Oversight
Council (JROC)-validated Key Performance Parameters .
OSD staff will review and comment on APBs for ACAT ID and IAM, Special Interest
programs, and other programs designated by the Defense Acquisition Executive. The
Joint Staff (J-8) will review the cost, schedule, and key performance parameter objective
and threshold values in the APB for JROC Interest programs, and any other programs
of significant joint interest (as determined by the J-8). The J-8 review will ensure that the
objective and threshold values are consistent with the JROC-approved CDD, CPD, and
prior JROC decision(s). The review will also ensure that the baseline provides the
necessary warfighting capabilities affordably and within required time frames. (See also
the CJCS Instruction 3170.01H and the January 19, 2012 JCIDS Manual.)
Programs using an evolutionary acquisition strategy should design the APB consistent
with the sponsor's capability document(s) and the applicable example approaches
outlined in Table 10.9.4.T1.
    Table 10.9.4.T1. APB Parameters under an Evolutionary Acquisition Strategy
DoD Instruction 5000.02 requires the Milestone Decision Authority (MDA) to formally
initiate each increment of an evolutionary acquisition program. Program initiation for
follow-on increments may occur at Milestone B or C. Therefore, the program manager
should develop documented APB goals for each program increment or subprogram. An
Increment is a militarily useful and supportable operational capability that can be
developed, produced, deployed, and sustained. Each Increment must have an
Acquisition Program Baseline (APB) with its own set of threshold and objective values
set by the user. (DoD Instruction 5000.02, Enclosure 2, paragraph 2.c.) In the context of an IS acquisition, this
means that both threshold and objective values for cost, schedule, and performance
parameters must be established for each Increment.
When an MDAP requires the delivery of two or more categories of end items that differ
significantly in form and function, subprograms may be established for baseline
development and reporting purposes. Section 2430A of title 10, United States Code
stipulates that when one subprogram is designated within an MDAP, all remaining
elements (increments or components) of the program shall also be appropriately
organized into one or more other subprograms.
The decision whether to establish subprograms for an MDAP requires careful analysis
and must be made on a case-by-case basis. Structuring an MDAP with subprograms
should reflect the way the program is being managed, and represent the most efficient
and informative way to convey information about a program to senior defense
acquisition officials as well as to the Congress.
The law requires that the congressional defense committees be notified in writing of any
proposed subprogram designation not less than 30 days before the date such
designation takes effect. The approval of an APB reflecting such designation will be
considered the date that subprogram designation takes effect; therefore, notification to
Congress must occur not less than 30 days before a subprogram APB is approved.
Accordingly, DoD Components must notify the Director, Acquisition Resources and
Analysis of all proposed APBs that reflect new or revised subprogram designation at
least 60 days before the proposed APB is submitted to the Milestone Decision Authority
for approval.
10.10.1.5.2. Unit Cost Report (UCR) Breach Reporting
Periodic reports include only those reports required by statute or the Milestone Decision
Authority (MDA). Except for the reports outlined in this section, the MDA tailors the
scope and formality of reporting requirements.
P.L. 111-23, the Weapon Systems Acquisition Reform Act of 2009, May 22, 2009,
amended section 2430 of title 10, United States Code, revising the definition of a Major
Defense Acquisition Program (MDAP) as follows: an MDAP is a DoD acquisition program
that is not a highly sensitive classified program and:
(1) That is designated by the Secretary of Defense as a major defense acquisition
program; or
(2) That is estimated to require an eventual total expenditure for research, development,
test, and evaluation of more than $365 million (based on fiscal year 2000 constant
dollars) or an eventual total expenditure for procurement, including all planned
increments or spirals, of more than $3.19 billion (based on fiscal year 2000 constant
dollars).
In estimating those expenditure levels, the following cost estimates are considered:
(1) The estimated level of resources required to fulfill the relevant joint military
requirement as determined by the JROC, pursuant to section 181 of title 10 United
States Code;
(2) The cost estimate referenced in section 2366a(a)(4) of title 10 United States Code;
(3) The cost estimate referenced in section 2366b(a)(1)(C) of title 10 United States
Code ; and
(4) The cost estimate within a baseline description as required by section 2435 of title
10 United States Code .
The National Defense Authorization Act (NDAA) for FY 2009 amended section 2430 of
title 10 United States Code to give the Department authority to designate subprograms
within MDAPs.
In the DoD acquisition environment, there are two primary instances when establishing
subprograms within an MDAP may be advisable:
most efficient and informative way to convey information about a program to senior
defense acquisition officials as well as to Congress. For Acquisition Category (ACAT) ID
MDAPs, the Defense Acquisition Executive will approve the designation of subprograms
based on recommendations from the Overarching Integrated Product Team (OIPT). For
ACAT IC MDAPs, the authority to designate subprograms is delegated to the respective
DoD Component Milestone Decision Authority (MDA). In either case, the
recommendations from the OIPT or the MDA's staff should also include appropriate
guidance on how the relevant statutory and regulatory requirements of DoD Instruction
5000.02 should apply at the subprogram or program level (for example, how to structure
the acquisition strategy or the independent cost estimate for a program with designated
subprograms).
The law requires that the Secretary of Defense (as delegated to the Under Secretary of
Defense (Acquisition, Technology, and Logistics)) notify the congressional defense
committees in writing of any proposed subprogram designation not less than 30 days
before the date such designation takes effect. The approval of an Acquisition Program
Baseline (APB) reflecting such designation will be considered the date that the
subprogram designation takes effect; therefore, notification to Congress must occur not
less than 30 days before a subprogram APB is approved.
Accordingly, DoD Components must notify the Director, Acquisition Resources and
Analysis of all proposed APBs that reflect new or revised subprogram designations at
least 60 days before the proposed APB is submitted to the Milestone Decision Authority
for approval. Once a subprogram structure is established for a Major Defense
Acquisition Program, the Defense Acquisition Executive Summary , Selected
Acquisition Report, and Unit Cost Reports (quarterly and breach) will reflect that
subprogram structure.
In the event a subprogram experiences critical unit cost growth, the certification required
for the program to continue shall be made at the program level, not the subprogram
level.
The prohibition on obligations until the submission of the Selected Acquisition Report
(SAR) for significant breaches, and the certification for critical breaches, will affect all
major contracts of the program, not just those relating to the subprogram that breached.
10.10.1.3. Acquisition Program Baseline (APB) Reporting
The Program Manager (PM) must maintain a current estimate of the program being
executed (see definition of "current estimate" in section 10.10.1.3.2 ). The PM must
immediately notify the Milestone Decision Authority when a baseline deviation occurs
based upon the current estimate. A baseline deviation occurs when the current estimate
is greater than the threshold. (See section 2433 of title 10, United States Code.)
The current estimate is the latest estimate of program acquisition cost and quantity,
schedule milestone dates, performance characteristic values, and critical technical
parameters of the approved program (i.e., the approved program as reflected in the
currently approved Acquisition Program Baseline (APB), Acquisition Decision
Memorandum, or in any other document containing a more current decision of the
Milestone Decision Authority (MDA) or other approval authority). For cost, the current
estimate is normally the President's Budget plus or minus known changes; for schedule,
it is normally the program manager's best estimate of current schedule milestone dates;
for performance, it is normally the program manager's best estimate of current
performance characteristics values.
Program Managers (PMs) will report the current estimate of each APB parameter
periodically to the MDA. PMs will report current estimates for ACAT I and IA programs
quarterly in the Defense Acquisition Executive Summary. For all other programs, the
cognizant MDA will direct the reporting frequency.
When the Program Manager (PM) has reason to believe that the current estimate for
the program indicates that a performance, schedule, or cost threshold value will not be
achieved, he or she will immediately notify the Milestone Decision Authority (MDA) of
the deviation. Within 30 days of the occurrence of the program deviation, the PM will
submit a Program Deviation Report to the MDA providing the reasons for the program
deviation and a recommendation for the actions that need to be taken to bring the
program back within the baseline parameters (if this information was not included with
the original notification). Within 90 days of the occurrence of the program deviation,
one of the following should have occurred: the program is back within Acquisition
Program Baseline (APB) parameters; or an OIPT-level or equivalent Component-level
review has been conducted to review the program and make recommendations to the
MDA regarding the parameters that were breached. The MDA will decide, based on
criteria in sections 2433 and 2435 of title 10 United States Code, whether it is
appropriate to approve a revision to the APB. (Generally, APB changes will only be
approved in conjunction with a major milestone decision or as a result of a critical cost
(a.k.a. Nunn-McCurdy) breach. In limited circumstances, the MDA may choose to
approve a change to the current APB as a result of a major program restructure that is
fully funded, or as a result of a program deviation, if the breach is primarily the result of
external causes beyond the Program Manager's control. A revision to the current APB
will not be authorized if it is proposed merely to avoid a reportable breach.)
If one of the above actions has not occurred within 90 days of the program deviation,
the MDA should hold a formal program review to determine program status and the way
ahead.
In accordance with section 2432 of title 10, United States Code, the Secretary of
Defense (as delegated to the Under Secretary of Defense (Acquisition, Technology, and
Logistics)) shall submit a SAR to Congress for all Major Defense Acquisition Programs
(MDAPs). The Program Manager will use the Defense Acquisition Management
Information Retrieval system SAR module application to prepare the SAR.
A SAR provides Congress with the status of total program cost, schedule, and
performance, as well as program unit cost and unit cost breach information for a specific
program. Each SAR will also include a full life-cycle cost analysis for the reporting
program, each of its evolutionary increments, as available, and for its antecedent
program, if applicable. Required content for a SAR is defined in section 2432 of title 10
United States Code and is reflected in the SAR module of the Defense Acquisition
Management Information System by which the SAR information is entered and
submitted electronically.
The SAR for the quarter ending December 31 is the annual SAR. The Program
Manager (PM) will submit the annual SAR within 45 days after the President transmits
the following fiscal year's budget to Congress. Annual SARs will reflect the President's
Budget and supporting documentation. The annual SAR is mandatory for all ACAT I
programs.
The PM will submit quarterly exception SARs for the quarters ending March 31, June
30, and September 30 not later than 45 days after the quarter ends. Quarterly SARs are
reported on an exception basis, as follows:
    •    The current estimate exceeds the Program Acquisition Unit Cost (PAUC)
         objective or the Average Procurement Unit Cost (APUC) objective of the
         currently approved Acquisition Program Baseline (APB) in base-year dollars by
         15 percent or more;
    •    The current estimate exceeds the PAUC or APUC objective of the original APB in
         base-year dollars by 30 percent or more;
    •    The current estimate includes a 6-month, or greater, delay for any schedule
         parameter that occurred since the current estimate reported in the previous SAR;
    •    Milestone B or Milestone C approval occurs within the reportable quarter.
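The four exception criteria above amount to a simple disjunction. The following sketch uses illustrative parameter names and unit-cost objectives (compared, per the text, in base-year dollars):

```python
def quarterly_sar_required(current_est: float,
                           current_apb_objective: float,
                           original_apb_objective: float,
                           schedule_slip_months: int,
                           milestone_b_or_c_this_quarter: bool) -> bool:
    """True if any quarterly exception-SAR criterion from the text is met.
    Unit costs (PAUC or APUC) are compared in base-year dollars."""
    over_current = (current_est - current_apb_objective) / current_apb_objective >= 0.15
    over_original = (current_est - original_apb_objective) / original_apb_objective >= 0.30
    return (over_current or over_original
            or schedule_slip_months >= 6
            or milestone_b_or_c_this_quarter)

# Illustrative: 16% growth over the current APB objective triggers a report
print(quarterly_sar_required(116.0, 100.0, 95.0, 0, False))  # True
```

Any single criterion suffices; the schedule test applies to a delay that occurred since the current estimate reported in the previous SAR.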
Quarterly exception SARs will report the current estimate of the program for cost,
schedule, and performance (see definition of current estimate in section 10.10.1.3.2.
above). Pre-Milestone B programs may submit Research, Development, Test, and
Evaluation (RDT&E)-only reports, excluding procurement, military construction, and
acquisition-related operations and maintenance costs. Department of Defense
Components must notify the Under Secretary of Defense (Acquisition, Technology, and
Logistics) (USD(AT&L)) of the names of the programs for which they intend to submit
RDT&E-only SARs 30 days before the reporting quarter ends. The USD(AT&L) must
also notify Congress 15 days before the reports are due.
Whenever the USD(AT&L) proposes changes to the content of a SAR, he or she must
submit notice of the proposed changes to the Armed Services Committees of the
Senate and House of Representatives. The USD(AT&L) may consider the changes
approved, and incorporate them into the SAR, 60 days after the committees receive the
change notice.
Per section 2433(c)(2) of title 10, United States Code, for any Major Defense Acquisition
Program (MDAP) certified subsequent to a critical cost breach, the first SAR for the
program submitted after the President submits a budget in the calendar year following
the year in which the program was restructured must include a description of all funding
changes made as a result of the growth in cost of the program, including reductions
made in funding for other programs to accommodate such cost growth.
Per section 2366b of title 10, United States Code , the SAR for any MDAP receiving a
waiver for one or more Milestone (MS) B certification criteria must prominently and
clearly indicate that such program has not fully satisfied the certification requirements
for MS B, until such time that the Milestone Decision Authority makes a determination
that the program has satisfied all such certification requirements.
In accordance with section 2432 of title 10, United States Code , the Secretary of
Defense may waive the requirement for submission of a SAR for a program for a fiscal
year if:
section 1105 of title 31, United States Code in that fiscal year.
The Under Secretary of Defense (Acquisition, Technology, and Logistics) will consider
terminating reporting of SAR data when 90 percent of expected production deliveries or
planned acquisition expenditures have been made, or when the program is no longer
considered an ACAT I program in accordance with section 2432 of title 10, United States
Code .
In accordance with section 2433 of title 10, United States Code , the Program Manager
will prepare UCRs for all ACAT I programs submitting Selected Acquisition Reports,
except pre-Milestone B programs that are reporting Research, Development, Test &
Evaluation costs only.
The Program Manager (PM) will report the unit costs of the program to the Component
Acquisition Executive on a quarterly basis through the electronic Defense Acquisition
Executive Summary (DAES) submission process. The PM will submit the update in
accordance with DAES submission procedures. Reporting will begin with submission of
the initial Selected Acquisition Report (SAR), and terminate with submission of the final
SAR. The content of the unit cost report is specified in section 2433 of title 10, United
States Code, and includes:
    1. The program acquisition unit cost for the program (or for each designated major
       subprogram under the program);
    2. In the case of a procurement program, the current estimate of the Program
       Acquisition Unit Cost and the Average Procurement Unit Cost (in base-year
       dollars) for the program (or for each designated major subprogram under the
       program);
    3. Any earned value management cost and schedule variances for each of the
       major contracts since the contract was entered into;
    4. Any changes from program schedule milestones or program performances
       reflected in the baseline description established under section 2435 of title 10,
       United States Code that are known, expected, or anticipated by the program
       manager; and
    5. Any significant changes in the total program cost for development and
       procurement of the software component of the program or subprogram, schedule
       milestones for the software component of the program or subprogram, or
       expected performance for the software component of the program or subprogram
         that are known, expected, or anticipated by the program manager.
10.10.1.5.1.1. Unit Cost Reporting (UCR) for the Software Component of a Major
Defense Acquisition Program (MDAP)
Section 2433(b)(5) of title 10, United States Code requires reporting of any significant
changes in the total program cost for development and procurement of the software
component of the program or subprogram, schedule milestones for the software
component of the program or subprogram, or expected performance for the software
component of the program or subprogram that are known, expected, or anticipated by
the program manager.
Under this reporting framework, the Initial Government Report (IGR) and/or the
contractor's Initial Developer Report (IDR) should be used as the baseline to develop a
cost and schedule estimate for the software component. (The IGR and IDR are
established within 120 days of contract award, or within 60 days of beginning a software
release, and are updated at the completion of a software increment to reflect the actual
resources incurred.) Note that the Software Resources Data Report (SRDR) includes
only software resource requirements (staffing and schedule), not cost explicitly.
However, PMs can, and should, use these parameters to compute a cost estimate.
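For illustration only, converting SRDR resource parameters into a cost figure can be as simple as applying a loaded labor rate to the reported effort. The function name and rate below are hypothetical, not a DoD-prescribed method:

```python
def software_cost_estimate(effort_staff_months: float,
                           loaded_rate_per_staff_month: float) -> float:
    """Hypothetical sketch: convert SRDR effort (staff-months) into a cost
    estimate using a fully loaded labor rate (base-year dollars per staff-month).
    Real estimates should follow the program's approved estimating methodology."""
    return effort_staff_months * loaded_rate_per_staff_month

# e.g., 1,200 staff-months at a $20,000/staff-month loaded rate yields $24.0M
estimate = software_cost_estimate(1200, 20_000)
```

In practice the PM would substitute the program's own labor rates and add non-labor elements per the approved cost-estimating structure.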
The PM's software component estimate must be documented in the Acquisition Program
Baseline and used as the basis for determining whether there are any significant
changes in the total program cost for development and procurement of the software
component of the program or subprogram, schedule milestones for the software
component of the program or subprogram, or expected performance for the software
component of the program or subprogram that are known, expected, or anticipated by
the program manager. Any such changes must be addressed in the UCR.
Any PM with an APB for an MDAP (or its subprogram) that does not currently include a
software component estimate must complete the estimate and report it in the unit cost
portion of the next program (or subprogram) SAR. A footnote must be included to
indicate that this estimate will be the baseline against which future changes in the
software component cost will be compared.
10.10.1.5.2. Unit Cost Report (UCR) Breach Reporting
If the program manager of a major defense acquisition program determines at any time
during a quarter that there is reasonable cause to believe that the Program Acquisition
Unit Cost for the program (or for a designated major subprogram under the program) or
the Average Procurement Unit Cost for the program (or for such a subprogram), as
applicable, has increased by a percentage equal to or greater than the significant cost
growth threshold or the critical cost growth threshold, the breach must be reported in
accordance with section 2433 of title 10 United States Code .
When one or more problems with the software component of the Major Defense
Acquisition Program, or any designated major subprogram under the program, have
significantly contributed to the increase in program unit costs, the action taken and
proposed to be taken to solve such problems must also be included in the Selected
Acquisition Report (SAR). The only exception to that requirement occurs when a
program acquisition unit cost increase or a procurement unit cost increase for a major
defense acquisition program or designated major subprogram results in a termination or
cancellation of the entire program or subprogram.
The Program Manager will notify the Component Acquisition Executive (CAE)
immediately whenever there is reasonable cause to believe that the current estimate
of either the Program Acquisition Unit Cost (PAUC) or Average Procurement Unit Cost
(APUC) (in base-year dollars) of a Major Defense Acquisition Program, or designated
subprogram, has increased by at least 15 percent over the PAUC or APUC objective of
the currently approved Acquisition Program Baseline (APB), respectively, or has
increased by at least 30 percent over the PAUC or APUC of the original/revised original
APB.
If the CAE determines that there is an increase in the current estimate of the PAUC or
APUC objective of at least 15 percent over the currently approved APB, or an increase
of at least 30 percent over the original APB, the CAE, based on the PM's notification,
shall inform the cognizant Head of the DoD Component of this determination. If the
cognizant Head of the DoD Component subsequently determines that there is, in fact,
an increase in the current estimate of the PAUC or APUC of at least 15 percent over the
currently approved APB, or an increase in the current estimate of the PAUC or APUC of
at least 30 percent over the original APB, the Head of the DoD Component will notify
Congress, in writing, of the determination of a significant cost breach. The notification
will be made not later than 45 days after the end of the quarter, in the case of a
quarterly report; or not later than 45 days after the date of the report, in the case of a
report based on reasonable cause. In either case, notification will include the date that
the Head of the DoD Component made the determination. In addition, the Head of the
DoD Component will submit a Selected Acquisition Report (SAR) for either the fiscal
year quarter ending on or after the determination date, or for the fiscal year quarter that
immediately precedes the fiscal year quarter ending on or after the determination date.
This SAR shall contain the additional, breach-related information.
The cognizant Head of the DoD Component shall also inform the Under Secretary of
Defense (Acquisition, Technology, and Logistics) of the significant cost breach
determination not later than five working days prior to submitting the congressional
notification.
Per section 2433a of title 10, United States Code, the Program Manager shall notify the
Department of Defense Component Acquisition Executive (CAE) immediately
whenever there is reasonable cause to believe that the current estimate of either the
Program Acquisition Unit Cost (PAUC) or Average Procurement Unit Cost (APUC)
objective of a Major Defense Acquisition Program (MDAP), or designated subprogram
(in base-year dollars), has increased by at least 25 percent over the PAUC or APUC
objective of the currently approved Acquisition Program Baseline (APB) estimate, or by
at least 50 percent over the PAUC or APUC objective of the original/revised original
APB (a "Nunn-McCurdy" breach).
If the CAE determines that there is an increase in the current estimate of the PAUC or
APUC objective of at least 25 percent over the currently approved APB, or an increase
in the current estimate of PAUC or APUC objective of at least 50 percent over the
original APB, the CAE, based upon the PM's notification, shall inform the cognizant Head
of the DoD Component of this determination. If the cognizant Head of the DoD
Component subsequently determines that there is, in fact, an increase in the current
estimate of the PAUC or APUC of at least 25 percent over the currently approved APB,
or an increase in the PAUC or APUC of at least 50 percent over the original APB, the
Head of the DoD Component shall notify Congress, in writing, of the determination of a
critical cost breach. The notification shall be not later than 45 days after the end of the
quarter, in the case of a quarterly report; or not later than 45 days after the date of the
report, in the case of a report based on reasonable cause. In either case, notification
shall include the date that the Head of the DoD Component made the determination. In
addition, the Head of the DoD Component shall submit a Selected Acquisition Report
(SAR) for either the fiscal year quarter ending on or after the determination date, or for
the fiscal year quarter that immediately precedes the fiscal year quarter ending on or
after the determination date. This SAR shall contain the additional critical cost breach-
related information.
The cognizant Head of the DoD Component shall also inform the Under Secretary of
Defense (Acquisition, Technology, and Logistics) (USD(AT&L)) of the critical cost
breach determination not later than five working days prior to submitting the
congressional notification.
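The significant (15%/30%) and critical Nunn-McCurdy (25%/50%) unit cost thresholds described above reduce to straightforward percentage arithmetic. The sketch below is illustrative only; the function names are hypothetical, and all values are PAUC or APUC figures in base-year dollars:

```python
def percent_growth(current_estimate: float, baseline: float) -> float:
    """Percent growth of the current unit cost estimate over a baseline objective."""
    return (current_estimate - baseline) / baseline * 100.0

def classify_unit_cost_breach(current_estimate: float,
                              approved_apb_objective: float,
                              original_apb_objective: float) -> str:
    """Classify PAUC/APUC growth: 'significant' at >=15% over the currently
    approved APB or >=30% over the original APB; 'critical' (Nunn-McCurdy)
    at >=25% over the currently approved APB or >=50% over the original APB."""
    vs_current = percent_growth(current_estimate, approved_apb_objective)
    vs_original = percent_growth(current_estimate, original_apb_objective)
    if vs_current >= 25.0 or vs_original >= 50.0:
        return "critical"
    if vs_current >= 15.0 or vs_original >= 30.0:
        return "significant"
    return "none"
```

For example, a current PAUC estimate of $12.5M against a $10.0M current APB objective is 25 percent growth, which meets the critical (Nunn-McCurdy) threshold even before comparison with the original baseline.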
Per section 2433a of title 10, United States Code, the USD(AT&L), after consultation
with the Joint Requirements Oversight Council regarding program requirements, shall
determine the root cause or causes of the critical cost growth in accordance with
applicable statutory requirements and DoD policies, procedures, and guidance based
upon the root cause analysis conducted by the Director, Performance Assessments and
Root Cause Analyses (DPARCA); and in consultation with the Director, Cost
Assessment and Program Evaluation (DCAPE), shall carry out an assessment of:
    1. The projected cost of completing the program if current requirements are not
       modified;
    2. The projected cost of completing the program based on reasonable modification
       of such requirements;
    3. The rough order of magnitude of the costs of any reasonable alternative system
       or capability; and
    4. The need to reduce funding for other programs due to the growth in cost of the
       program.
After conducting the reassessment, the USD(AT&L) shall terminate the program unless
the USD(AT&L) submits a written certification to Congress before the end of the 60-day
period beginning on the day the SAR containing the unit cost information is required to
be submitted to Congress. The written certification shall be accompanied by a report
presenting the root cause analysis and assessment, and the basis for each
determination made in accordance with the certification criteria specified in section
2433a of title 10, United States Code, together with supporting documentation.
If the USD(AT&L) elects not to terminate a MDAP that has experienced critical cost
growth, the USD(AT&L) shall:
    1. Restructure the program in a manner that addresses the root cause or causes of
       the critical cost growth, as identified by the actions described above, and ensure
       that the program has an appropriate management structure as set forth in the
       written certification;
    2. Rescind the most recent milestone approval for the program or designated
       subprograms and withdraw any associated certification(s) pursuant to section
       2366a or 2366b of title 10, United States Code;
    3. Require a new milestone approval for the program or designated subprograms
       before taking any contract action to enter a new contract, exercise an option
       under an existing contract, or otherwise extend the scope of an existing contract
       under the program, except to the extent determined necessary by the MDA, on a
       non-delegable basis, to ensure that the program can be restructured as intended
       by the Secretary of Defense without unnecessarily wasting resources; and
    4. Include in the report a description of all funding changes made as a result of the
       growth in cost of the program, including reductions made in funding for other
       programs to accommodate such cost growth. (The report specified here is the
       first SAR for the program submitted after the President submits a budget in the
       calendar year following the year in which the program was restructured.)
If, subsequent to a critical breach and based on a cost assessment and root cause
analysis, the MDA determines that, after eliminating the cost increase attributed to a
quantity change, the remaining increase to the PAUC is 5 percent or less against the
current baseline and 10 percent or less against the original baseline, two of the
requirements of section 2433a of title 10, United States Code may be waived. This
waiver is only applicable if the change in quantity was not made as a result of an
increase in program cost, a delay in the program, or a problem meeting program
requirements.
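The quantity-adjustment test above can be sketched as a simple eligibility check. The function name and percentage inputs are illustrative, and the growth figures are assumed to have already had quantity-change effects removed:

```python
def quantity_waiver_eligible(adjusted_growth_vs_current: float,
                             adjusted_growth_vs_original: float,
                             quantity_change_caused_by_program_problems: bool) -> bool:
    """After eliminating cost growth attributed to a quantity change, the waiver
    may apply only if the remaining PAUC increase is 5% or less against the
    current baseline AND 10% or less against the original baseline, and the
    quantity change was not itself driven by cost growth, delay, or problems
    meeting program requirements."""
    if quantity_change_caused_by_program_problems:
        return False
    return (adjusted_growth_vs_current <= 5.0
            and adjusted_growth_vs_original <= 10.0)
```

Note that both conditions must hold; a 4 percent residual increase against the current baseline still fails the test if the increase against the original baseline exceeds 10 percent.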
Additionally, for each MDAP that has exceeded the critical unit cost thresholds, but has
not been terminated, the DPARCA shall conduct semi-annual reviews until 1 year after
the date a new milestone approval is received. The DPARCA shall report the results of
the semi-annual reviews to the USD(AT&L) and summarize the results in the Director's
next annual report.
If an MDAP is terminated after experiencing a critical unit cost breach, the USD(AT&L)
shall submit to Congress a written report with the following information:
If a program experiences either a PAUC or APUC increase of at least 25 percent over
the current APB or at least 50 percent over the original/revised APB, and a SAR
containing the additional unit cost breach information and a certification by the
USD(AT&L) are not submitted to Congress as required, funds appropriated for
Research, Development, Test & Evaluation, procurement, or military construction may
not be obligated for a major contract under the program.
A critical cost breach to the PAUC or APUC that results from the termination or
cancellation of an entire program will not require a critical cost breach certification by
the USD(AT&L).
Section 2366a of title 10, United States Code requires the Milestone Decision Authority
(MDA) to certify that a cost estimate for the program has been submitted, with the
concurrence of the Director of Cost Assessment and Program Evaluation, and that the
level of resources required to develop and procure the program is consistent with the
priority level assigned by the Joint Requirements Oversight Council (JROC).
Section 2366a also requires the Program Manager (PM) to notify the MDA if:
    •    The projected cost of the certified program, at any time before Milestone B,
         exceeds the cost estimate submitted at the time of certification by at least 25
         percent; or
    •    The time period required for delivery of an Initial Operational Capability (IOC)
         exceeds the schedule objective established in accordance with section 181(b)(5)
         of title 10, United States Code by more than 25 percent.
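The two notification triggers above can be sketched as a single check. The function name is illustrative; note the statute's asymmetry, with cost growth triggering at "at least" 25 percent while schedule growth triggers only at "more than" 25 percent:

```python
def must_notify_mda(certified_cost: float, projected_cost: float,
                    ioc_objective_months: float, projected_ioc_months: float) -> bool:
    """Section 2366a notification test (sketch): before Milestone B, the PM
    must notify the MDA if the projected cost exceeds the certified cost
    estimate by at least 25%, or the time to IOC exceeds the schedule
    objective by more than 25%."""
    cost_growth_pct = (projected_cost - certified_cost) / certified_cost * 100.0
    schedule_growth_pct = ((projected_ioc_months - ioc_objective_months)
                           / ioc_objective_months * 100.0)
    return cost_growth_pct >= 25.0 or schedule_growth_pct > 25.0
```

So a program whose IOC slips from 48 to exactly 60 months (25 percent) does not yet trigger the schedule arm, while a cost estimate that grows from $100M to exactly $125M does trigger the cost arm.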
The MDA, in consultation with the JROC, must then determine whether the level of
resources required to develop and procure the program remains consistent with the
priority assigned by the JROC. The MDA may withdraw the MS A certification or rescind
the MS A approval if the MDA determines that such action is in the interest of national
defense.
Not later than 30 days after the PM submits a notification to the MDA, the MDA must
submit a report to the congressional defense committees that:
               o    are reasonable; and
               o    the management structure for the program is adequate to manage
                    and control program development cost and schedule.
          o    A plan for terminating the development of the program or withdrawal of
               Milestone A approval, if the Milestone Decision Authority determines that
               such action is in the interest of national defense.
For programs that are expected to be Major Defense Acquisition Programs, the Office of
the Under Secretary of Defense for Acquisition, Technology, and Logistics
(OUSD(AT&L)) will ensure that the program cost estimate and the IOC objective are
documented in the Milestone A Acquisition Decision Memorandum (ADM).
Program Managers are required to submit current program status with respect to the
original cost estimate and IOC objective as captured in the MS A ADM on a quarterly
basis via the Defense Acquisition Executive Summary tool in the Defense Acquisition
Information Management Retrieval System. Reporting will begin in the first quarter
following the Milestone A decision approval and will continue until Milestone B approval
is granted for the program.
10.11.2.2. Submitting the Major Automated Information System (MAIS) Annual
Report (MAR)
The FY07 National Defense Authorization Act (NDAA), Section 816, instituted a
reporting regime requiring MAIS programs to submit annual and quarterly reports. This
was codified in Chapter 144A of title 10, United States Code and has been amended
several times.
Briefly, the statute defines dollar thresholds for Major Automated Information System
(MAIS) programs and other investments required to report. A MAIS Annual Report
(MAR) is due to Congress 45 days after submission of the President's Budget, and each
quarter a MAIS Quarterly Report (MQR) is due to "a senior Department of Defense
official responsible for a MAIS program," hereafter referred to as the Senior Official.
The statute also describes reports that are due to the congressional defense
committees if a Program Manager (PM) estimates a Significant or Critical Change and
the Senior Official agrees. As shown in Table 10.11.T1 below, Significant and Critical
Changes can occur in performance, schedule, and/or cost.
                           Table 10.11.T1. Significant and Critical Changes

     Cost (total acquisition; total life-cycle)
          Significant: 15-25% increase
          Critical:    25% or greater increase

     Schedule
          Significant: delay of more than 6 months but less than 1 year
          Critical:    delay of 1 year or more, or failure to achieve FDD within
                       5 years after the MS A decision or the date when the
                       preferred alternative was selected and approved by the
                       MDA (see 10.11.5.2)

     Performance
          Significant: significant adverse change in expected performance
          Critical:    undermines the ability of the system to perform its
                       mission as originally intended (i.e., did not meet a KPP
                       threshold)

     Report to congressional defense committees
          Significant: notification due 45 days after the MQR was due in the
                       office of the Senior Official
          Critical:    Program Evaluation and Report due 60 days after the MQR
                       was due in the office of the Senior Official
For additional information please see the Chapter 144A Key Documents and
References. A complete copy of this DAG implementation guidance is also available
there.
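The cost and schedule rows of Table 10.11.T1 can be expressed as small classification functions. This is a sketch with illustrative names, not part of any DoD reporting tool:

```python
def classify_cost_change(percent_increase: float) -> str:
    """Cost row of Table 10.11.T1: a 15-25% increase in total acquisition or
    total life-cycle cost is Significant; 25% or more is Critical."""
    if percent_increase >= 25.0:
        return "critical"
    if percent_increase >= 15.0:
        return "significant"
    return "none"

def classify_schedule_change(delay_months: float,
                             achieved_fdd_within_5_years: bool = True) -> str:
    """Schedule row: a delay of more than 6 months up to 1 year is Significant;
    a 1-year delay, or failure to achieve FDD within 5 years of the MS A
    decision (or preferred-alternative approval), is Critical."""
    if delay_months >= 12.0 or not achieved_fdd_within_5_years:
        return "critical"
    if delay_months > 6.0:
        return "significant"
    return "none"
```

For instance, an 8-month schedule slip is a Significant change, while a 3-month slip combined with failure to achieve FDD within the 5-year window is Critical.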
Chapter 144A of title 10 United States Code requires annual and quarterly reports for
each MAIS program and each other major information technology investment program
for which funds are requested by the President in the budget.
10.11.1.1. Major Automated Information System (MAIS) Programs versus
Increments
In the Defense acquisition context, the terms "Program" and "Increment" refer to the
management structure of the acquisition effort. Information System (IS) acquisitions
require a short cycle time, so the Increment has become the basic unit for management
of an IS acquisition.
Increment - the Increment is "a militarily useful and supportable operational capability
that can be developed, produced, deployed, and sustained. Each Increment must have
an Acquisition Program Baseline (APB) with its own set of threshold and objective
values set by the user." (DODI 5000.02, Encl.2, 2.c.) In the context of an IS acquisition,
this means that both threshold and objective values for cost, schedule, and performance
parameters must be established for each Increment.
Program - the term "Program" in the IS context refers to the summation of a
succession of Increments, and is a consolidation of acquisition efforts that is useful for
Planning, Programming, Budgeting, and Execution System purposes. An IS "Program"
does not have its own APB; rather, each "Program" Increment has its own APB and is a
separate acquisition program (as defined in DoDD 5000.01).
For a more complete discussion of Programs and Increments, see the AIS Acquisition
Terms of Reference and Definitions .
A MAIS Program is defined in Chapter 144A of title 10, United States Code as "a
Department of Defense acquisition program for an Automated Information System
(either as a product or a service)" that meets certain statutory criteria, including dollar
thresholds. The MAIS threshold definition is statutory (per title 10 U.S.C., Chapter
144A) and is explained in Table 1 of DoD Instruction 5000.02:
    •    $32 million in fiscal year (FY) 2000 constant dollars for all expenditures, for all
         increments, regardless of the appropriation or fund source, directly related to the
         AIS definition, design, development, and deployment, and incurred in any single
         fiscal year; or
    •    $126 million in FY 2000 constant dollars for all expenditures, for all increments,
         regardless of the appropriation or fund source, directly related to the AIS
         definition, design, development, and deployment, and incurred from the
         beginning of the Materiel Solution Analysis Phase through deployment at all
         sites; or
    •    $378 million in FY 2000 constant dollars for all expenditures, for all increments,
         regardless of the appropriation or fund source, directly related to the AIS
         definition, design, development, deployment, operations and maintenance, and
         incurred from the beginning of the Materiel Solution Analysis Phase through
         sustainment for the estimated useful life of the system.
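Because a program qualifies if it exceeds any one of the three dollar thresholds, the test can be sketched as a simple disjunction. Variable and function names are illustrative; all amounts are in millions of FY 2000 constant dollars, summed across all increments regardless of appropriation:

```python
# MAIS dollar thresholds in FY 2000 constant dollars (millions), per
# 10 U.S.C. Chapter 144A as explained in Table 1 of DoDI 5000.02.
SINGLE_YEAR_M = 32.0    # any single fiscal year
ACQUISITION_M = 126.0   # Materiel Solution Analysis through deployment at all sites
LIFE_CYCLE_M = 378.0    # Materiel Solution Analysis through sustainment

def meets_mais_threshold(max_single_year_m: float,
                         total_acquisition_m: float,
                         total_life_cycle_m: float) -> bool:
    """A program meets the MAIS dollar definition if ANY one of the three
    FY2000 constant-dollar totals is exceeded."""
    return (max_single_year_m > SINGLE_YEAR_M
            or total_acquisition_m > ACQUISITION_M
            or total_life_cycle_m > LIFE_CYCLE_M)
```

So a program that never exceeds $32M in a single year can still qualify as a MAIS if, for example, its total development and deployment expenditures exceed $126M.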
Chapter 144A of title 10, United States Code extends coverage of the reporting
requirements to pre-MAIS Programs and other investments in Automated Information
Systems (AIS).
Section 817 of the Fiscal Year 2010 National Defense Authorization Act amended
Section 2445d of title 10, United States Code, giving the Secretary of Defense authority
to designate a program that both meets the definition of a MAIS and meets or exceeds
the cost threshold for an MDAP to be treated only as a MAIS or only as an MDAP.
While these criteria will be employed as a general rule, other factors will also be
considered in determining whether to designate a program a MAIS or an MDAP, and
will be applied on a case-by-case basis.
10.11.1.5. Ending the Requirement to Report under Chapter 144A of title 10 United
States Code ; Close-out Reports
A program's requirement to report under Chapter 144A may not arise, or may come to
an end, for several reasons. The Under Secretary of Defense (Acquisition, Technology,
and Logistics), or designee, will make this determination based on consideration of the
facts, including:
    •    The program does not meet, or no longer meets, the definitions presented above
         in 10.11.1.2;
    •    The program has been terminated*; or
    •    The program has achieved full deployment (FD)**.
NOTES:
** Full Deployment is achieved according to the terms of an Acquisition Decision
Memorandum (ADM) which documents a Full Deployment Decision (FDD). The FDD
ADM should:
Chapter 144A of title 10 United States Code requires the Secretary of Defense to
"submit to Congress each calendar year, not later than 45 days after the President
submits to Congress the budget justification documents regarding cost, schedule and
performance for each [ Program Required to Report] for which funds are requested by
the President in the budget." DoD meets this requirement by preparing for each
program a report called the MAIS Annual Report (MAR). The MAR should be
unclassified. If the required information is classified, then the classified data is replaced
with the word "CLASSIFIED."
The MAR is prepared using the Defense Acquisition Management Information Retrieval
(DAMIR) tool. A separate MAR for each Increment is prepared by the Program Manager
and consists of the following sections: Program Information, Points of Contact, Program
Description, Business Case, Program Status, Schedule, Performance Characteristics,
and Cost. Do not report Increments that have submitted a close-out MAR. The DAMIR
MAR Users Guide explains how to prepare the report.
Program Managers should submit the MAR via the Defense Acquisition Management
Information Retrieval (DAMIR) tool to the DoD Component Acquisition Executive (CAE)
(or equivalent official). The CAE's designated representative will then release the
unclassified reports through the established DAMIR hierarchy.
Components will submit Final Draft reports as detailed above for Office of Secretary of
Defense (OSD)-level review and coordination by the second Friday of January each
year. The Office of the Under Secretary of Defense (Acquisition, Technology, and
Logistics) (OUSD(AT&L)) will coordinate the OSD-level review and provide feedback to
the Components through issue resolution teleconferences held during the second week
of February.
Components will release a Final MAR not later than the last Friday in February.
OUSD(AT&L) will prepare and coordinate transmittal letters and release (via DAMIR)
the final MARs to Congress no later than 45 days after submission of the President's
Budget (normally the first Monday in February). Table 10.11.2.2.T1 describes a typical
reporting cycle.
                    Table 10.11.2.2.T1. Typical MAR Reporting Cycle

     Event                                     Responsible Party   Typical Target Date
     Train the Component Trainers              OUSD(AT&L)          Nov 15
     Task Components for MAR cycle             OUSD(AT&L)          Dec 10
     Submit final Draft MARs                   Components          Jan 15
     Review and consolidate feedback to
       the OSD acquisition analyst             OSD staff           Feb 5
     OSD/Component issue resolution
       teleconferences                         OSD & Components    Feb 10
     Release Final MARs to OSD                 Components          Feb 25
     Hold final OSD MAR Reviews                OUSD(AT&L)          Mar 2
     Coordinate MAR package within OSD         OUSD(AT&L)          Mar 3-Mar 10
     Staff MAR package to USD(AT&L)
       for signature                           OUSD(AT&L)          Mar 12
     Sign MAR transmittal letters              USD(AT&L)           Mar 18
     Deliver MAR "transmittal" letters to
       Congress; release MARs via DAMIR        OUSD(AT&L)          Mar 20
Chapter 144A of title 10, United States Code requires the Program Manager to submit a
written MQR to the Senior Official that identifies any variance from the projected
schedule, cost, or key performance parameters as baselined in the Major Automated
Information System (MAIS) Annual Report (MAR). All Programs Required to Report,
once having submitted a MAR, will submit MQRs even if they have not experienced any
variance from their cost, schedule, or performance baseline.
Although a separate report, the MQRs follow the Defense Acquisition Executive
Summary (DAES) submission cycle and bear the same date as the program's DAES.
The Component Acquisition Executive's representative should release the MQRs (via
the Defense Acquisition Management Information Retrieval tool) to the Senior Official
(and to the OSD Lead) on the last business day of every third month, maintaining the
DAES group reporting rotation. The OSD Lead may review MQRs to assist
Components in compliance with Chapter 144A of title 10, United States Code, and this
guidance.
The Defense Acquisition Management Information Retrieval (DAMIR) tool will adapt the
most recent MAR or MQR (whichever is more recent) to create each new MQR.
Instructions for the MQR can be found in the DAMIR MQR Users Guide.
The Program Manager should update information that has changed and summarize any
program variances not previously reported in an MQR in the Program Status section.
The "Current Estimate or Actual" columns for each of the cost, schedule, and
performance factors should be updated to reflect the Current Estimate on the as-of-date
of the MQR.
The Program Manager's (PM's) Current Estimate is the latest estimate of program
acquisition cost, schedule milestone dates, and performance characteristic values of the
approved program (i.e., the approved program as reflected in the currently approved
Acquisition Program Baseline, Acquisition Decision Memorandum, or in any other
document containing a more current decision of the Milestone Decision Authority or
other approval authority).
    •    For cost, the Current Estimate is normally the President's budget plus or minus
         fact-of-life changes.
    •    For schedule, the Current Estimate is normally the PM's best estimate of current
         schedule milestone dates.
    •    For performance, it is normally the PM's best estimate of current performance
         characteristic values.
Program Managers (PMs) are responsible for reporting the execution status of their
programs to their acquisition management chain: Program Executive Officer,
Component Acquisition Executive, Milestone Decision Authority, and, for Chapter 144A
Quarterly Report purposes, the Senior Official. If a PM becomes aware the program will
experience a variance exceeding a Significant or Critical Change threshold, the PM
should immediately notify his/her acquisition management chain, in advance of the due
date for the next MQR. Since the MQR is the vehicle for official notification of Significant
and Critical changes, the 45- or 60-day deadlines for reporting to Congress are
established from the date the MQR is due to the office of the Senior Official, i.e., the last
business day of the month the MQR is due.
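For illustration only, the deadline arithmetic described above can be sketched as a short, purely notional calculation. The function names are hypothetical (not part of DAG or DAMIR guidance), and federal holidays are ignored for simplicity:

```python
from datetime import date, timedelta

def last_business_day(year: int, month: int) -> date:
    """Last weekday of the given month (federal holidays ignored for simplicity)."""
    # Start from the last calendar day of the month.
    if month == 12:
        d = date(year, 12, 31)
    else:
        d = date(year, month + 1, 1) - timedelta(days=1)
    # Walk backward past Saturday (weekday 5) and Sunday (weekday 6).
    while d.weekday() >= 5:
        d -= timedelta(days=1)
    return d

def congressional_deadline(mqr_year: int, mqr_month: int, days: int) -> date:
    """Reporting deadline: the MQR due date (last business day of the month
    the MQR is due) plus 45 days (Significant Change) or 60 days (Critical Change)."""
    return last_business_day(mqr_year, mqr_month) + timedelta(days=days)
```

For example, an MQR due in March 2013 would be due on Friday, March 29, and the 45- and 60-day clocks for congressional notification would run from that date.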
The (staff office of a) Senior Official should 1) promptly review a Major Automated
Information System (MAIS) Quarterly Report (MQR) to determine whether it reflects a
less-than-"Significant" (or no) variance, a "Significant Change," or a "Critical Change" in
cost, schedule, or performance and 2) each month promptly provide the MQR to the
Senior Official. Senior Officials may choose to obtain independent opinions on the
measurement of a variance and the proper determination of a Change.
If, based on the MAIS Quarterly Report (MQR), the Senior Official makes a
determination that a Significant Change has occurred, he or she must notify the
congressional defense committees in writing of that determination not later than 45 days
after the MQR was due.
10.11.4.1. Significant Change Thresholds
A Significant Change is defined as one in which any of the following has occurred:
    •    There has been a schedule change that will cause a delay of more than 6
         months but less than 1 year in any program schedule milestone or significant
         event from the schedule submitted as the Original Estimate;
    •    The estimated total acquisition cost or total life-cycle cost for the program has
         increased by at least 15 percent, but less than 25 percent, over the Original
         Estimate; or
    •    There has been a significant, adverse change in the expected performance from
         the parameters submitted in the original MAR. The Department, however, has
         determined that a "significant, adverse change" is defined as a failure to meet a
         Key Performance Parameter (KPP) threshold value, which is the same definition
         chosen for a Critical Change in performance (addressed below). Therefore, all
         such failures will be determined to be Critical Changes.
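For illustration only, the threshold logic above (together with the Critical Change thresholds discussed in 10.11.5) can be sketched as follows. The function and its inputs are hypothetical; actual variance determinations rest with the Senior Official:

```python
def classify_change(schedule_slip_months: int,
                    cost_growth_pct: float,
                    kpp_threshold_missed: bool) -> str:
    """Classify a variance against the Chapter 144A reporting thresholds.

    Schedule: a slip of more than 6 but less than 12 months is Significant;
    12 months or more is Critical.  Cost: growth of at least 15 but less than
    25 percent over the Original Estimate is Significant; 25 percent or more
    is Critical.  Performance: any KPP threshold failure is treated as
    Critical (the Department reports no Significant performance changes).
    """
    if kpp_threshold_missed or schedule_slip_months >= 12 or cost_growth_pct >= 25:
        return "Critical"
    if schedule_slip_months > 6 or cost_growth_pct >= 15:
        return "Significant"
    return "None"

print(classify_change(8, 10.0, False))   # a 7-to-11-month slip alone -> "Significant"
print(classify_change(4, 26.0, False))   # cost growth of 25% or more -> "Critical"
```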
When a Significant Change is determined, the Senior Official must notify the
congressional defense committees in writing that he or she has made such a
determination. The Notification should be in the form of a one-to-two-page letter signed
by the Senior Official and is due to the congressional defense committees not later than
45 days after the date the MAIS Quarterly Report (MQR) was due in the office of the
Senior Official.
10.11.4.3. Coordination and Transmittal of a Significant Change Notification to
Congress
Notifications are drafted by Program Managers and coordinated with their respective
Program Executive Officers and Component Acquisition Executives (CAE) for signature
by the Senior Official. The Notification must be coordinated with the Under Secretary of
Defense (Acquisition, Technology, and Logistics), the Deputy Chief Management
Officer, or the DoD Chief Information Officer, as appropriate, before it is sent to
Congress. Copies of Notifications should be sent to the cognizant Overarching
Integrated Product Team (OIPT) Leader before transmittal to Congress. Example
Significant Change Notifications are available.
10.11.5. Critical Changes
When the Senior Official anticipates or makes a determination that a Critical Change
has occurred, the Senior Official should initiate a process to satisfy the statutory and
regulatory requirements. This section describes those requirements and sets forth a
model process.
A Critical Change is defined as one in which any of the following has occurred:
    •    The system failed to achieve a full deployment decision (FDD) within 5 years
         after the Milestone A decision or, if there was no Milestone A, within 5 years
         after the date when the preferred alternative was selected and approved by the
         Milestone Decision Authority (this threshold is more fully explained in section
         10.11.5.2, below);
    •    There has been a schedule change that will cause a delay of one year or more in
         any program milestone or significant event from the schedule originally submitted
         to Congress in the Major Automated Information System (MAIS) Annual Report
         (MAR);
    •    The estimated total acquisition cost or total life-cycle cost for the program has
         increased by 25 percent or more over the Original Estimate submitted to
         Congress in the MAR; or
    •    There has been a change in the expected performance of the MAIS that will
         undermine the ability of the system to perform the functions anticipated at the
         time information on the program was originally submitted to Congress in the
         MAR. The Department has determined that a critical performance change is
         defined as a failure to meet a Key Performance Parameter threshold value.
The phrase "failed to achieve" is interpreted literally; i.e., the Increment must have
actually exceeded (not expected to exceed) five years between start of the 5-year
development clock and FDD. A breach of this threshold will therefore be reported in the
Major Automated Information System (MAIS) Quarterly Report (MQR) next due after
the 5-year point.
If, however, any other Critical Change is reported in advance of the 5-year point and it is
expected that FDD will not occur within the 5-year threshold, include an additional
determination of the 5-year-to-FDD breach in the evaluation and report to Congress.
When the 5-year point arrives, re-send the same report to Congress with a transmittal
letter indicating that "the previously reported certifications were meant to apply now."
If there is no reason to determine and report any Critical Change in advance of failure to
achieve FDD within 5 years, such determination, evaluation, report, and certification will
be accomplished after the 5-year point is reached in accordance with the first paragraph
of this section.
For Acquisition Category III programs that are graduating to MAIS status and have
achieved an FDD (no matter how long it took), that event has overcome the 5-year-to-
FDD breach criterion, and it is no longer applicable. Graduating programs carry their
program history with them (including the date the 5-year development clock was
started).
The 5-year development clock starts when the automated information system or
information technology investment is granted Milestone A approval for the program, or if
there was no Milestone A decision, the date when the preferred alternative is approved
by the Milestone Decision Authority (excluding any time during which program activity is
delayed as a result of a bid protest).
Because schedule events and thresholds are expressed in whole months, the additional
time to be added as a consequence of a bid protest is calculated by dividing the bid
protest time lost in days by 30 and rounding up to the next whole month. For example, if
a bid protest was filed on October 26, 2011 and resolved on December 20, 2011 (53
days), the 5-year development clock would be extended two months (53 / 30 = 1.77,
rounded up to 2 months).
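The rounding rule above can be sketched as a minimal, purely illustrative calculation (function names are hypothetical, not part of the Guidebook):

```python
import math

def protest_extension_months(protest_days: int) -> int:
    """Bid-protest time lost, converted to whole months: days / 30, rounded up."""
    return math.ceil(protest_days / 30)

def development_clock_months(protest_days: int = 0) -> int:
    """Length of the 5-year (60-month) development clock after any extension."""
    return 60 + protest_extension_months(protest_days)

# The worked example above: a 53-day protest extends the clock by 2 months,
# so the deadline to achieve FDD becomes 62 months from the clock start.
print(protest_extension_months(53))
print(development_clock_months(53))
```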
With respect to a MAIS program, the Full Deployment Decision is the final decision
made by the Milestone Decision Authority (MDA) authorizing an Increment of the
program to deploy software for operational use. Each Increment can have only one
FDD. The 5-year development clock stops when the MDA signs the FDD Acquisition
Decision Memorandum.
If the Increment will have multiple partial deployments, the MDA should specifically
designate which partial deployment decision will serve as the FDD for the entire
Increment. At the MDA's discretion, and as specified in the Acquisition Strategy, a partial
deployment would be appropriately designated as the FDD with an accumulation of
successes related to the entire Increment, such as:
    •    IOT&E indicates that the system is operationally effective, suitable, and
         survivable;
    •    High percentage of capability fielded;
    •    High percentage of geographical fielding completed;
    •    High percentage of legacy system(s) replaced;
    •    Insignificant risk associated with remaining releases; and
    •    Achievement of Initial Operational Capability.
If the MDA has not formally specified which partial deployment will serve as the FDD, by
default, the last partial deployment will be the FDD.
Upon determination of a Critical Change, the statute directs an evaluation ("E") of the
program, including "an assessment of-
    •    (E1) "the projected cost and schedule for completing the program if current
         requirements are not modified;
    •    (E2) "the projected cost and schedule for completing the program based on
         reasonable modification of such requirements; and
    •    (E3) "the rough order of magnitude of the cost and schedule for any reasonable
         alternative system or capability."
While not per se a part of the Critical Change Report that will be submitted to the
congressional defense committees, these three "E" assessments will feed into the four
certification ("C") areas of the Critical Change Report described below.
The statute further directs delivery of a report (i.e., a Critical Change Report (CCR)) to
the congressional defense committees, including "a written certification (with supporting
explanation)" stating that:
    •    (C1) the automated information system or information technology investment to
         be acquired under the program is essential to the national security or to the
         efficient management of the Department of Defense;
    •    (C2) there is no alternative to the system or information technology investment
         which will provide equal or greater capability at less cost;
    •    (C3) the new estimates of costs, schedule, and performance parameters with
         respect to the program and system or information technology investment, as
         applicable, have been determined, with the concurrence of the Director of Cost
         Assessment and Program Evaluation, to be reasonable; and
    •    (C4) the management structure for the program is adequate to manage and
         control program costs.
To avoid a prohibition on the obligation of funds for major contracts, the report must be
submitted to the congressional defense committees not later than 60 days after the date
the Major Automated Information System (MAIS) Quarterly Report (MQR) was due to
the staff office of the Senior Official.
For Acquisition Category (ACAT) IAM programs, Triage Meeting attendees should be
senior representatives from 1) the staff office of the Senior Official, 2) the office of the
Joint Chiefs of Staff (J8, Force Structure, Resources and Assessment), 3) the office of
the Deputy Director, Program Evaluation (plus the office of the Deputy Director Cost
Analysis if the Under Secretary (Acquisition, Technology and Logistics) is the Milestone
Decision Authority), 4) the office of the Director, Acquisition Resources & Analysis, and
5) the OSD office with program oversight responsibility (Overarching Integrated Product
Team, Investment Review Board, or equivalent).
For ACAT IAC programs, Triage Meeting attendees should be from analogous
Component organizations.
The staff office will document the recommendations of the Triage Meeting in a draft
Determination and Tasking memorandum to be signed by the Senior Official. The
"Determination and Tasking" memorandum will:
    •    State the Senior Official's determination and nature of the Critical Change;
    •    Direct a program evaluation be conducted;
    •    Direct a report of the results be prepared; and
    •    Designate leadership of a Critical Change Team (CCT) to manage the process.
As part of the "Determination and Tasking" memorandum, the Senior Official should
establish leadership for a CCT to conduct the program evaluation and produce the
Critical Change Report. A Team Leader from an appropriate oversight or program
integration office under the Senior Official will organize the CCT and integrate the
contributions of the several IPTs. The Team Leader should be an O-5/O-6 or equivalent
civilian. If the magnitude of the program warrants it, a Flag/General Officer/Senior
Executive Service-level "Integrated Product Team (IPT) Principals Lead" from the
Senior Official's staff should be named to provide advice and direction to the CCT, as
well as to chair meetings of a committee of "IPT Principals." Figure 10.11.5.5.2.F1. is a
notional depiction of CCT Organization and Reporting Paths.
Ultimately, the Senior Official must be satisfied sufficiently with the evaluation and report
to sign the certification statements required by the statute. When the Senior Official
perceives the need to specify leadership or membership of individual IPTs, that
specification should also be made as part of the "Determination and Tasking"
memorandum. Otherwise, the IPT Principals Lead and Team Leader will select
individual members and leadership of the IPTs that will focus on certifications C1-4.
Membership should include all interested parties, and individuals must be empowered
to represent their organizations. In all cases, IPT membership and leadership
designations should consider joint/departmental interests as well as the circumstances
of the Critical Change.
The CCT should meet with the IPT Principals Lead as necessary, and at least once for a mid-process progress check.
Eventually, the CCT should meet to pre-brief the IPT Principals Lead on the final
Report. The final Report and briefing should then be presented to the IPT Principals for
a final review of the Report before delivery to the Senior Official for certification
(signature).
The Critical Change process should be conducted by IPTs under the Critical Change
Team (CCT), each focused on Certifications 1-4. To preserve IPT and CCT
independence to the maximum extent practicable, team membership should be
independent of the Program Management Office (PMO). IPT membership should be
selected to maximize the success of the group and avoid non-productive contributions.
For Acquisition Category (ACAT) IAM programs, IPT membership is suggested below.
For ACAT IAC programs, the IPT membership representatives should be from
analogous Component organizations plus the appropriate OSD organizations.
    •    IPT C1 will document the explanation that permits the Senior Official to certify
         "the automated information system or information technology investment to be
         acquired under the program is essential to the national security or to the efficient
         management of the Department of Defense." The IPT C1 should write a few
         paragraphs about the need for the program:
             o Include threat, mission, and current systems available to meet the threat
                 or efficient management need.
             o Reference relevant strategy documents, Concept of Operations
                 (CONOPS), roadmaps, requirements documents, threat assessments,
                 Quadrennial Defense Review, etc.
             o Address the program and the capability to be acquired, as appropriate.
              o IPT C1 members: Component operations staff, Program Executive Officer
                 (PEO) staff, Component Acquisition Executive (CAE) staff, user
                 representatives, Program Manager (PM), Joint Chiefs of Staff (JCS)/J8,
                 Office of the Secretary of Defense (OSD) (Principal Staff Assistant (PSA)
                 and the Overarching Integrated Product Team (OIPT) acquisition
                 analysts).
    •    IPT C2 will document the explanation that permits the Senior Official to certify
         that "there is no alternative to the system or information technology investment
         which will provide equal or greater capability at less cost." This IPT should:
             o Reference any existing Analysis of Alternatives (AoA) and discuss any
                 major deviations from past analysis. Do not re-accomplish the AoA.
             o Identify any alternative systems.
             o Include the assessment (E3) of the "rough order of magnitude of the cost
                 and schedule for any reasonable alternative system or capability."
              o IPT C2 members: Component operations staff, user representatives,
                 Component & program office cost estimators, PM, CAE and PEO staff;
                 JCS/J8; OSD (PSA; Office of the Deputy Director, Program Evaluation;
                and OIPT acquisition analyst).
    •    As indicated in Figure 10.11.5.5.2.F1, above, IPT C3 is responsible for
         assessing E1 and E2, forming conclusions thereupon, and recording an
         explanatory statement that permits the Senior Official to certify "the new
         estimates of costs, schedule, and performance parameters with respect to the
         program and system or information technology investment, as applicable, have
         been determined, with the concurrence of the Director of Cost Assessment and
         Program Evaluation (D, CAPE), to be reasonable." This IPT should:
             o Identify changes that have occurred to the program's requirements.
             o Summarize acquisition and total life-cycle cost growth from the Original
                Estimate. Display changes in constant (BY) and current (TY) dollars.
              o Include rationale for growth, such as technical uncertainties/corrections or
                 changes in inflation, requirements, escalation, outlay, quantity, schedule,
                 budget, or estimating errors.
             o Include the assessment (E1) about the "projected cost and schedule for
                completing the program if current requirements are not modified."
             o Include the assessment (E2) about "projected cost and schedule for
                completing the program based on reasonable modification of ...
                requirements."
              o Update the cost estimate and milestone schedule.
             o Develop a draft Acquisition Program Baseline for management approval
                concurrent with the Critical Change Report. The Original Estimate status is
                explained in 10.11.7.1.
              o IMPORTANT: In addition to concurrence, an independent cost estimate by
                 the D, CAPE may also be required. See 10.11.7.2 and 10.5.1, Independent
                 Cost Estimates, for further explanation.
             o IPT C3 members: Component operations staff, user representatives,
                Component & program office cost estimators, PM, CAE and PEO staff;
                JCS/J8; OSD (PSA; Office of the Deputy Director, Cost Assessment; OIPT
                acquisition analyst).
    •    IPT C4 will document the explanation that permits the Senior Official to certify
         "the management structure for the program is adequate to manage and control
         program costs." The IPT C4 should:
             o Review PMO and contractor management structures.
              o Conduct site visits if the IPT Principal Lead determines they would be
                useful.
             o Re-examine recent program oversight reviews and recommendations to
                appraise the degree and success of implementation.
             o Develop a draft Acquisition Decision Memorandum for the MDA to direct
                corrective actions.
              o IPT C4 members: CAE and PEO staff; PM; OSD (Office of the Assistant
                Secretary of Defense (Research and Engineering) (ASD(R&E)); Offices of
                the Deputy ASD (Developmental Test & Evaluation) and the Deputy ASD
                (Systems Engineering), Office of the Director, Defense Procurement and
                Acquisition Policy, Office of the DoD Chief Information Officer; and the
                   OIPT acquisition analyst).
              IPT Organization               C1            C2                 C3              C4
                                             (essential)   (no alternative)   (new estimate)  (management)
              PMO/PM (as required)           X             X                  X               X
              PMO Cost/Finance                             X                  X
              PEO Staff                                    X                  X               X
              CAE Staff                      X             X                  X               X
              Component Operations Staff     X             X                  X
              User Representatives           X             X                  X
              JCS/J8                         X             X                  X
              OSD Acquisition Analyst        X             X                  X               X
              DASD(SE)                                                                       X
              DASD(DT&E)                                                                     X
              AT&L(DPAP)                                                                     X
              OSD CAPE                                     X                  X
              OSD PSA                        X             X                  X               X
              DoD CIO                                                                        X
Figure 10.11.5.5.4.F1. portrays a typical Critical Change process calendar and shows
the general flow of events described in 10.11.5.5.
                    Figure 10.11.5.5.4.F1. Critical Change Process Calendar
The Critical Change Report is envisioned to be a document of about six pages: a two-
page letter that offers a succinct introduction and background on the program and the
events that led to the Critical Change and contains the required certifications, plus one
page each for the explanations provided by the Integrated Product Teams (IPTs) C1-4.
The IPT C1-4 sections include an outline of corrective actions that will be taken to add
discipline to program execution and avoid repeated deviation from the new Original
Estimate. Example CCRs are available.
In case of an audit, it is important for the Component to keep all records used to prepare
the CCR.
In accordance with section 2445c(d)(1)(B) of title 10, United States Code , CCRs must
be sent "through the Secretary of Defense, to the congressional defense committees."
In cases where the Senior Official is an individual within OSD, this will be inherent in the
CCR coordination and signature process.
In cases where the Senior Official is not an individual within OSD, the CCR shall be
signed by the Senior Official and provided to the cognizant OSD official for transmittal to
Congress. The signed CCR should be provided to the appropriate OSD official with draft
Transmittal Letters addressed to the congressional defense committees no later than 5
working days before expiration of the 60-day period.
If the Senior Official has determined that a program experienced a Critical Change and
a Critical Change Report (CCR) is not submitted to the congressional defense
committees within the 60-day period, "Appropriated funds may not be obligated for any
major contract under the program." For Chapter 144A purposes, the term "major
contract" is defined as any contract under the program, other than a firm-fixed-price
contract, whose target cost exceeds $17M (FY00 constant dollars); or, if no contract
exceeds $17M (FY00 constant dollars), the largest contract under the program.
Program Managers should not obligate funds for a major contract during the period in
which the CCR is being prepared.
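One reading of the "major contract" definition above can be sketched as follows, for illustration only. The data layout and function name are hypothetical, and the conversion of target costs to FY 2000 constant dollars is assumed to have been done elsewhere:

```python
THRESHOLD_FY00 = 17_000_000  # $17M, FY 2000 constant dollars

def major_contracts(contracts: list[dict]) -> list[dict]:
    """Select the 'major contracts' for Chapter 144A purposes.

    Each contract dict carries (hypothetical) keys: 'name',
    'target_cost_fy00' (target cost already converted to FY00 constant
    dollars), and 'firm_fixed_price' (bool).  Contracts other than
    firm-fixed-price whose target cost exceeds $17M (FY00) qualify; if
    none qualifies, the single largest contract under the program does.
    """
    qualifying = [c for c in contracts
                  if not c["firm_fixed_price"]
                  and c["target_cost_fy00"] > THRESHOLD_FY00]
    if qualifying:
        return qualifying
    # Fallback: the largest contract under the program.
    return [max(contracts, key=lambda c: c["target_cost_fy00"])]
```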
The prohibition on the obligation of funds will cease to apply on the date on which the
congressional defense committees have received a report in compliance with Chapter
144A requirements.
According to Chapter 144A of title 10 United States Code , a Critical Change is the only
opportunity to update the Original Estimate contained in the Major Automated
Information System (MAIS) Annual Report (MAR): "an adjustment or revision of the
Original Estimate or information originally submitted on a program may be treated as
the Original Estimate or information initially submitted on the program if the adjustment
or revision is the result of a Critical Change."
10.11.7.1. Status of the Critical Change Report (CCR) Estimate
The new estimates of cost, schedule, and performance parameters included in a CCR
will be the basis for a revised Original Estimate in the Major Automated Information
System (MAIS) Annual Report ( MAR ) and the MAIS Quarterly Report ( MQR ), and
inform the continuing management of the program. An Acquisition Decision
Memorandum and an Acquisition Program Baseline ( APB ) should therefore be
coordinated concurrently with the CCR to direct the actions responsive to the CCR.
If concurrent signatures cannot be obtained, the Program Manager (PM) should make
approval of an updated APB a high priority.
Once the CCR has been sent to Congress and before the next MQR is prepared, the
Program Manager should submit the new cost, schedule, and performance parameters
as an updated MAR Original Estimate using the Defense Acquisition Management
Information Retrieval (DAMIR) tool (see the DAMIR MAR Users Guide , and call the
DAMIR Hot Line for assistance). Subsequent MQRs will commence reporting variances
from the revised Original Estimate .
The Weapon Systems Acquisition Reform Act of 2009 (P.L. 111-23, May 22, 2009),
codified at section 2334(a)(6) of title 10, United States Code, requires the Director,
Cost Assessment and Program Evaluation (D, CAPE) to conduct an Independent Cost
Estimate (ICE) in the case of a Major Automated Information System (MAIS) Critical
Change if the Milestone Decision Authority (MDA) is the Under Secretary of Defense
(Acquisition, Technology, and Logistics) (USD(AT&L)), and at any other time considered
appropriate by the Director or upon the request of the USD(AT&L).
Additionally, DTM 11-009 , Acquisition Policy for Defense Business Systems (DBS),
June 23, 2011, requires the D, CAPE to conduct an ICE for all DBS MAIS reporting a
Critical Change if the MDA is the USD(AT&L), the Deputy Chief Management Officer, or
the Department of Defense Chief Information Officer. If the MDA is delegated after
incurrence of a Critical Change, an ICE is still required.
If a D, CAPE ICE was conducted, its estimate should be given great weight, as it is
likely to have the accuracy desired for publication in the Acquisition Program
Baseline.
The Base Year of an Original Estimate (as reported in the Major Automated Information
System (MAIS) Annual Report (MAR) and MAIS Quarterly Report) may be updated
without going through a Critical Change process, provided that the proper conversion
factors have been applied. Such a conversion should be footnoted in those reports
through submittal of the next MAR. The conversion calculations should be retained as a
Memorandum for the Record in the program files.
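The base-year update described above amounts to rescaling a constant-dollar amount by the ratio of the applicable indices. A minimal sketch follows; the index values are invented for illustration, and real conversions use the approved OSD inflation indices for the appropriation involved.

```python
# Hypothetical sketch of a base-year conversion for an Original Estimate.
# The index values below are invented; actual conversions use approved
# OSD inflation indices.

indices = {2000: 1.00, 2010: 1.25}  # notional constant-dollar indices

def convert_base_year(amount, from_fy, to_fy, indices):
    """Restate a constant-dollar amount from one base year to another
    by the ratio of the two base-year indices."""
    return amount * indices[to_fy] / indices[from_fy]

original_fy00 = 850.0  # $M, FY00 constant dollars
restated_fy10 = convert_base_year(original_fy00, 2000, 2010, indices)
print(restated_fy10)  # 1062.5 ($M, FY10 constant dollars)
```

The calculation itself is what would be retained as the Memorandum for the Record described above.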
The purpose of the Defense Acquisition Executive Summary (DAES) is to provide a
venue to identify and address, as early as possible, potential and actual program
issues that may impact the Department of Defense's (DoD's) on-time delivery of
promised capabilities to the
warfighter. The DAES is not just a report; it is a process.
The goal of the DAES process is to facilitate communication between, and provide
feedback to, key stakeholders in OSD, the Joint Staff, the Components, and Program
Offices. It is important to note that the DAES is an internal management system meant
to fulfill the needs of senior Department of Defense executives and is NOT for general
public consumption. Unlike Selected Acquisition Report information, DAES
information is considered For Official Use Only and is not releasable outside the
Department without prior approval from the Director, Acquisition Resources and
Analyses (ARA).
The DAES process enables the USD(AT&L) to fulfill statutory requirements to manage
and oversee MDAPs and MAIS programs. Additionally, it establishes a mechanism for
the Department to meet the Unit Cost Reporting requirement of section 2433, Chapter
144 of title 10, United States Code . Access to the data reported through the DAES also
enables the Director of Performance Assessments and Root Cause Analyses to fulfill
statutory requirements to perform program assessments as directed by the Weapon
Systems Acquisition Reform Act of 2009 ( Section 103 Public Law 111-23 ).
The DAES process for a program begins when the Under Secretary of Defense
(Acquisition, Technology, and Logistics) (USD(AT&L)) designates the program as a
DAES reporting program and the Office of the USD(AT&L), specifically the Office of the
Director, Acquisition Resources and Analyses (ARA), assigns it to a quarterly reporting
group (A, B, or C). Most DAES reporting programs are ACAT ID or IC programs and full
DAES reporting usually begins at program initiation (typically Milestone B) and after the
program has submitted its initial Selected Acquisition Report (SAR).
Each quarterly group submits its DAES in an assigned month; OSD assesses the
submission the following month, and the DAES meeting is held the month after that.
For example, the Group A DAES submitted by the PM at the end of January is
assessed by OSD in February, and the corresponding DAES meeting is held in March.
Once DAES reporting is initiated, it continues until the program is 90% or more
delivered through the production phase (or 90% expended, if RDT&E only), at which
time a program will begin submitting only a Unit Cost Report (UCR) DAES pursuant to
section 2433 of title 10, United States Code that is supplemented by Sustainment
information.
DAES reporting may be terminated for a program when it is 90% delivered or expended
and the final SAR has been submitted. The official list of active DAES reporting
programs is maintained by the Office of the Director, ARA and is available via the
Defense Acquisition Management Information Retrieval (DAMIR) system.
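The transition rule above (90% delivered through production, or 90% expended for RDT&E-only programs) can be read as a simple predicate. The sketch below is hypothetical; the field names are illustrative and do not come from any DAMIR schema.

```python
# Illustrative check (field names hypothetical) of the 90-percent rule
# described above for moving a program to UCR-only DAES reporting.

def ucr_only(program):
    """True once a program is 90% or more delivered through the production
    phase, or 90% or more expended for an RDT&E-only program."""
    if program["rdte_only"]:
        return program["expended"] / program["total_funding"] >= 0.90
    return program["delivered_units"] / program["total_units"] >= 0.90

prog = {"rdte_only": False, "delivered_units": 181, "total_units": 200}
print(ucr_only(prog))  # True: 181/200 = 90.5% delivered
```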
The DoD Components submit the DAES information to DAMIR in accordance with the
prescribed monthly or quarterly submission cycle. DAES submissions are due to OSD
on the last working day of the month. The required information consists of both the
electronic DAES information and supplemental Microsoft PowerPoint charts.
The supplemental PowerPoint charts are sent to the Component, which then e-mails
them to DAMIR@osd.mil . The DAMIR team loads the charts into the DAMIR
Acquisition Documents where they are visible to any DAMIR user with DAES access to
the program. As the DAMIR system is the mechanism that OSD uses to view and
assess the programs, it is highly recommended that each program office access DAMIR
and validate that all information and supplemental charts were correctly submitted.
Information                         Program Initiation-75%   > 75%   Minimum Update Frequency
Unit Cost                           Calculated                       Quarterly
Contracts/Earned Value              -                                Monthly
Deliveries & Expenditures           -                                Quarterly
Operating & Support Costs           -                        -       Quarterly
Sustainment                         -                        -       Quarterly
Risk Summary Chart (PowerPoint)     -                                Quarterly
Issue Summary Chart (PowerPoint)    -                                Quarterly
The DAES information submitted should reflect the Program Manager's assessment of
the program and be consistent with the Acquisition Program Baseline (APB),
President's Budget (PB), Acquisition Decision Memorandums (ADMs), and other official
program guidance. The Program Manager is responsible for ensuring the accuracy,
completeness, and consistency of the information and for elevating risks and other
issues that may require managerial attention.
The OSD and Joint Staff stakeholders collectively evaluate each program in 10
different categories. A green/yellow/red rating and an associated narrative are
provided for each category rated by an individual stakeholder. The categories
evaluated by OSD are
identical to the categories evaluated by the Program Offices. The OSD assessments
provide an independent assessment of program execution status and are used when
selecting programs to be briefed at the DAES meeting. Detailed instructions on
completing a DAES assessment can be found in the DAMIR Acquisition Documents.
Table 10.12.1.5.T1 below shows which OSD and Joint Staff stakeholders have primary
responsibility for each indicator; however, any stakeholder may evaluate any category.
                          Table 10.12.1.5.T1 List of Assessment Indicators
In addition to the program assessment process performed by the Office of the Secretary
of Defense (OSD) and Joint Staff stakeholders, the Office of the Director,
ARA/Acquisition Visibility (AV) concurrently performs a data quality assessment of the
submitted information. All information is reviewed for availability, currency, and
consistency. Non-compliance with reporting requirements is reported to the
Components and requires the immediate correction and re-submittal of information.
The DAES meeting is typically a two-hour meeting scheduled for the third week of the
month. Attendance is tightly controlled.
Once the Office of the Secretary of Defense (OSD) assessments have been submitted,
the DAES meeting agenda selection process begins. The Director, Acquisition
Resources and Analyses (ARA) chairs a DAES Program Selection meeting on
approximately the 15th working day of the month, at which the Overarching Integrated
Product Team Leaders and the Director, Performance Assessments and Root Cause
Analyses are responsible for recommending programs and/or issues for review at the
monthly DAES meetings.
The criteria for nomination vary from month to month, but nominations normally fall
within one of the following categories:
    •    Program Briefs. This is the most typical agenda item. Programs may be selected
due to specific issues that require management attention or as a good news
         story. Program assessments are just one factor used to determine if a program
         should be recommended, as all knowledge of the program is considered when
         making the selection. Program Briefs are typically presented by the Program
         Manager.
    •    Single Issue Program Updates. These are condensed briefings that normally
         focus on a previously identified issue where a short status update is required.
         Single Issue Program Updates are typically given by the OSD staff.
    •    Program Executive Officer (PEO) Portfolio Briefs. A PEO portfolio brief may be
         recommended in conjunction with a specific Program Brief or independent of a
         specific Program Brief. Briefings requested independent of a specific program
         brief are typically due to a systemic issue affecting multiple programs within the
         portfolio. PEO Portfolio Briefs are typically presented by the PEO.
Typically, 3 or 4 programs or issues are selected for the agenda each month. Normally,
programs that are within 90 days (before or after) of a Defense Acquisition Board review
are excluded from consideration. Once the agenda selection is finalized, the Office of
the Director, ARA publishes the agenda and schedule.
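The 90-day exclusion window described above can be checked mechanically. A small sketch with hypothetical dates:

```python
# Sketch of the agenda-exclusion window described above: programs within
# 90 days (before or after) of a Defense Acquisition Board (DAB) review
# are normally excluded from DAES agenda consideration. Dates are invented.
from datetime import date

def excluded_from_daes_agenda(meeting_date, dab_review_date):
    """True if the DAES meeting falls within 90 days of the DAB review,
    in either direction."""
    return abs((meeting_date - dab_review_date).days) <= 90

print(excluded_from_daes_agenda(date(2013, 6, 15), date(2013, 8, 1)))   # True (47 days)
print(excluded_from_daes_agenda(date(2013, 6, 15), date(2013, 10, 1)))  # False (108 days)
```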
Templates for the Program Briefings can be found in the Acquisition Documents section
of the Defense Acquisition Management Information Retrieval system. Primary focus
areas of the briefings should be: contract and Acquisition Program Baseline compliance
status, closure plans for known issues, risk management/mitigation of potential issues,
and Better Buying Power initiatives (to include should cost).
The Program Briefing charts are due to the Office of the Director, Acquisition Resources
and Analyses (ARA) no later than 6 working days prior to the scheduled date of the
DAES meeting.
In addition to the selected program or issue briefings, the Office of the Director, ARA
provides summaries of the Program Manager assessments and the OSD and Joint Staff
assessments for each selected program as well as an update on any outstanding
actions from previous DAES meetings. A data quality assessment update is also
provided by the Office of the Director, ARA/Acquisition Visibility. Other systemic issues
are briefed as required or directed by the Principal Deputy Under Secretary of Defense
(Acquisition, Technology, and Logistics) (PDUSD(AT&L)).
Within 5 working days of the DAES meeting, the Office of the Director, ARA submits the
DAES meeting minutes, including any action items, to the PDUSD(AT&L) for approval.
Once approved, the meeting minutes are posted in the DAMIR Acquisition Documents.
The Office of the Director, ARA is responsible for tracking DAES action items to
completion.
In 2007, the Office of the Under Secretary of Defense (Acquisition, Technology, and
Logistics) (USD(AT&L)) started an initiative to achieve Acquisition Visibility (AV) within
the Department of Defense (DoD). AV is defined as having timely access to accurate,
authoritative, and reliable information supporting acquisition oversight, accountability,
and decision making throughout the Department for effective and efficient delivery of
warfighter capabilities. AV began as a concept in early 2008 with a demonstration of
data governance and Service Oriented Architecture to support major weapons system
decision-making.
Five years later, the technology framework and governance process have solidified,
providing the Defense Acquisition Community with a capability that supports
management of Major Defense Acquisition Programs (MDAPs) and Major Automated
Information System (MAIS) programs. It is a capability that:
    •    Captures acquisition data from the Military Departments and the Office of the
         Secretary of Defense (OSD);
    •    Federates that data through a single interface; and
    •    Publishes this information through web-services where a customer can access
         the appropriate information for his or her reporting or tracking requirements.
Currently, over 180 data elements are used to provide acquisition system information,
organized into seven major categories: earned value management, unit cost, budget,
milestones, sustainment, science and technology, and program administration. AV is
now entering a phased-production environment and working to increase the number of
data elements available to the AV capability to approximately 500, which will bring
additional, relevant data to decision makers.
Based upon an Office of the Secretary of Defense enterprise decision, use of the
DAMIR system is mandatory for all Major Defense Acquisition Programs (MDAPs) and
all MAIS programs and must be employed to satisfy statutory requirements for SAR and
MAR submissions and the APB. Non-MDAP and non-MAIS programs may also use the
system.
The Director, Acquisition Resources and Analysis, has responsibility for the
development, upgrade, and maintenance of the DAMIR system. The DAMIR system
includes instructions for preparing the APB, the SAR, the MAR, the DAES, the Unit Cost
Report, and the P/BR submission (referred to in the DAMIR system as POM), including
administrative procedures. User help can be obtained from the DAMIR Hot Line or by
e-mailing damir@osd.mil .
The DAMIR system is the authoritative source for all APBs. APBs for Acquisition
Category (ACAT) I and IA programs must be created and released using the DAMIR
system. The DAMIR system provides the data entry capability and required workflow to
create and edit an APB. An APB is approved within the DAMIR system when a formal
signature page with the Milestone Decision Authority's signature is acquired; at that
point, the APB can no longer be edited. The APB Objectives and Thresholds will also be
visible within both the Selected Acquisition Report and Defense Acquisition Summary
(DAES) views in the DAMIR Purview Program View module. The full Web Services data
exchange with the Components' acquisition information systems (Army's Acquisition
Information Management, the Navy's Dashboard, and the Air Force's System Metric
and Reporting Tool) also allows the Components to pull the official APBs into their
respective systems for use in their respective DAES processes.
The DAMIR system is the authoritative source for SARs and provides the data entry
capability and required workflow to create and edit a SAR. The computational model
capability is also integrated into the DAMIR SAR module. DAMIR provides extensive
data checks, ensuring that a SAR is not released to Congress with critical errors.
[NOTE: Acquisition Program Baseline (APB) values are pulled from the APB module
and cannot be edited within the SAR.] All Major Defense Acquisition Programs are
required to use DAMIR to prepare the annual and quarterly SARs. Hard copy SARs are
no longer submitted to Congress. Instead, Congress is granted access to the SAR
information through DAMIR. The only exception is when the SAR contains classified
information. In those few cases, a hard-copy classified annex is submitted.
The DAMIR system is the authoritative source for MARs and provides the data entry
capability and required workflow to create and edit a MAR. All MAIS programs are
required to use the DAMIR system to prepare the annual MARs. The DAMIR MAR
Module supports both Baselined and Unbaselined MAIS programs. The DAMIR system
provides extensive data checks, ensuring that a MAR is not released to Congress with
critical errors. Historical MARs (December 2008 through December 2010) will be stored in PDF
format in DAMIR Acquisition Documents .
For Baselined MARs, a MAR Original Estimate (OE) module is provided; the MAR OE
will be automatically pulled into the MAR by the DAMIR system. The MAR OE can be
initialized from the APB. Hard copy MARs are no longer submitted to Congress.
Instead, Congress is granted access to the MAR information through the DAMIR
system.
To improve information sharing and to reduce duplicate data entry, DAES information is
now obtained either via Web Services data exchange between the Components'
acquisition information systems and the DAMIR system or directly via the DAMIR
Create or Edit DAES Report module. Major Automated Information System programs
that are not Component-specific must enter all DAES information directly into the
DAMIR system. (The action in the data exchange between the DAMIR system and the
Component systems is referred to as a push; DAES data is pushed to the DAMIR
system via Web Services on a monthly/quarterly basis.) DAES information is required to
be submitted for all Acquisition Category (ACAT) I and IA programs using one of the
previously mentioned collection methods.
Acquisition Program Baseline (APB) values displayed in the DAES/Web Services view
are pulled directly from the APB module and cannot be updated via web services. In
addition to the Currently Approved APB Objectives and Thresholds, for reference only,
the DAMIR DAES submission will also show the Initial Phase Objectives and
Thresholds, if applicable.
Office of the Secretary of Defense (OSD) Assessments against the quarterly pushed
Program Managers' Assessments are created in the DAMIR DAES Review module.
Each organization has the ability to rate a program on any of the eleven indicators. OSD
Assessments are visible in the DAMIR system the month after the Program Managers'
assessments are submitted.
Two unclassified supplemental Microsoft PowerPoint briefing slides (Issues
Summary and Risk Summary) must be submitted with the quarterly DAES submission.
For those programs selected to be on the monthly DAES Meeting Agenda, an additional
eleven slides must also be submitted. The list of thirteen (total) required slides is:
    1.  Program Information
    2.  Overview
    3.  Issues/Help Needed
    4.  Schedule
    5.  Cost and Quantity
    6.  Quad Chart
    7.  Earned Value
    8.  Risk Summary
    9.  Interrelationships, dependencies, and Synchronization with Complementary
        Systems
    10. Sustainment
    11. Better Buying Power
    12. International Program Aspects
    13. O&M and O&S Crosswalk Chart
When received, these slides are loaded into the DAMIR Acquisition Document module
by the DAMIR administrative support staff. Access to DAES information is based on
approved permissions.
The DAMIR Ad hoc Reports module provides a capability for cross-program analysis.
Access to completed reports is permission based, but all users have access to SARs
and SAR Ad hoc reports.
Users may request a report, or a query, of the DAMIR system database by sending an
e-mail message to damir@osd.mil . Results from report requests will be added to the
long-standing Ad hoc report list; results from queries are a one-time data dump into an
Excel spreadsheet and will not be turned into an Ad hoc report unless specifically
requested.
The DAMIR Portfolio View module provides a cross-program analytical capability much
like that of the DAMIR Ad hoc Reports module with the addition of graphical
representations of the data. The DAMIR system software presents both dashboard and
detailed views of Selected Acquisition Report data or Defense Acquisition Executive
Summary data in the form of tables, charts, and graphs. The data presented in these
views is based on portfolios of identified programs. The DAMIR system supports several
standard portfolios that are accessible to all users. The standard portfolios allow the
user to view data for all programs or for only those programs related to a specific
Component. Users are also able to create personal portfolios that reference only
specific programs that they identify. Any of these portfolios may then be used to create
the portfolio views relevant to these programs. The DAMIR Portfolio View module also
allows users to customize their dashboard views so that they are presented with only
the charts and graphs most useful to their inquiry. The ability to see draft and
unofficial information is permission-based.
During the first phase in the annual budget cycle, the Office of the Director, Cost
Assessment and Program Evaluation (D, CAPE) and the Office of the Under Secretary
of Defense (Comptroller) (USD(C)) are responsible for conducting an annual Integrated
Program/Budget Review (P/BR) on all Department of Defense (DoD) resources and
require an annual Integrated Program/Budget data submission from all DoD
Components.
Data shall be submitted for all current MDAPs and MAIS programs, as well as
acquisition program concepts and Unbaselined MAIS programs that will achieve
Milestone B prior to the end of the calendar year, or have been certified under the
provisions of section 2366a of title 10 United States Code . The MDAP and MAIS
program data shall include all acquisition costs (RDT&E, Procurement, MILCON,
Acquisition O&M, Working Capital Funds, or Other Financing with an explanation) and
MDAP quantities (RDT&E and Procurement) for the full acquisition cycle of each MDAP
and each MAIS program (by fiscal year and funding appropriation). For MAIS programs,
the total life-cycle cost is the development cost plus ten years of Operation and Support
(O&S) costs following Full Deployment declaration. For MDAP programs, the full
acquisition lifecycle and associated funding is defined by the D, CAPE and USD(C)
annual Integrated Program/Budget Submission Guidance.
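The MAIS total life-cycle cost rule above (development cost plus ten years of O&S costs following Full Deployment declaration) is simple arithmetic. A notional example, with all figures invented:

```python
# Notional arithmetic (all figures invented) for the MAIS total life-cycle
# cost rule described above: development cost plus ten years of O&S costs
# following the Full Deployment declaration.

development_cost = 400.0  # $M, notional
annual_os_cost = 35.0     # $M per year after Full Deployment, notional

total_lifecycle_cost = development_cost + 10 * annual_os_cost
print(total_lifecycle_cost)  # 750.0 ($M)
```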
All MDAPs and MAIS programs shall submit annual P/BR data that has been
coordinated with and approved by the appropriate Component Acquisition Executive
(CAE) into their Component's acquisition information system. For efficient information
sharing, the CAE systems shall publish P/BR data to Acquisition Visibility (AV) using the
Defense Acquisition Management Information Retrieval (DAMIR) system Web Services.
Components without access to one of the Component acquisition information systems
shall use the DAMIR Create or Edit a Budget Report module. Notwithstanding the
method of transmission, exposure, or publication, the CAE-approved P/BR data shall be
available for consumption by AV and AV subscribers as determined by annual
guidance.
All MDAPs shall submit P/BR data at the sub-program level and all MAIS programs
shall submit at the increment level as appropriate, consistent with the Track-to-Budget
rules established for the data submission to the Program Resources Collection Process
(PRCP), per the program/budget transparency requirements of the Fiscal Year
Integrated Program/Budget Submission Guidance.
MDAPs and MAIS programs whose schedules have changed due to funding and
quantity changes in the P/BR submission shall report estimated program schedule
changes. A limited number of large MDAP and MAIS programs may be required to
provide P/BR revision data periodically during the Integrated P/BR. For programs so
designated, revisions driven by the MDAP Issue Review Team or by any other direction
shall cover the same data included with the original transmission and will be maintained
by the responsible Component.
Components shall also review their acquisition program budgets, and ensure RDT&E
Program Element funding is reflected in the RDT&E budget activity that aligns with the
program's acquisition phase as defined in DoD Instruction 5000.02.
A program, or a technology project that will result in a program, has special interest if it
has one or more of the following factors: technological complexity; Congressional
interest; a large commitment of resources; the program is critical to achievement of a
capability or set of capabilities; the program is part of a system of systems; or the
program is a joint program. Generally, the level of funding, desired oversight and
reporting will determine the Milestone Decision Authority and whether or not the
program is designated a "Special Interest" program.
Programs that already meet the dollar thresholds for a Major Defense Acquisition
Program (MDAP) may not be designated Special Interest programs.
If a program meets one of the MDAP dollar thresholds (per section 2430 of title 10,
United States Code ), then the program is automatically an MDAP. If the program is
below the dollar threshold for designation as an MDAP, the Defense Acquisition
Executive (DAE) may still choose to designate the program an MDAP if he or she
deems oversight with statutory reporting is needed. An MDAP is designated ACAT I,
and its oversight comes from the DAE. The DAE can either retain MDA or delegate it to a
Component Head or Component Acquisition Executive (CAE). If the DAE retains MDA,
the program is an ACAT ID program. If the DAE delegates MDA to the Component
Head or CAE, then the program is an ACAT IC program. As an MDAP, the program
must meet all statutory reporting requirements for MDAP programs.
If the DAE desires oversight of a program that falls below MDAP dollar thresholds, and
deems that statutory reporting associated with MDAPs is not needed, the program is
designated a Special Interest Program. If the DAE retains MDA, the program is an
ACAT ID Special Interest program. If the DAE delegates MDA to the Component Head
or CAE, then the program is an ACAT IC Special Interest program. The CAE may also
designate programs that are ACAT II or below as CAE Special Interest Programs.
For such Special Interest programs, the reporting requirements are tailored to meet the
specific oversight needs and must be captured in an Acquisition Decision
Memorandum.
Table 10.14.1.T1 MDAP & Special Interest Designations & Decision Authorities
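The designation rules above amount to a simple decision procedure. As an informal illustration only (the function and parameter names below are hypothetical and not part of DoD policy; DoD Instruction 5000.02 and the tables in this section remain the authoritative source), the MDAP and Special Interest logic can be sketched as:

```python
def acat_designation(meets_mdap_threshold, dae_designates, dae_retains_mda,
                     statutory_reporting_needed):
    """Illustrative sketch of the MDAP / Special Interest decision flow
    described above. Hypothetical names; not an official algorithm."""
    if meets_mdap_threshold or (dae_designates and statutory_reporting_needed):
        # The program is an MDAP: ACAT ID if the DAE retains MDA,
        # ACAT IC if MDA is delegated to the Component Head or CAE.
        return "ACAT ID" if dae_retains_mda else "ACAT IC"
    if dae_designates and not statutory_reporting_needed:
        # DAE oversight without MDAP statutory reporting: Special Interest.
        return ("ACAT ID Special Interest" if dae_retains_mda
                else "ACAT IC Special Interest")
    return "below ACAT I"
```

The MAIS logic in the next section follows the same pattern, with ACAT IAM/IAC in place of ACAT ID/IC and OSD retention of MDA in place of DAE retention.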
If an Automated Information System (AIS) program meets one of the dollar thresholds
for designation as a Major Automated Information System (MAIS), then the program is
automatically a MAIS program. If an AIS program falls below the MAIS dollar thresholds,
the Defense Acquisition Executive (DAE) may still designate the program a MAIS
program if he or she deems that oversight with statutory reporting is needed. A MAIS
program is designated ACAT IA and the Milestone Decision Authority (MDA) is the
Defense Acquisition Executive (DAE) or the person within OSD to whom the DAE
delegates MDA. If the MDA remains within OSD (with the DAE or delegated MDA within
OSD), the program is an ACAT IAM program. If MDA is delegated to the Component
Head or CAE, then the program is an ACAT IAC program. A MAIS program must meet
all statutory reporting requirements for MAIS programs.
If the DAE desires oversight of an AIS program, but deems that the statutory reporting
associated with MAIS programs is not needed, the program is designated a "Special
Interest" program. If MDA remains within OSD (DAE or DAE delegated MDA within
OSD), the program is an ACAT IAM Special Interest program. If MDA is delegated by
the DAE to the Component Head or CAE, then the program is an ACAT IAC Special
Interest program.
For such Special Interest programs, the reporting requirements are tailored to meet the
specific oversight needs and must be captured in an Acquisition Decision
Memorandum.
Table 10.14.2.T1 MAIS & Special Interest Designations & Decision Authorities
10.15.1. Affordability
Early in a program's life cycle, the emphasis is on defining and achieving affordability
targets. Past that point, the emphasis shifts to defining and achieving should-cost
estimates.
The Milestone Decision Authority (MDA) considers affordability at all major decision
points of an acquisition program. In part, this consideration ensures that sufficient
resources (funding and manpower) are programmed and budgeted to execute the
program acquisition strategy. The MDA also examines the realism of projected funding
over the programming period and beyond, given likely DoD Component resource
constraints.
Affordability Analysis is based upon the budgets we expect to have for the product over
its life cycle and provides a design constraint on the product we will build, procure, and
sustain. When the Department, i.e., the Milestone Decision Authority (MDA), establishes
the affordability requirement, it represents a metric that captures the product's expected
capability against its expected (affordable) life cycle cost. From this point on, any future
unit or sustainment cost increase above those levels, whatever the cause, must come
back to the MDA and to the user to determine what requirements can be dropped to
stay within the affordability requirement, or whether the program must be terminated. For
further discussion of affordability and affordability assessments, see 3.2 .
10.15.2. Should-Cost
Should-cost can be applied to anything that we do and to any source of costs, including
costs for services and internal government costs as well as contracted product costs.
Should-cost targets are often stretch goals we expect our leaders to do their best to
reach; we expect them to be based on real opportunities, but to be challenging to
execute. Unlike affordability requirements, we do not expect them to always be
achieved, but we do expect strong efforts to do so.
Should-cost and affordability can come into conflict early in programs, particularly
before Milestone B, when an affordability requirement may have been defined based on
expected budgets, but it is too early to define should-cost estimates for future
production or sustainment of products because we have not yet defined the design. This
is also the time when spending money on efforts to reduce future costs can have the
biggest payoff. As a result, during the early stages of product development, the priority
should be on establishing affordability constraints and working to provide the
enablers to achieve them in the ultimate design. In the early phases of programs,
should-cost can still be constructively used to control program overhead and
unproductive expenses and to generally reduce contracted development costs, but it
should not keep us from making sound investments in product affordability. Prior to the
pre-EMD Review or MS B, the ICE or Program Estimate for production and sustainment
has not been finalized and any should-cost estimates for future production lots and
sustainment would be premature. At that point, however, particularly if we are ready to
ask for bids and negotiate low rate initial production prices, we need a should-cost
estimate to inform negotiations. Once the requirements, design, and affordability goals
are established and an ICE or Program Estimate exists, then it is time to challenge the
assumptions embedded in those analyses, formulate should-cost estimates for
production and sustainment, and work to achieve those estimates.
On April 22, 2011, the Under Secretary of Defense (Acquisition, Technology and
Logistics) directed the Component Acquisition Executives to deliver an annual progress
report on their Should-Cost implementation.
The annual report must list all Major Defense Acquisition Programs (MDAPs) and Major
Automated Information System (MAIS) programs for which should-cost estimates have
been established. It should describe the challenges and successes in implementing
these initiatives for each program from the perspectives of the respective Program
Executive Officers and Program Managers, especially with regard to the programs the
Services selected as models for Should-Cost implementation. The report should also
describe the incentive plans the Service/Component has developed for Program
Managers to reinforce and reward commitment to the Will-Cost and Should-Cost
Management process.
Additionally, each Component should submit a Should-Cost datasheet for each MDAP
and MAIS program that has established a Should-Cost estimate. The datasheet
template is to be used as a guide and may be tailored to better present relevant
information.
Questions regarding the annual Should-Cost Report may be directed to the Deputy
Director, Resource Analysis in the Office of the Director, Acquisition Resources and
Analysis.
The basic purpose, common goals, and common deliverables for the Acquisition
Program Transition Workshop (APTW) process
are listed below.
Common Goals.
Common Deliverables.
Draft Request for Proposal (RFP). As a DoDI 5000.02-defined Milestone or a major
transition/restart is approached, information regarding APTWs should be included in the
RFP and Statement of Work. Acquisition Category (ACAT) ID and ACAT IAM Program
Managers should address APTWs in their Draft RFP briefings to possible respondents.
Pre-Contract Award. The period prior to source selection and contract award is a
particularly useful time for the government Program Manager to engage in APTW
government team training and/or process development for contract execution.
Post Contract Award. In the first few weeks following contract award, program
managers should coordinate with the industry program manager counterpart on actions
that will result in a joint APTW within five weeks following contract award.
DEFENSE ACQUISITION GUIDEBOOK
Chapter 11 -- Program Management Activities
11.0. Overview
11.7. Property
11.0. Overview
11.0.1. Purpose
11.0.2. Contents
11.0.1. Purpose
The purpose of this chapter is to describe and explain some of the activities and
decisions available to and required of the program manager as he or she manages and
executes the program.
11.0.2. Contents
    •    Joint Programs
    •    International Programs
    •    Integrated Program Management
    •    Earned Value Management
    •    Contract Funds Status Report
    •    Quality Management
    •    Reporting
    •    Knowledge-Based Acquisition
    •    Technical Representatives at Contractor Facilities
    •    Contractor Councils
    •    Government Property in the Possession of Contractors
    •    Modeling and Simulation (M&S) Support to the Entire Product
11.1.2.1. Designation
11.1.2.2. Execution
There are two aspects of "jointness" to consider when discussing joint program
management: the jointness of the capability and the jointness of the development and
production of the system.
As part of the Joint Capabilities Integration and Development System (JCIDS) , the Joint
Staff J-8, with the assistance of the DoD Components, evaluates all JCIDS documents,
regardless of Acquisition Category or previous delegation decisions, to determine
whether the proposal has joint force implications. The Joint Staff documents, CJCS
Instruction 3170.01 and the JCIDS Manual , provide full detail and direction on this
topic.
11.1.2.1. Designation
Considering the recommendation of the Joint Staff and the Heads of the DoD
Components, the Milestone Decision Authority decides whether to place the program
under joint acquisition management. The Milestone Decision Authority should make this
decision and, if appropriate, designate the Lead Executive DoD Component, as early as
possible in the acquisition process.
The DoD Components should periodically review their programs to determine the
potential for joint cooperation. The DoD Components should structure program
strategies to encourage and to provide an opportunity for multi-Component participation.
11.1.2.2. Execution
The designated Lead Executive DoD Component for a joint acquisition should act on
behalf of all DoD Components involved in the acquisition.
         program, one program change control program, one integrated test program, and
         one set of documentation and reports (specifically: one set of capabilities
         documents (with Service-unique capability requirements identified), one
         Information Support Plan , one Test and Evaluation Master Plan , one Acquisition
         Program Baseline , etc.).
    •    The Milestone Decision Authority should designate the lead Operational Test
         Agency to coordinate all operational test and evaluation. The lead Operational
         Test Agency should produce a single operational effectiveness and suitability
         report for the program.
    •    Documentation for decision points and periodic reporting should flow only
         through the Lead Executive DoD Component acquisition chain, supported by the
         participating components.
    •    The program should use inter-DoD Component logistics support to the maximum
         extent practicable, consistent with effective support to the operational forces and
         efficient use of DoD resources.
    •    Unless statute, the Milestone Decision Authority, or a memorandum of
         agreement signed by all DoD Components directs otherwise, the Lead Executive
         DoD Component should budget for and manage the common Research,
         Development, Test, and Evaluation funds for the assigned joint programs.
    •    Individual DoD Components should budget for their unique requirements.
DoD Directive 5000.01 requires International Armaments Cooperation;
requires interoperability with U.S. coalition partners; and establishes the preference for
a cooperative development program with one or more Allied nations over a new, joint, or
DoD Component-unique development program.
During the development of the Technology Development Strategy (TDS) for Milestone
A or the initial Acquisition Strategy for Milestone B for a new program, the potential for
international cooperative research, development, production, and logistic support should
be addressed, and thereafter, the potential for international cooperation should be
considered in every phase of the acquisition process. DoD Components should
periodically review their programs to determine the potential for international
cooperation. Milestone Decision Authorities may recommend forming international
cooperative programs based on the TDS or Acquisition Strategy considerations; DoD
Component Heads may also recommend forming international cooperative programs.
The Milestone Decision Authority should make the decision to establish an international
cooperative program as early as possible in the Defense Acquisition Management
System.
The Milestone Decision Authority, with the advice and counsel of the DoD Components
and the Joint Requirements Oversight Council, makes the decision to pursue an
international cooperative program. The decision process should consider the following:
The DoD Component remains responsible for preparation and approval of most
statutory, regulatory, and contracting reports and milestone requirements, as listed in
DoD Instruction 5000.02, Enclosure 4 . Documentation for decision reviews and periodic
reports flow through the DoD Component acquisition chain, supported by the
participating nation(s).
International cooperation can add stability to the program. DoD Instruction 5000.02
prevents DoD Components from terminating or substantially reducing participation in
international cooperative programs under signed international agreements without
Milestone Decision Authority notification, and in some cases, Milestone Decision
Authority approval.
Additional guidance is available in the International Cooperation in Acquisition,
Technology and Logistics Handbook.
Establishing and maintaining cooperative relationships with friends and Allies are critical
to achieving interoperability of equipment and services to be used by the U.S. Armed
Forces and our coalition partners; to achieving access to technology from sources
worldwide; to achieving economies of scale with our investment resources; and to
expanding our influence in critical areas of the world (USD(AT&L) Memorandum,
Support for International Armaments Cooperation Activities, January 23, 2006)
Figure 11.2.1.2.F1. Key International Cooperative Considerations During Acquisition.
Identification and formulation of cooperative opportunities should be pursued during
the earliest stages of the pre-systems acquisition research and development process to
maximize the chance for success. DoD Instruction 5000.02, Enclosure 3, paragraph 2,
identifies technology projects and initiatives.
Several important mechanisms available to provide insight into the needs of potential
foreign partners are exploratory discussions, international forums, studies, and the
exchanges of information and personnel:
Studies. It is normal for the DoD and potential partners to conduct studies before
entering into a cooperative acquisition project. These studies can be conducted years
before the project starts and are often called feasibility studies or pre-feasibility
studies. Industry, government agencies, or a combination of both generally conduct the
feasibility studies, with the objective of providing a technical appraisal of the feasibility of
developing and producing equipment. These studies can develop input for the Analysis
of Alternatives required by DoD before the start of a new acquisition program.
Pre-Systems Acquisition. Decisions made during the Materiel Solution Analysis and
Technology Development phases of Pre-Systems Acquisition generally define the
nature of the entire program. Once the program enters the Engineering and
Manufacturing Development phase, it is difficult to adopt major changes without
significant schedule or cost adjustments. Consequently, the decision to include
international partners needs to be addressed as early as possible, preferably during
development of the Initial Capabilities Document, but no later than during the Materiel
Solution Analysis phase.
International Involvement
         protection and exportability features.
These considerations are based on 10 U.S.C. 2350a requirements. They encourage the
consideration of alternative forms of international cooperation. Even if cooperative
development is impractical, cooperative production, foreign military sales, licensed
production, component/subcomponent co-development, or incorporation of subsystems
from allied or friendly foreign sources should be considered where appropriate.
The Defense Exportability Features (DEF) Pilot Program. DEF was established in
the fiscal year 2011 National Defense Authorization Act to develop and incorporate
technology protection features into a system or subsystem during its research and
development phase. By doing this, exportable versions of a system or subsystem could
be sold earlier in the Production and Deployment phase, thereby (1) enabling
capability to be available to allies and friendly countries more rapidly and (2) lowering
the unit cost of DoD procurements. Prior to the Engineering and Manufacturing
Development Phase, programs should investigate the necessity and feasibility (from
cost, engineering, and exportability perspectives) of the design and development of
differential capability and enhanced protection of exportable versions of the system or
subsystem.
Acquisition program candidates may be considered for the DEF pilot program via
nominations from the DoD Components. AT&L / International Cooperation (IC) is
available for consultation regarding potential DEF candidate nominations. After a
favorable preliminary assessment of exportability and differential capability / program
protection needs, AT&L / IC will approve DEF candidates. Specific differential capability
/ program protection requirements will be determined by DoD technology security,
foreign disclosure, and anti-tamper processes. With sufficient industry and government
support, a feasibility study will be conducted to determine the cost to implement the
differential features and the associated design specifications. If a DEF candidate is pre-
Milestone A, the feasibility study should be incorporated into the appropriate technology
development requests for proposal (RFPs) and contracts. Otherwise, the feasibility
study should be contracted through the prime contractor if funding is available. If
government and industry agree that the differential capability / protection determined by
the feasibility study should be implemented, and funding arrangements are agreed
upon, the required design specifications should be incorporated into the engineering
and manufacturing development RFP and/or contract, depending on when the feasibility
study was completed.
Engineering and Manufacturing Development. After program initiation, during
Engineering and Manufacturing Development, key elements of the system design are
defined, and system/subsystem development begins. Major changes at this point often
introduce schedule delays that program managers are unwilling to accept; however, there have
been numerous examples of successful subsystem cooperative development
partnerships that have been formed during the Engineering and Manufacturing
Development Phase. Once a program has reached this phase, absent cooperation in
earlier stages, there will be only limited opportunity to bring other nations on as full
cooperative development partners. Consequently, if the opportunity for cooperation in
subsystem development arises prior to or during Engineering and Manufacturing
Development, consult with the appropriate international programs organization to obtain
further assistance.
Production and Deployment Phase. There are three basic mechanisms for transfer of
U.S. produced defense articles and associated production capability to other nations:
sales, co-production, and cooperative production. Sales under the Foreign Military Sales
Program, and foreign co-production of a U.S.-developed system, fall under the purview of the
Defense Security Cooperation Agency (DSCA). The Department of State is responsible
for transfer of defense articles and associated production capability under export
licenses. Both DSCA and the Defense Technology Security Administration coordinate
closely with the responsible DoD Component regarding the development and
implementation of DoD co-production policy in their respective areas of responsibility.
USD(AT&L) is responsible for oversight of the third basic mechanism, cooperative
production. Cooperative production is a joint or concurrent international production
arrangement arising from a cooperative development project. Examples of this type of
production program are the Rolling Airframe Missile and the Multi-Functional
Information Distribution System . Cooperative production falls under the authority of the
Arms Export Control Act Section 2751 .
Operations & Support Phase. Cooperative logistics refers to cooperation between the
U.S. and allied or friendly nations or international organizations in the logistical support
of defense systems and equipment. Cooperative logistics is part of the acquisition
process, but as a substantial part of military operations, much of the implementation
process involves Security Assistance processes and procedures.
11.2.1.3. International Aspects of Program Protection
DoD Instruction 5000.02, Enclosure 10, paragraph 5, and the tables of Enclosure 4
establish international cooperative program protection policy requirements. Chapter
13.2 of this Guidebook provides additional insights into this policy.
11.2.1.3.1. Classification Guide
In addition to the Program Protection Plan required of all programs containing Critical
Program Information, and the Technology Assessment/Control Plan, DoDM 5200.01
requires international programs containing classified information of either party to
develop a classification guide. The classification guide, as prescribed
in DoD Directive 5230.11 , identifies the items or information to be protected in the
program, and indicates the specific classification to be assigned to each item.
A Program Security Instruction (PSI) details security arrangements for the program and
harmonizes the requirements of the participants' national laws and regulations. Using
the Under Secretary of Defense for Acquisition, Technology and Logistics streamlined
international agreement procedures authorized by DoD Instruction 5000.02, Enclosure
10, paragraph 5, the International Agreements Generator will lead the program manager
through the considerations for, and the development of, a PSI. Additional information
about the PSI is found in the International Cooperation in Acquisition, Technology and
Logistics Handbook, Chapter 7, Section 7.6.
Classification Guide (SCG), and other disclosure guidance.
    •    Provides early DoD Component planning for the program's proposed technology
         releases to foreign industry consistent with the National Disclosure Policy.
    •    Provides early planning for higher-level (i.e., above DoD Component-level)
         special technical reviews and approvals (e.g., Low Observable/Counter Low
         Observable, anti-tamper, cryptography) needed in support of proposed
         technology releases to foreign industry.
    •    Establishes a detailed export license approval planning process for U.S.-foreign
         industry cooperation to meet critical program and contract timelines.
The Technology Release Roadmap (TRR) includes three sections: 1) A timeline mapping key projected export licenses
against the program acquisition schedule; 2) A definition of the technologies involved in
each export license; and 3) A list of U.S. contractors (exporters) as well as foreign
contractors (end users) for each license.
11.2.2. Office of the Under Secretary of Defense for Acquisition, Technology and
Logistics (USD(AT&L))-Related International Agreement Procedures
11.2.2.2. Office of the Under Secretary of Defense for Acquisition, Technology and
Logistics (OUSD(AT&L)) Oversight
11.2.2. Office of the Under Secretary of Defense for Acquisition, Technology and
Logistics (USD(AT&L))-Related International Agreement Procedures
An International Agreement (IA) is any agreement concluded with one or more foreign
governments including their agencies, instrumentalities, or political subdivisions, or with
an international organization. The IA delineates respective responsibilities and is
binding under international law. IAs are required by U.S. law for all international
cooperative projects.
Per DoD Instruction 5000.02 , all AT&L-related international agreements may use the
USD(AT&L)-issued streamlined procedures found in this Guidebook and in the
International Cooperation in Acquisition, Technology and Logistics Handbook , rather
than following the lengthy documentation requirements mandated by DoD Directive
5530.3 , "International Agreements."
11.2.2.2. Office of the Under Secretary of Defense for Acquisition, Technology and
Logistics (OUSD(AT&L)) Oversight
              o  Request for Authority to Develop (RAD);
              o  Request for Final Approval (RFA);
              o  Summary Statement of Intent (SSOI);
              o  Arms Export Control Act Section 27 Project Certification format
                 requirements; and
              o DoD International Agreement Generator computer software.
    •    Approves the following agreement process actions:
              o RADs and RFAs for Memoranda of Understanding (MOU)/Memoranda of
                 Agreement (MOA);
              o Project Agreements and Arrangements;
              o Arms Export Control Act Section 65 Loan Agreements;
              o End-User Certificate (EUC) Waivers;
              o Foreign Military Sales or Direct Commercial Sales of Major Defense
                 Equipment with Letters of Request (LOR) Advisories and Requests for
                 Major Defense Equipment (MDE) Prior to Satisfactory Completion of
                 Operational Test and Evaluation (OT&E), formerly called Yockey Waivers;
                 and
              o DoD Component requests for DoD International Agreement Generator text
                 deviations or waivers requested in RAD and RFA submissions.
    •    Delegates PA negotiation authority under the Streamlining I Coordination
         (Approval) Process to specifically designated DoD Components.
    •    Certifies DoD Component international agreement processes to the Streamlining
         II standards prior to delegation of RAD/RFA authority to a DoD Component.
    •    Decertifies a DoD Component international agreement process in the event
         minimum quality standards are not maintained.
    •    Resolves RAD/RFA coordination process disputes.
    •    Oversees the DEF pilot program to include technology protection features during
         research and development of defense systems under 10 USC 2358.
    •    Supports satisfaction of the following statutory requirements:
              o Obtains USD(AT&L) determination under 10 U.S.C. 2350a paragraph (b)
                 for all international agreements that rely upon this statute as their legal
                 authority;
              o Notifies Congress of all Arms Export Control Act Section 27 (see 22
                 U.S.C. Section 2767 , "Authority of President to enter into cooperative
                 projects with friendly foreign countries") international agreements a
                 minimum of 30 calendar days prior to authorizing agreement signature;
                 and
              o Conducts interagency coordination with the Department of State,
                 Department of Commerce, and the Department of the Treasury (see 22
                 U.S.C. 2767 and DoD Directive 5530.3 ).
11.2.2.3.1. International Agreement Streamlining I Process
Office of the Under Secretary of Defense for Acquisition, Technology and Logistics
(USD(AT&L))/International Cooperation (IC) uses the following Streamlining I process
unless it has delegated coordination authority to the DoD Component:
Office of the Under Secretary of Defense for Acquisition, Technology and Logistics
(USD(AT&L))/International Cooperation (IC) may delegate approval authority for the
Request for Authority to Develop and Negotiate/Request for Final Approval (RAD/RFA)
for all international agreements associated with programs with a total program value of
less than $25M (in FY01 constant dollars) and for Acquisition Category II and
Acquisition Category III programs to the DoD Component Acquisition Executive. The
DoD Component Acquisition Executive may subsequently re-delegate RAD/RFA
authority for programs with a total program value of less than $10M (in FY01 constant
dollars) and Acquisition Category III programs to the Head of the DoD Component's
international programs organization. The following procedures will apply:
    •    The DoD Components will obtain the concurrence of their legal, financial
         management, and foreign disclosure organizations prior to approving
         RADs/RFAs.
    •    The DoD Components will forward coordination disputes to OUSD(AT&L)/IC for
         resolution.
    •    The DoD Components will send Notices of Intent to Negotiate (NINs) or Notices
         of Intent to Conclude (NICs) to OUSD(AT&L)/IC for all approved RADs and
         RFAs. NINs will include the DoD Component's approval document and program
         Summary Statement of Intent. NICs will also include the final international
         agreement text to be signed, plus an Arms Export Control Act Section 27 Project
         Certification, if required. The DoD Components will not sign international
         agreements until a 15-working-day period (for PAs and Loans) or 21-working-day
         period (for Memoranda of Understanding) after AT&L/IC receipt of the NIC has
         elapsed and any required 10 U.S.C. 2350a approval or AECA Section 27
         Congressional notification process has been completed.
    •    OUSD(AT&L)/IC may, at its discretion, decide to waive these rules on a case-by-
         case basis and require that certain agreements receive specific OUSD(AT&L)/IC
         approval before conclusion.
    •    OUSD(AT&L)/IC will use NINs, NICs and other relevant information to verify DoD
         Component international agreement process quality.
    •    Generally, within 9 months of receipt of RAD authority, DoD Component
         personnel will negotiate the international agreement in accordance with the
         provisions of the most recent version of DoD International Agreement Generator.
The Office of the Under Secretary of Defense for Acquisition, Technology and
Logistics/International Cooperation coordinates all international agreements (including
Memoranda of Understanding, Project Arrangements, other similar agreements) and
Information Exchange Program annexes (See IC in AT&L Handbook, Chapter 13.)
relating to NCB warfare technologies (including defenses against such technologies)
with the Assistant to the Secretary of Defense ( Nuclear and Chemical and Biological
Defense Programs) prior to approving the agreement. DoD policy requires this
coordination for NCB-related RADs for project arrangements under Streamlining I
authority, and for NINs and NICs under Streamlining II authority.
11.2.3. Acquisition and Cross-Servicing Agreements (ACSAs)
ACSAs are bilateral international agreements that allow for the provision of cooperative
logistics support under the authority granted in 10 U.S.C. Sections 2341-2350 . They
are governed by DoD Directive 2010.9 , "Acquisition and Cross-Servicing Agreements"
and implemented by CJCS Instruction 2120.01B , "Acquisition and Cross-Servicing
Agreements." ACSAs are intended to provide an alternative acquisition option for
logistics support in support of exercises or exigencies.
Title 10 of the United States Code provides two legal authorities for foreign logistic
support, supplies, and services: an Acquisition-only Authority, and a Cross-Servicing
Authority, which includes an acquisition authority and a transfer authority.
         U.S. armed forces or permits other military operations by the U.S. armed forces
         in such country.
ACSA is for the transfer of logistics, support, supplies, and services only. Per Section
4.5 of DoD Directive 2010.9 , items that may not be acquired or transferred under ACSA
authority include weapons systems; the initial quantities of replacement and spare parts
for major end items of equipment covered by tables of organization and equipment,
tables of allowances and distribution, or equivalent documents; and major end items of
equipment. Specific items that may not be acquired or transferred under ACSA authority
include guided missiles; naval mines and torpedoes; nuclear ammunition and included
items such as warheads, warhead sections, projectiles, and demolition munitions;
guidance kits for bombs or other ammunition; and chemical ammunition (other than riot
control agents). General purpose vehicles and other items of non-lethal military
equipment not designated as Significant Military Equipment on the United States
Munitions List promulgated pursuant to 22 U.S.C. 2778 , may be leased or loaned for
temporary use. Specific questions on the applicability of certain items should be referred
to the Combatant Command's legal office for review and approval.
In addition to the use of cash and subject to the agreement of the parties, ACSA
obligations may be reconciled by either Replacement-in-Kind or Equal Value Exchange.
ACSA obligations not repaid by Replacement-in-Kind or Equal Value Exchange
automatically convert to cash obligations after one year.
Replacement in Kind (RIK) . RIK allows the party receiving supplies or services under
the ACSA to reconcile their obligation via the provision of supplies and services of an
identical or substantially identical nature to the ones received. As an example, a country
may provide extra water to the United States during a training exercise with the proviso
that the United States will provide the same amount of water during a future exercise.
Equal Value Exchange (EVE) . EVE enables the party receiving supplies or services
under the ACSA to reconcile their obligation via the provision of supplies or services
that are considered by both parties to be of an equal value to those received. As an
example, a country may provide extra water to the United States during a training
exercise in exchange for the United States providing extra ammunition.
DoD Directive 2010.9 and CJCS Instruction 2120.01B provide management guidance
on initiating ACSA orders, receiving support, reconciling bills, and maintaining records.
As this is a Combatant Command-managed program, organizations interested in
acquiring logistics, support, supplies and services should work through the applicable
logistics branch to receive further guidance on this topic.
International cooperation offers the opportunity to achieve cost savings from the earliest
phases of Pre-Systems Acquisition throughout the life cycle, while enhancing
interoperability with coalition partners. All DoD acquisition personnel, in consultation
with the appropriate international programs organizations, should strive to identify and
pursue international cooperative programs in accordance with DoD 5000 policy .
Specific topics are found in the International Cooperation in Acquisition, Technology and
Logistics Handbook at the OSD/International Cooperation website .
The program manager should obtain integrated cost and schedule performance data at
an appropriate level of summarization to monitor program execution. The program
manager should require contractors and government activities to use internal
management control systems that accomplish the following:
Unless waived by the Milestone Decision Authority, the program manager should
require that the management control systems used to plan and control contract
performance comply with American National Standards Institute/Electronic Industries
Alliance Standard 748, Earned Value Management Systems (ANSI/EIA-748) (see DoD
Instruction 5000.02), in accordance with paragraph 11.3.1.1. The program manager
should not impose a specific system or method of management control or require a
contractor to change its system, provided it complies with ANSI/EIA-748.
11.3.1. Earned Value Management (EVM)
The requirement for EVM applies to cost or incentive contracts, subcontracts, intra-
government work agreements, and other agreements that meet the dollar thresholds
prescribed in DoD Instruction 5000.02 and DFARS Subpart 234.2 . The application
thresholds (total contract value including planned options in then-year dollars) are
summarized below:
    •    $20 million but less than $50 million: EVM implementation compliant with
         ANSI/EIA-748 is required. No formal Earned Value Management System (EVMS)
         validation is required.
    •    $50 million or greater: EVM implementation compliant with the guidelines in
         ANSI/EIA-748 is required. An EVMS that has been formally validated and
         accepted by the Defense Contract Management Agency (DCMA) (per paragraph
         11.3.1.5), in coordination with the cognizant contracting officer, is required.
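The threshold rule above can be expressed as a simple decision. The sketch below is illustrative only; the function name and return strings are hypothetical, and DoD Instruction 5000.02 and DFARS Subpart 234.2 remain the authoritative sources for EVM applicability.

```python
def evm_requirement(total_contract_value_m: float) -> str:
    """Illustrative sketch of the EVM application thresholds.

    total_contract_value_m: total contract value, including planned
    options, in millions of then-year dollars. This is a hypothetical
    helper, not official guidance.
    """
    if total_contract_value_m >= 50:
        # ANSI/EIA-748-compliant EVM plus a formally validated EVMS
        return "EVM required; formally validated EVMS required"
    if total_contract_value_m >= 20:
        # ANSI/EIA-748-compliant EVM, but no formal EVMS validation
        return "EVM required; no formal EVMS validation required"
    return "EVM not required by threshold"
```

For example, a $30 million (then-year dollars) cost-type contract would require ANSI/EIA-748-compliant EVM but no formal EVMS validation.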
The program manager will implement EVM on applicable contracts within acquisition,
upgrade, modification, or materiel maintenance programs, including highly sensitive
classified programs, major construction programs, and automated information systems.
EVM should also be implemented on applicable contracts wherein the following
circumstances exist: (1) the prime contractor or one or more subcontractors is a non-
U.S. source; (2) contract work is to be performed in government facilities, or (3) the
contract is awarded to a specialized organization such as the Defense Advanced
Research Projects Agency (DARPA) . In addition, EVM should be implemented on
applicable contracts designated as major capital acquisitions in accordance with Office
of Management and Budget Circular A-11, Part 7 , and the Capital Programming Guide
.
If a contract type is mixed, the EVM policy should be applied separately to the different
parts (contract types).
For Indefinite Delivery/Indefinite Quantity (ID/IQ) or task order types of contracts, the
application of EVM based on dollar threshold is assessed at the computed total contract
value and not by each separate order. To determine EVM applicability, anticipated cost
or incentive orders should be summed to reach the computed total contract value. FFP
orders are generally not included in that summation.
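The computed-total-contract-value rule for ID/IQ vehicles can be sketched as follows. The order data here are hypothetical; only the summation rule (sum anticipated cost or incentive orders, generally excluding FFP orders) comes from the guidance above.

```python
# Hypothetical order records for one ID/IQ vehicle:
# (order value in $M then-year, contract type of the order)
orders = [
    (12.0, "cost"),       # cost-reimbursement order: included
    (6.0, "incentive"),   # incentive order: included
    (9.0, "ffp"),         # firm-fixed-price order: generally excluded
]

# Sum anticipated cost or incentive orders to reach the computed
# total contract value used to assess EVM applicability.
computed_total_m = sum(v for v, t in orders if t in ("cost", "incentive"))
print(computed_total_m)  # 18.0
```

In this hypothetical case the computed total of $18 million falls below the $20 million EVM threshold even though the sum of all orders is $27 million.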
The DoD program manager should use Defense Federal Acquisition Regulation
Supplement (DFARS) clauses 252.234-7001 and 252.234-7002 to place the Earned
Value Management System (EVMS) requirement in solicitations and contracts.
The contract should not, either at the time of award or in subsequent modifications,
specify requirements in special provisions and/or statements of work that are not
consistent with the EVM policy and EVMS guidelines (required by imposition of DFARS
252.234-7002), or which may conflict with offerors' or contractors' approved EVM
system descriptions. Consult DCMA for guidance on compliance of the contractor's
EVMS.
IBRs should be scheduled as early as practicable and the timing of the IBRs should
take into consideration the contract period of performance. The process will be
conducted not later than 180 calendar days (6 months) after a significant program event
or contract change including, but not limited to: (1) contract award, (2) the exercise of
large contract options, and (3) the incorporation of major modifications. IBRs are also
performed at the discretion of the program manager at any time, even without the
occurrence of a major event in the life of a program.
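The 180-calendar-day timing rule can be checked with simple date arithmetic. The award date below is hypothetical; only the 180-day window comes from the guidance above.

```python
from datetime import date, timedelta

# An IBR is conducted not later than 180 calendar days after a
# significant program event such as contract award.
award_date = date(2013, 9, 16)  # hypothetical contract award date
ibr_deadline = award_date + timedelta(days=180)
print(ibr_deadline)  # 2014-03-15
```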
Events that may trigger an IBR include completion of the preliminary design review,
completion of the critical design review, a significant shift in the content and/or time
phasing of the PMB, or when a major milestone such as the start of the production
option of a development contract is reached. Continuous assessment of the PMB will
help identify when a new IBR should be conducted. The clause at DFARS 252.234-7002
and DoD Instruction 5000.02 require IBRs on all contracts that require the
implementation of Earned Value Management. The IBR is not dependent on the
contractor's Earned Value Management System being formally validated as complying
with the guidelines in ANSI/EIA-748. Subcontracts, intra-government work agreements,
and other agreements also require IBRs as applicable. The scope of the IBRs should be
tailored to the nature of the work effort.
The policy allows for the use of IBRs prior to contract award in situations where they
may be appropriate and beneficial. If a program manager elects to conduct a pre-award
IBR on a DoD contract, that requirement should be included in the statement of work.
See the NDIA Guide to the Integrated Baseline Review Process (April 2003 version) for
additional guidance on IBRs.
The Integrated Program Management Report (IPMR) applies to all contracts that meet
the Earned Value Management (EVM) applicability requirements in DoD Instruction
5000.02. The IPMR combines the Contract Performance Report (CPR) (DI-MGMT-81466)
and the Integrated Master Schedule (IMS) (DI-MGMT-81650) into a single Data Item
Description (DID), DI-MGMT-81861. This new DID was
effective as of July 1, 2012. However, for those existing contracts with separate
Contract Data Requirements Lists (CDRLs) for the CPR and the IMS, those two DIDs
and their content are still contractually applicable. On contracts valued at or greater than
$20 million but less than $50 million, it is recommended that IPMR reporting be
appropriately tailored. Refer to the IPMR DID Implementation Guide for tailoring
guidance. See PARCA EVM Website for the latest version of the guide.
A common, product-oriented Work Breakdown Structure (WBS) that follows the DoD
Work Breakdown Structure Standard ( MIL-STD-881C ) (current version at time of
award) is required for the IPMR and the Contractor Cost Data Report (CCDR). Except
for high-cost or high-risk elements, the required level of reporting detail should not
normally exceed level three of the contract WBS.
The IPMR for all Acquisition Category (ACAT) I programs must be submitted directly to
the EVM Central Repository (CR) by the reporting contractors. The EVM CR, which is
managed by the PARCA Deputy Director for EVM, is the sole addressee on the
Contract Data Requirements Lists for these reports. See the EVM CR Manual for
additional guidance on the CR requirements.
Extensible Markup Language (XML) schemas located in the EVM CR.
The IPMR provides performance data which is used to identify problems early in the
contract and forecast future contract performance. The IPMR should be the primary
means of documenting the ongoing communication between the contractor and the
program manager to report to-date cost and schedule metric trends and to permit
assessment of their effect on future performance.
The program manager obtains an IPMR on all cost or incentive contracts, subcontracts,
intra-government work agreements, and other agreements valued at or greater than $20
million. The IPMR is not typically required for cost or incentive contracts valued at less
than $20 million, contracts less than 12 months in duration, or Firm-Fixed Price
contracts for production efforts.
Data Item Description (DID) DI-MGMT-81861 (current version at time of award; URL:
https://assist.dla.mil/quicksearch/basic_profile.cfm?ident_number=278901 ) is used to
obtain the IPMR. The contracting officer and contractor should negotiate reporting
provisions in the contract, including frequency and selection of formats, level of detail,
submission dates, variance thresholds and analysis, and the Work Breakdown Structure
to be used. The program manager should tailor the IPMR, via the contractual CDRL, to
the minimum data necessary for effective management control on contracts valued at
less than $50 million. In exceptional cases, the contractor may determine that the
performance measurement baseline (PMB) or existing contract schedule cannot be
achieved and no longer represents a reasonable basis for management control. With
government approval, the contractor may implement an Over Target Baseline (OTB) or
Over Target Schedule (OTS). For cost-reimbursement contracts, the contract budget
base excludes changes for cost growth increases, other than for authorized changes to
the contract scope. The OTB/OTS creates additional budget to complete in-scope work,
but it does not increase the negotiated contract cost.
The Defense Contract Management Agency (DCMA) has responsibility for EVMS
compliance, validation, and surveillance for the Department of Defense, except for
those DoD Components that are also part of the Intelligence Community (IC) and are
excluded from the requirement to delegate EVMS authorities to DCMA.
Validation is achieved through a formal review of the processes defined and used by
the contractor to manage major acquisitions; the review assesses the capability of the
contractor's proposed system to comply with the EVMS guidelines in ANSI/EIA-748. It
determines that the contractor is using the system as one of its primary program
management processes; that the contractor has properly implemented the system on
the contract; and that the contractor is using the data from its system in reports to the
government. See the DCMA EVMS Compliance Review Instruction for additional
guidance on EVMS compliance and validation.
Surveillance is required for all contract efforts that require the implementation of an
EVMS, regardless of whether a formal system validation is required. For the life of the
contract, surveillance will be conducted on a recurring basis and should evaluate both
the continuing capability of the contractor's EVMS and the validity of the internal and
external performance information generated by the system. The results of surveillance
efforts should be documented and identified deficiencies should be monitored and
corrected. The responsibility and requirement for government surveillance of contracts
should be based on the effectiveness of the contractor's implementation of internal
management controls. See the Defense Contract Management Agency (DCMA)'s
surveillance process for additional guidance on surveillance activity where applicable.
The Navy Supervisors of Shipbuilding have the authority to conduct EVMS surveillance
activities, issue Advance Agreements, and approve EVM processes, and the responsibility
to coordinate with DCMA for the contracts under their cognizance. EVM system
validation reviews and reviews for cause are the responsibility of DCMA in coordination
with the contracting officer.
The CFSR described in this section applies to many defense contracts. It helps to
ensure effective program management and supplies funding data about defense
contracts to program managers for:
The program manager will obtain a CFSR ( DD Form 1586 ) on contracts over 6 months
in duration. The CFSR has no specific application thresholds; however, the program
manager should carefully evaluate application to contracts valued at less than $1.5
million (in then-year dollars).
DID DI-MGMT-81468 (current version at time of award) is used to obtain the CFSR. The
contracting officer and contractor should negotiate reporting provisions in the contract,
including level of detail and reporting frequency. The program manager should require
only the minimum data necessary for effective management control. The CFSR should
not be applied to Firm-Fixed Price contracts unless unusual circumstances dictate
specific funding visibility.
The CFSR for all Acquisition Category I programs is submitted directly to the Earned
Value Management Central Repository (CR) by the reporting contractors. The CR will
be the sole addressee on the CDRL for this report. See the EVM CR Manual for
additional guidance on the CR requirements.
The use of a standard electronic data exchange format is required for all reports unless
disclosure of this information would compromise national security. All data will be in a
readable digital format (e.g., PDF files are not acceptable). The Extensible Markup
Language standard (Project Schedule Cost Performance Management message) is the
preferred format. The American National Standards Institute X12 standard (839
transaction set) is also acceptable. On-line access to the data may be provided to
augment formal submission.
Effective quality management activities are important for reducing process-related risks
to programs. Such risks include:
    •    Undetected product defects resulting from unidentified verification technologies
         or failure to implement existing ones.
If not managed and mitigated, these risks may start a chain of events leading to
undesirable outcomes such as:
The later these risks are identified, the greater the cost of corrective action and the
greater the delays in schedule. Early identification, management, and mitigation of
important process-based risks to a program lead to less expensive and less disruptive
corrective actions that break the chain of undesirable outcomes.
While the DoD program manager should encourage and support the contractor's efforts
to assure quality, ultimately, the prime contractor is responsible. Therefore, from a DoD
perspective, a key program success factor is selecting contractors that can demonstrate
effective quality management. This subject is discussed in section 11.3.3.1 .
The contract should provide incentive to the contractor to deliver products or services
that provide value beyond the basic requirement. Without additional incentives, the
systems engineering process will normally lead to decisions that satisfy requirements at
the lowest cost. It may, however, be possible to incentivize the contractor to (1) exceed a
basic requirement such as mean time between failures or (2) generate a higher level for
an important derived requirement (e.g., one that affects operational flexibility,
maintainability, supportability, etc.). Section 11.3.3.2 discusses this topic.
Applying best practices as described in Sections 11.3.3.1 and 11.3.3.2 may not be
sufficient to manage and mitigate the process-based risks listed above. Section 11.3.3.3
discusses how encouraging a quality focus can also contribute.
11.3.3.1. Differentiating Among Offerors on the Basis of Quality
A contractor's quality management system is used to direct and control the organization
with regard to quality. Quality management is an enterprise level process, driven by
senior leadership involvement, to support the delivery of high quality products and
services by ensuring that all aspects of quality are considered and acted upon by every
element of the organization. The fundamental goal is to provide objective insight to
assure that: customer requirements are thoroughly analyzed and understood;
processes are defined and capable; and the resulting product meets the customer's
needs. It interacts with systems engineering technical processes and technical
management processes by focusing on both the quality of the system and the quality of
the processes being used to create the system. Quality management provides objective
insight into processes and work products for all stakeholders including program team
members, management, suppliers, customers, and users involved with the
development, manufacture, operation, and support of a system.
The quality management process begins early in the life cycle and continues
throughout. The principal elements of the quality management process include:
While the quality management focus is on the key aspects of the product realization
process (e.g., requirements, design, make/buy decisions, supplier management,
production), it also encompasses supporting processes such as contracting and
training. Both value-added activities and continuous process improvement should be
stressed and encouraged.
Further information about quality management may be found in ISO 10005 Quality
Management - Guidelines for Quality Plans (available for purchase), AQAP-2000 NATO
Policy on an Integrated Systems Approach to Quality through the Life Cycle, AQAP-
2009 NATO Guidance on the Use of the AQAP 2000 Series, and at Process and
Product Quality Assurance in the CMMI for Development (CMMI-DEV) v1.2 or the
CMMI for Acquisition (CMMI-ACQ) v1.2.
Program managers should allow contractors to define and use their preferred quality
management system as long as it meets the needs of the program. International quality
standard ISO 9001:2008, Quality Management Systems - Requirements, AQAP-2110,
NATO Quality Assurance Requirements for Design, Development and Production, and
AS 9100C:2009, Aviation, Space and Defense Quality Management System
Standard, define process-based quality management systems and are acceptable for
use on contracts per FAR 46.202-4, Higher-Level Contract Quality Requirements .
AQAP-2110 and AS 9100 contain additional requirements beyond ISO 9001. AS 9100
is applicable to most complex DoD systems. The AQAP 2000 series should be
considered for complex DoD systems when the supply chain or the end products have
NATO or international implications. Program managers should consider the use of
additional requirements (such as those contained in the Missile Defense Agency
Assurance Provisions) beyond ISO 9001 as appropriate.
Other sector-specific quality management systems acceptable under FAR 46.202-4
include:
An effective quality management system, driven by senior leadership involvement, will
enable a capability for delivering high quality products or services. Key elements
include:
    •    Effective policies and procedures that encourage the use of the system;
    •    Organizations with defined authorities and responsibilities;
    •    Objectives to drive people, processes, and the system;
    •    Method to analyze and resolve quality problems;
    •    Metrics that reflect desired outcomes;
    •    Interacting processes to transform inputs into outputs; and
    •    Records as evidence of what happened.
Furthermore, to the extent that they are available, metrics that show the effectiveness of
the contractor's quality management system and processes over time should also be
used to differentiate among offerors.
The following subsections describe several broad areas that have had a significant
impact on quality. Topics include Customer Satisfaction , Supply Chain Quality
Management , Top Management Involvement , and Continual Improvement of
Performance . They provide additional guidance on items the program office and the
contracting office should ask for in Requests for Proposals and evaluators should look
for in proposals to make a better assessment of a contractor's quality. These items may
be used to differentiate among offerors. Depending on the specific situation, there may
also be other areas (e.g., competent personnel for special processes) where information
should be sought.
Too often in the past, the Department of Defense relied heavily upon detailed technical
and management proposals and contractor experience to compare the relative
strengths and weaknesses of offers. This practice often allowed offerors that could
write outstanding proposals, but had less than stellar performance, to "win" contracts
even when other competing offerors had significantly better performance records and,
therefore, represented a higher probability of meeting the requirements of the contract.
Emphasizing past performance in source selection can help ensure that the winning
teams (prime contractors and major subcontractors) are likely to meet performance
expectations. When evaluating past performance data, consideration should be given to
the relevancy, complexity, and ultimate mission success of the contract.
Beyond the Department's past performance information, a Request for Proposals may
ask for further evidence of customer satisfaction, such as data tabulated from customer
surveys or from complaints and, equally important, how changes were made in
response to the results.
It is important for DoD program managers to inform their prime contractors of their
interest in quality throughout the supply chain. Therefore, through requests for
proposals and corresponding proposal evaluation factors, the program office and the
contracting office should request and evaluate evidence of effective supply chain
management. The evidence should reflect the following characteristics:
    •    The use of supplier development programs focused on continuous improvement;
    •    Strategic partnerships with suppliers, over the product life cycle, that are based
         on a clear understanding of the partners' and customers' needs and
         expectations in order to improve the joint value proposition of all stakeholders;
    •    Processes that effectively and efficiently monitor, evaluate, verify, and improve
         the suppliers' ability to provide the required products with a focus on defect
         prevention rather than defect detection;
    •    Right of access for both the prime contractor and the Government to supplier
         facilities and documentation where applicable; and
    •    Requirements for the supplier to flow down analogous quality management
         system provisions to its subcontractors.
Because quality deficiencies often occur in the lower tiers, prime contractors, in addition
to having approved vendor (i.e., subcontractor) lists, should ask their subcontractors
about planned suppliers. These subcontractors should flow the same requirement down
to their suppliers, and so on. For critical and complex commercial-off-the-shelf (COTS)
products, the prime and its subcontractors should use their own internal processes and
controls to ensure that the COTS product meets its critical attributes.
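The flow-down expectation described above is essentially recursive: each tier should levy the same quality management provision on the tier below it. As a purely illustrative sketch (the data structure and field names are invented for this example, not drawn from the Guidebook), a supply chain could be audited for missing flow-down as follows:

```python
# Hypothetical sketch of auditing quality-requirement flow-down through
# supplier tiers. Each supplier record notes whether the quality management
# system (QMS) clause was flowed down to it; the structure and field names
# are invented for illustration.

def missing_flowdown(supplier):
    """supplier: {"name": str, "has_qms_clause": bool, "subs": [...]}.
    Return the names of all suppliers, at any tier, lacking the clause."""
    gaps = [] if supplier["has_qms_clause"] else [supplier["name"]]
    for sub in supplier.get("subs", []):
        gaps.extend(missing_flowdown(sub))
    return gaps

chain = {"name": "prime", "has_qms_clause": True, "subs": [
    {"name": "tier1-A", "has_qms_clause": True, "subs": [
        {"name": "tier2-X", "has_qms_clause": False, "subs": []}]},
    {"name": "tier1-B", "has_qms_clause": False, "subs": []}]}

gaps = missing_flowdown(chain)  # ["tier2-X", "tier1-B"]
```

A walk of this kind mirrors the guidance that lower tiers, where deficiencies often occur, must not fall outside the flow-down.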
Quality will permeate all levels of a company only if top management provides the
leadership necessary to drive and reinforce that behavior. Requests for Proposals
should also ask for evidence of top management support for quality. The following list
identifies important factors in evaluating the effectiveness of top management support:
An offeror with effective quality management will seek continual improvement of its
processes and product designs, and thereby of its products, by improving its overall
performance, efficiency, and effectiveness. Such behavior increases the likelihood of
greater customer satisfaction and enhances the organization's competitive posture.
More specifically, all processes have defined inputs and outputs as well as the required
activities, actions, and resources. Therefore, process improvement encompasses both:
Such process improvement invariably leads to (work and end) product improvement and
consequently increased customer satisfaction.
    •    Customer satisfaction;
    •    Planning stability;
    •    Good financial performance; and
    •    Improved cash flow.
Listed below are examples of contract incentives that can be made available to the
prime contractor and that the prime contractor can, in turn, make available to
subcontractors under the appropriate conditions:
    •    Increased fee;
    •    Extended contract length;
    •    Follow-on contracts awarded;
    •    Accelerated progress payments;
    •    Shared savings; and
    •    Opportunities for return on investments (some of which may increase the
         contractor's competitiveness on other contracts).
The following are some potential ways to use these contract incentives to improve
quality, and at the same time, improve other product characteristics that are of value to
DoD. Their applicability depends on the specific situation.
    •    Warranties. The program manager could treat the warranty as a fixed-price
         option per item. If there are no failures, the contractor keeps the money that DoD
         paid for the warranty. To reduce the price of the warranty, the program manager
         could consider an arrangement where DoD pays to repair the first failure and the
         contractor warrants the next "n" failures. Typically, the warranty should exclude
         combat damage, abuse, misuse, and other factors outside the contractor's control.
    •    Award Fee for Product Support Contracts. The program manager could make the
         fee a function of operational availability.
    •    Award Fee for Product Development Contracts. The program manager could
         make the fee a function of successful operational test and evaluation.
    •    Progress Payments. The program manager could make payments contingent on
         successful corrective actions taken to alleviate quality deficiencies. The program
         manager could also establish an agreement with the contractor to repay the fee
         with interest if future measurements do not meet the conditions necessary for the
         entire amount of the fee to be awarded.
    •    Share of Savings. The contract could encourage the contractor to invest in
         facilities, non-recurring engineering, technology insertion, etc. that will result in
         improved performance and reduced costs. The program manager could then use
         the value engineering clause to repay the investment and give the contractor a
         share in the savings generated.
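The award fee mechanisms above lend themselves to simple worked examples. The sketch below is illustrative only and not Guidebook guidance: the fee pool, the availability thresholds, and the linear scaling are hypothetical assumptions chosen to show how a fee could be tied to demonstrated operational availability (Ao = uptime / (uptime + downtime)).

```python
# Hypothetical sketch of an award fee tied to operational availability (Ao).
# The floor/target thresholds, fee pool, and linear banding are illustrative
# assumptions, not DoD policy.

def operational_availability(uptime_hours: float, downtime_hours: float) -> float:
    """Ao = uptime / (uptime + downtime)."""
    return uptime_hours / (uptime_hours + downtime_hours)

def award_fee(ao: float, fee_pool: float,
              floor_ao: float = 0.85, target_ao: float = 0.95) -> float:
    """Earn no fee below the floor Ao, the full pool at or above the target,
    and a linearly scaled share in between."""
    if ao < floor_ao:
        return 0.0
    if ao >= target_ao:
        return fee_pool
    return fee_pool * (ao - floor_ao) / (target_ao - floor_ao)

ao = operational_availability(uptime_hours=900, downtime_hours=100)  # 0.9
fee = award_fee(ao, fee_pool=1_000_000)  # about half the pool
```

A share-of-savings incentive could be expressed the same way, scaling the contractor's share against validated cost reductions.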
In building such relationships, the program manager should avoid actions that
encourage risky behavior by the contractor. For example, by insisting on reducing cost,
accelerating the schedule, improving performance beyond demonstrated technology
limits, etc., the program manager may force the contractor to forgo quality-related
processes. This may not only defeat the purpose of contractual incentives but also
negate the other quality activities discussed in this section.
11.3.3.3. Encouraging a Quality Focus
Applying best practices as described in sections 11.3.3.1 and 11.3.3.2 may not be
sufficient to manage and mitigate process-based risks that may start a chain of events
leading to undesirable outcomes. DoD should also stress the importance of effective
quality management to industry. By encouraging a quality focus, DoD can help avoid
mismatches among values, beliefs, and behaviors. DoD should therefore encourage and
participate with industry to apply effective practices in the following areas.
At Program Startup
Evaluation considerations for each of the above areas are shown below:
    •    Participation in the integration of inspection points into processing and test
         documentation, and
    •    Role in the supplier management, development, incentivization, and control
         process.
    •    The process to identify the need for quality management, quality engineering
         (hardware and software), quality planning, supplier quality, and product
         verification skills across the life cycle,
    •    The process to identify quality skills and any associated certifications and
         qualifications, and
    •    The process for addressing quality staffing ratios and skill shortfalls.
The process for analyzing quality requirements and mitigating associated risks:
    •    The process for identifying and achieving quality tasks in support of contract
         deliverables,
    •    How a post-award contract analysis of quality tasks was performed and has
         been updated,
    •    An evaluation of how the quality plan matches the program requirements and
         their integration across program sites, IPTs, partners, and suppliers, and
    •    How quality activities factored into the Integrated Master Plan and Integrated
         Master Schedule.
         management decisions and actions on the program
           o The process for developing and identifying requirements for quality
              metrics and measurement systems
           o The system for monitoring supplier performance, including their product
              development activities
           o The process for review and update
    •    The process for addressing test and inspection findings and discrepancies,
    •    The process for addressing supplier non-conformances,
    •    Establishment and maintenance of a closed loop corrective action system that
         includes the reporting, root cause analysis, and implementation of actions
         necessary to correct and preclude recurrence of problems, failures, quality
         issues, defects/non-conformances, and
    •    The process for using lessons learned to drive continuous improvement.
    •    The process to collect, monitor, and analyze information for measuring customer
         satisfaction,
    •    The process to rapidly mitigate customer concerns,
    •    The process to communicate with customers at all levels, and
    •    The process / organizational structure for reacting to customer inquiries and
         needs.
The program manager and responsible technical authority will utilize the DoD preferred
method of acceptance as reflected in MIL-STD-1916, DoD Preferred Methods for
Acceptance of Product, (login, then URL:
https://assist.daps.dla.mil/online/parms/mainframe.cfm?ident_number=120287 ), to
allow contractors the maximum degree of flexibility to meet product or service
requirements. The preferred method is acceptance by contractor-proposed provisions
based on prevention-based strategies and process controls. The theme is partnering
between Government and contractor to develop an optimal acceptance method for
products and services that is consistent with the contract requirements for submission of
all conforming products or services.
Prior to achieving effective prevention-based strategies and process controls, MIL-STD-
1916 provides standardized acceptance sampling systems which are consistent with the
contract requirements for submission of all conforming products or services. These
sampling systems allow program managers to influence continuous improvement
through corrective action while still allowing a maximum degree of flexibility to contractors.
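The accept-on-zero character of these sampling systems can be sketched as follows. This is an illustrative decision rule only: a real plan takes its sample size from the MIL-STD-1916 verification-level tables, which this sketch does not reproduce.

```python
# Illustrative accept-on-zero (c = 0) lot acceptance decision in the spirit
# of MIL-STD-1916. The sample size here is a stand-in input; an actual plan
# would derive it from the standard's verification level / code letter tables.
import random

def accept_lot(lot, sample_size, is_conforming):
    """Accept the lot only if every sampled item conforms (zero acceptance
    number): any nonconformance in the sample rejects the lot."""
    sample = random.sample(lot, min(sample_size, len(lot)))
    return all(is_conforming(item) for item in sample)

good_lot = [{"defect": False} for _ in range(200)]
accept_lot(good_lot, 32, lambda item: not item["defect"])  # True
```

Rejection then triggers the corrective-action loop described above rather than screening alone, keeping the emphasis on prevention.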
Government Contract Quality Assurance (GCQA) is a joint responsibility of the program
office and the Defense Contract Management Agency (DCMA). Interdisciplinary skills
(such as quality assurance, industrial specialist, engineering, and software) are needed.
The program manager should establish open and effective communication with DCMA.
DCMA uses Contract Data Package Recommendation/Deficiency Reports (DD Form
1716) for the following:
For item-managed contracts, Defense Logistics Agency Inventory Control Points (ICPs)
issue Quality Assurance Letters of Instruction to DCMA to provide additional contractor
past performance history and to request tailored or specialized surveillances during
contract performance.
For defense acquisition programs, the program manager should conduct a customer
outcome strategy meeting (i.e., a post award conference) soon after the Systems
Development and Demonstration contract award. At this meeting, the participants
should:
The program manager should ensure that some of these performance measures relate
to key processes in the acquisition framework. For example, the performance measures
should be linked to the entrance and exit criteria of the systems engineering technical
reviews and the Milestone programmatic reviews during both the Systems Development
and Demonstration Phase and the Production and Deployment Phase of the acquisition
management framework.
The program manager should form a GCQA team and allow it the flexibility to formulate
a risk-based quality assurance surveillance strategy designed to ensure that customer
outcomes are achieved. The surveillance strategy should focus on the effectiveness of
the contractor's product realization process, which includes:
The surveillance strategy should also cover the contractor's continual improvement
process. To be effective, this process should be data driven, and the data should (1) be
used to address both actual and predicted problems and (2) be revised to remain
connected to process changes. In addition, the surveillance strategy should include
both periodic audits of the contractor's quality management system and product
examinations. Both independence and the use of criteria in conducting audits and
surveillance are critical to providing objective, meaningful insight.
As performance data are collected, the GCQA team should adapt the surveillance
strategy based on risks identified and the need for special surveillance of critical safety
items, critical characteristics or processes, mission critical items, key characteristics,
etc. When planned results are not achieved, the program manager should ensure that
preventive and corrective actions are developed and implemented. The GCQA team
should extend the surveillance to verify that such actions accomplished their objectives.
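One simple way to operationalize such risk-based surveillance planning is to rank candidate processes and products by a likelihood-times-consequence score. The sketch below is a hypothetical illustration, not DCMA or Guidebook methodology; the 1-5 scales and the example items are invented.

```python
# Hypothetical risk-based surveillance prioritization: score each candidate
# process or product by likelihood x consequence (1-5 scales assumed here)
# and surveil the highest-scoring items first. Example items are invented.

def risk_score(likelihood: int, consequence: int) -> int:
    """Simple risk-matrix score on assumed 1-5 scales."""
    return likelihood * consequence

def prioritize(candidates):
    """candidates: list of (name, likelihood, consequence); highest risk first."""
    return sorted(candidates, key=lambda c: risk_score(c[1], c[2]), reverse=True)

plan = prioritize([
    ("critical safety item machining process", 4, 5),   # score 20
    ("standard hardware receiving inspection", 2, 2),   # score 4
    ("supplier software build audit", 3, 4),            # score 12
])
# plan[0] is the critical safety item process
```

As surveillance data accumulate, re-scoring the list is one way to adapt the strategy, consistent with the adjustment described above.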
11.3.3.4.2. Government Contract Quality Assurance (GCQA) Inspections
Per FAR Parts 46.402 and 46.404, the program manager shall use destination
inspection for contracts or purchase orders under $250,000 for the reprocurement of
items with no significant technical requirements, no critical characteristics, no special
features, and no specific acquisition concerns, and where there is confidence in the
contractor. Such inspections are limited to kind, count, and condition and may involve
preservation, packaging, and marking (if applicable). FAR 52.246-1 should be placed on
the contract. Use FAR 52.246-2 without FAR 52.246-11 only in those rare
circumstances where there is reason to believe that there may be a problem.
The program manager should put both FAR 52.246-2 and FAR 52.246-11 (or FAR
52.246-8 for research and development programs) on the contract. FAR 52.246-2
allows Government access to the facility and requires the contractor to develop and
maintain an inspection system. FAR 52.246-11 requires the contractor to implement a
higher level quality management system. The responsible technical authority should
prepare a Quality Assurance Letter of Instruction through the contracting officer to
ensure that appropriate product specifications, drawings, and inspection and test
instructions, including critical characteristics, are available and/or identified for use by
the Defense Contract Management Agency. GCQA at the source encompasses one or
more of the following based on defined risk:
Special attention must be paid to critical safety items (CSIs), regardless of whether they
are item-managed or program-managed. Defense Federal Acquisition Regulation
Supplement (DFARS) 246.103 states that the activity responsible for technical
requirements may prepare
instructions covering the type and extent of Government inspections for acquisitions that
have critical applications (e.g., safety) or have unusual requirements. Section 4.3.18.6
discusses CSIs as a systems engineering design consideration. It provides a definition
and links to some additional reference material.
The contracting officer should clearly mark the front page of the solicitation/contract with
the words "Critical Safety Item." This raises the alertness level and makes everyone
aware that CSIs are involved in the contract. When CSIs are determined after contract
award, the responsible technical authority should use the words "Critical Safety Items"
in the subject line of a Quality Assurance Letter of Instruction (QALI). All critical and
major characteristics, the extent of inspection required, and the associated acceptance
criteria should be described either in the contract or in the QALI. In addition, the
technical authority should provide criteria for special inspections, process verification, or
similar requirements. Acceptance criteria should also include additional instructions for
situations where a CSI is purchased from a distributor, a CSI is purchased on a
commercial contract, or CSI critical characteristics cannot be fully examined at a prime
contractor's facility. To assure the communications loop is closed with the Defense
Contract Management Agency (DCMA), the QALI should request acknowledgement and
DCMA acceptance of the duties included within it. The acknowledgement should be
returned to the responsible technical authority that transmitted the QALI.
Public Law 108-136, "National Defense Authorization Act for FY04," Section 802,
"Quality Control in the Procurement of Aviation Critical Safety Items and Related
Services," requires that the head of the design control activity for aviation critical safety
items establish processes to identify and manage the procurement, modification, repair,
and overhaul of aviation critical safety items. DoD procedures for managing aviation
CSIs are contained in the Joint Service instruction, "Management of Aviation Critical
Safety Items," and the Joint Aeronautical Logistics Commanders' Aviation Critical Safety
Items (CSIs) Handbook. Additionally, per DFARS 246.407, the head of the design
control activity is the approval authority for acceptance of any nonconforming aviation
critical safety items or nonconforming modification, repair, or overhaul of such items.
they have a concern, question, or possess additional information important to achieving
customer outcomes.
Implicit in this approach is the need to conduct the activities that capture relevant
product development knowledge, and that might mean additional time and dollars.
However, knowledge provides the decision maker with higher degrees of certainty and
enables the program manager to deliver timely, affordable, quality products.
The following knowledge points and ensuing considerations coincide with decisions
along the acquisition framework:
Program Initiation. Knowledge should indicate a match between the needed capability
and available resources before a program starts. In this sense, "resources" is defined
broadly to include technology, time, and funding.
Knowledge about a technology should be based on demonstrated accomplishments. If
a technology is not mature, the DoD Component must use an alternative technology or
discuss modifying requirements with the users.
By requiring proven technology before a program starts, we reduce uncertainty. Rather
than addressing technology development and product development, the program
manager and Milestone Decision Authority can focus on product development, because
they know the technology is available. DoD Instruction 5000.02 enforces this concept
with the following policy:
             Technology developed in S&T or procured from industry or other
             sources shall be assessed to determine whether they are considered
             mature enough to use for product development (see the "Technology
              Readiness Assessment (TRA) Guidance"). . . . If technology is not
             mature, the PM shall use alternative technology that is mature and
             that can meet the user’s needs or conduct a dialog with the user to
             modify the requirements. Technology readiness assessments shall be
             conducted by the PM and used by the MDA to assist in determining
             whether program technologies have acceptable levels of risk based in
             part on the degree to which they have been demonstrated, including
             demonstration in a relevant environment, and to support risk
             mitigation plans prepared by the PM. They will be focused on the
             specific planned technical solution.
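The policy quoted above amounts to a gate check on technology maturity. The sketch below illustrates the idea only: the TRL threshold of 6 (commonly associated with demonstration in a relevant environment) and the example data are assumptions for illustration, not DoD Instruction 5000.02 text.

```python
# Illustrative maturity gate reflecting the quoted policy: flag critical
# technologies not yet demonstrated in a relevant environment (assumed here
# to mean TRL >= 6), for which the PM must find a mature alternative or
# revisit requirements with the user. Threshold and data are assumptions.

RELEVANT_ENVIRONMENT_TRL = 6

def technologies_needing_action(critical_technologies):
    """critical_technologies: {name: assessed TRL}. Return those below the
    assumed maturity threshold."""
    return [name for name, trl in critical_technologies.items()
            if trl < RELEVANT_ENVIRONMENT_TRL]

immature = technologies_needing_action({"seeker": 7, "datalink": 5, "warhead": 6})
# immature == ["datalink"]
```

A Technology Readiness Assessment would supply the assessed TRLs; the gate itself is what the policy asks the PM and MDA to apply before product development.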
Program managers should maximize the use of Defense Contract Management Agency
(DCMA) personnel at contractor facilities. The program manager should only assign
technical representatives to a contractor's facility as necessary. Technical
representatives shall not perform contract administration duties as outlined in Federal
Acquisition Regulation (FAR) Section 42.302(a) .
DCMA prioritizes and balances its contract management activities to reduce acquisition
risk by focusing limited resources on the highest risk processes, products, and
programs. DCMA's mission is best achieved when there is open communication and
teaming between DCMA and its acquisition partners and when there is a full
understanding of all program risks and acquisition objectives.
While DFARS 242.202 allows for limited exceptions to DCMA performing contract
administrative functions, it is not a prudent use of limited DoD resources for buying
activities to duplicate the contract administration functions assigned to DCMA. Similarly,
DCMA's acquisition partners are not authorized to audit DCMA operations. In our
constrained fiscal environment, organizations should not be expending precious funds
to perform functions budgeted elsewhere by the Department. This duplication may
create additional costs for Industry, and ultimately the Department; these are costs that
we cannot afford.
Per DFARS PGI 242.74, when the program, project, or system manager determines that
a technical representative is required, the manager shall issue a letter of intent to the
contract administration office commander listing the assignment location, starting and
ending assignment dates, technical duties assigned, delegated authority, and support
required from the contract administration office. Any issues regarding the assignment of
a technical representative should be resolved promptly. However, the final decision on
the assignment remains with the program manager. Issues regarding the assignment of
technical duties that cannot be resolved between the program office and the on-site
DoD contract administration office will be elevated.
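The required contents of the letter of intent described above can be captured as a simple checklist. The sketch below is illustrative only; the field names paraphrase the text, and the validation helper is an assumption, not official tooling.

```python
# Illustrative checklist of letter-of-intent contents paraphrased from
# DFARS PGI 242.74 as summarized in the text above. The field names,
# helper function, and example values are hypothetical.
REQUIRED_FIELDS = [
    "assignment_location",
    "start_date",
    "end_date",
    "technical_duties",
    "delegated_authority",
    "support_required",
]

def missing_fields(letter):
    """Return any required fields that are absent from or empty in the letter."""
    return [f for f in REQUIRED_FIELDS if not letter.get(f)]

# Hypothetical draft letter missing one required element:
letter = {
    "assignment_location": "Contractor facility (example)",
    "start_date": "2014-01-15",
    "end_date": "2014-07-15",
    "technical_duties": "Monitor software integration testing",
    "delegated_authority": "Technical liaison only",
}
print(missing_fields(letter))
```

A draft like this would be completed (all fields present) before issuing the letter to the contract administration office commander.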
The program, project, or system manager will furnish the designated technical
representative a letter of assignment of delegated technical duties, with copies to the
contract administration office, the contracting officer, and the contractor, at least 30
days before the assignment date (or termination date). Any changes to the
requirements of the assignment letter will be made by a new letter of intent and
processed in accordance with paragraph (1) of this section.
The contract administration office normally provides the technical representative with
office space, equipment, supplies, and part-time clerical support. The program, project,
or system manager provides supervision, technical direction, administrative services
(e.g., pay, travel, maintenance of personnel records), and, when required, full-time
clerical support.
The program manager or designee and the contract administration office, at the local
level, shall negotiate a memorandum of agreement (MOA) delineating their functional
and administrative interrelationships, with annual updates as necessary. The agreement
may be included in an existing MOA, if one exists, or established as a separate MOA.
The technical representative shall keep the contract administration office commander
fully informed of matters discussed with the contractor. The contract administration
office shall also keep the technical representative fully informed of contractor
discussions that relate to technical matters within the purview of the technical
representative's assigned duties.
The program office staff should interface with contractors' councils, keeping in mind that
such councils are not federal advisory committees under the Federal Advisory
Committee Act . The staff may find that these councils strengthen the corporate
relationship with the Department of Defense, provide an interface between company
representatives and acquisition managers, communicate acquisition reform initiatives,
or even resolve issues. In leading corporate endeavors, such as Standard Procurement
System proposals, civil-military integration ideas, or other initiatives designed to achieve
efficiencies for the company, these councils may ultimately produce savings for the
Government.
11.7. Property
Government property in the possession of contractors (GPPC) includes Government
property that is not "owned" by the program manager, but is "used" on the program.
Government property may only be furnished to contractors under the criteria,
restrictions, and documentation requirements addressed in Federal Acquisition
Regulation 45.102 and Procedures, Guidance, and Information 245.105.
DoD policies, processes, and practices are structured on delivery, receipt, and
acceptance of property. This aligns and is consistent with other DoD processes and
practices (e.g., Wide Area Work Flow, Unique Item Identification). (Note: Wide Area
Work Flow site access is conditional based on registration and identification of user
roles.) Although the DoD may have title to some property (e.g., property acquired,
fabricated, or otherwise provided by the contractor for performing a contract), such
property has not yet been delivered.
contract, consistent with the terms and conditions of the contract or third party
agreement, for the Government property in their care.
Although the Department of Defense may not have physical custody, to maintain
effective property accountability and control and for financial reporting purposes, DoD
Components are required to establish records and maintain accountability for property
(of any value) furnished to contractors as Government Furnished Property.
Modeling and Simulation (M&S) capabilities can significantly improve the efficiency and
effectiveness of conceptualization, development, experimentation, test, and sustainment
activities during the life cycle of DoD systems. The program manager should employ
M&S resources and products during system design, test and evaluation, modification,
upgrade, and operations and sustainment. The program manager should collaborate
with the weapon system operational users, analysis agencies, and test and training
activities (e.g., government laboratories and facilities), and should consider industry
inputs during M&S program planning. Planning should include the application, support,
documentation, and reuse of M&S resources, including data and analyses generated
both within and outside the program of record, as well as the integration of M&S across
functional disciplines.
The following additional considerations are useful during M&S planning activities:
    •    Plan for M&S and make necessary investments early in the acquisition life cycle.
    •    Incorporate M&S tools to improve the requirements development process.
    •    Employ M&S tools to assist in the evaluation of contractor proposals.
    •    Develop system models in preparation for use across a wide range of disciplines
         (e.g., use of CAD/CAM for training manuals).
    •    Identify or define standards and technical requirements that support re-use or
         leverage of M&S resources and products throughout the system life cycle to the
         greatest extent possible. Where it is necessary to invest in M&S development,
         ensure that licensing is appropriate and avoid granting the developer exclusive rights.
    •    Use and reuse models and simulations, modified as appropriate to the task, in
         order to provide consistent and efficient test planning, pre-test results prediction,
         post-test evaluation, and the validation of system interoperability; and to
         supplement design qualification, actual test and evaluation, manufacturing, and
         post-production and operational support.
    •    Employ verified, validated models and simulations, and ensure credible
         applicability for each proposed use.
    •    Use data from other activities (e.g., development test) during weapon system
         development to assist in model, simulation, and data validation.
    •    Involve the developmental and operational test agencies in M&S planning early
         in the application of M&S to efficiently support both developmental test and
         operational test objectives.
    •    Have the Defense Intelligence Agency review and validate threat-related
         elements of the models and simulations.
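The "employ verified, validated models" consideration above can be sketched as a simple gate that checks a model's verification, validation, and accreditation status before it is used. The record format, function name, and example uses below are illustrative assumptions, not a DoD-prescribed scheme.

```python
# Illustrative gate: a model should be employed only when it is verified,
# validated, and accredited for the specific intended use. The record
# layout and the example model/uses are hypothetical.
def credible_for_use(model, intended_use):
    """Return True only if the model is verified, validated, and its
    accreditation covers the specific intended use."""
    return (model["verified"]
            and model["validated"]
            and intended_use in model["accredited_uses"])

# Hypothetical simulation record:
flyout_sim = {
    "name": "missile_flyout_sim",
    "verified": True,
    "validated": True,
    "accredited_uses": {"pre-test prediction", "post-test evaluation"},
}

print(credible_for_use(flyout_sim, "pre-test prediction"))
print(credible_for_use(flyout_sim, "training"))
```

Keeping accreditation tied to specific uses mirrors the guidance that credible applicability must be ensured for each proposed use, not assumed from a prior one.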
DEFENSE ACQUISITION GUIDEBOOK
Chapter 12 - Defense Business System Definition and Acquisition Business
Capability Lifecycle (BCL)
12.0. Overview
12.0.1. Contents
12.3. Execution
12.0. Overview
The purpose of this chapter is to provide guidance for executing the Business Capability
Lifecycle (BCL); it is intended for use with existing policy for the definition and
acquisition of defense business systems (DBS). It serves as a reference to support all
Department of Defense (DoD) staff with responsibilities throughout a DBS's lifecycle.
12.0.1. Contents
This overview discusses the Business Capability Lifecycle (BCL) at a very high level
and includes the major BCL tenets and a summary of the BCL process. Following the
overview, the chapter includes the following sections:
Section 12.1., Business Capability Definition Phase, provides an overview of the
Business Capability Definition (BCD) Phase process and an explanation of the rigorous
up-front analysis conducted during BCD Phase activities that result in the Problem
Statement section of the Business Case.
Section 12.3., Execution, provides an overview of the last four phases of BCL and an
explanation of the activities for each of these phases that result in a fully designed,
developed, tested, deployed, and sustained increment of capability (materiel and non-
materiel solution) that satisfies the specific outcomes defined in the Business Case.
Section 12.5., Tools and Methods, provides information on the various tools and
methods used in conjunction with BCL activities, including an explanation of a Doctrine,
Organization, Training, Materiel, Leadership and education, Personnel, Facilities and
Policy (DOTMLPF-P) analysis and an explanation of outcomes and measures
development.
The Department has instituted the Business Capability Lifecycle (BCL) to address the
unique challenges of information technology (IT) - recognizing that the deliberate
weapon systems acquisition process is not agile enough to meet the speed at which
new IT capabilities are being introduced to the market.
BCL is the overarching framework for the planning, design, acquisition, deployment,
operations, maintenance, and modernization of defense business systems (DBS). It
promotes rapid delivery of business capability by facilitating a process tailored to the
unique requirements of DBS. It is based on industry best practices, studies, and
emerging legislation. It realigns three separate Offices of the Secretary of Defense
(OSD) oversight processes into a single process, per Figure 12.0.2.F1 .
   Figure 12.0.2.F1 - Re-Alignment of Major DoD Processes for Business Systems
BCL mandates delivering capability quickly, within 18 months or less, recognizing that
new and evolving IT requires frequent upgrades, that requirements must be
reprioritized, and that new ones will emerge. The goal of BCL is to enable the
acquisition and deployment of DBS capabilities to match the speed at which they
become viable. Industry and government have learned that the big-bang approach to
delivering IT seldom meets user expectations; the time lag between requirements
definition and delivery is often too long.
12.0.3. Tenets
         produced, deployed, and sustained;
    •    May consist of multiple capability releases to facilitate delivery, while abiding by
         Milestone Decision Authority (MDA)-approved BCL time-limited delivery rules;
         and
    •    Must be fully-funded.
    •    The MDA has overall responsibility for DBS acquisition decisions and issues
         Acquisition Decision Memorandums (ADMs) at appropriate decision points.
    •    Investment Review Boards (IRBs) provide the structure that integrates
         requirements, investment, and acquisition reviews along with portfolio
         management for DBS; they also act as the Overarching Integrated Product
         Teams (OIPTs) for Major Automated Information Systems (MAIS) DBS,
         advising the MDA for acquisition purposes.
    •    The Defense Business Systems Management Committee (DBSMC) makes final
         decisions on granting obligation authority (i.e., certification decisions) via the
         IRB process for all DBS and capabilities (i.e., requirements) costing more than
         $1M over the current Future Years Defense Program (FYDP).
    •    For DBS that are not MAIS, DoD Components are expected to establish similar
         processes and procedures as described in policy and this DAG chapter.
More information on governance and oversight processes and procedures is detailed
in section 12.4.2, BCL Governance.
4. Base acquisition decisions on risk and rigor, not document format. The
objective is to bring the right information and people to the point of decision-making
while eliminating non-value-added documents. This tenet is based on the assumption
that decisions should be based on the risk of delivering the capability to cost and
schedule, and on how prepared the program is to do so, not on whether the program
has produced an over-abundance of properly formatted documentation with the "right
approvals."
Supporting this tenet, BCL (1) encourages tailoring of both the process and the
information requirements, and (2) employs a Business Case and Program Charter.
From a program structuring / process perspective, the goal should be to design and
scope a program that will deliver capability rapidly, while tailoring out unneeded or
non-value-added steps. When tailoring information requirements, decision authorities
should require PMs and other participants in the defense acquisition process to present
only the minimum information necessary to establish the program baseline, describe
program plans, understand program status, and make informed decisions; tailoring in
information requirements by the MDA is the recommended approach to achieve this. In
general, tailoring should consider multiple factors, including program size, scope, risk,
urgency of need, and technical complexity. The PM proposes and the MDA approves
tailoring decisions in an ADM.
BCL's use of a Business Case (which integrates program-level plans and information for
decision makers) and a Program Charter (which outlines roles, responsibilities, and
organizational agreements) reduces the amount of documentation that must be
coordinated, particularly at the Office of the Secretary of Defense (OSD) level, reducing
time and cost. Program-level documentation may still be coordinated and approved
within the Component, but does not need to be approved by OSD. This approach
places the focus on the need, the solution, and the risk, not the amount of
documentation. The Business Case Template and Program Charter Template are
available on the Office of the Deputy Chief Management Officer (DCMO)'s BCL
webpage.
12.0.4. Process
12.1. Business Capability Definition (BCD) Phase
The purpose of the Business Capability Definition (BCD) Phase is, upon the
identification of a problem, need, or gap, to analyze it, understand it, and scope it.
12.1.2. BCD Phase Process
The Business Capability Definition (BCD) Phase begins with identification of a
business need (note: it can also be considered a problem, symptom, gap, opportunity,
or myriad other things, but in the Business Capability Lifecycle (BCL) and throughout
this Chapter it is referred to as a "problem" or "business need"). Multiple activities occur
in this phase, including clearly defining the problem and its root cause(s); conducting an
"As-Is" and "To-Be" Doctrine, Organization, Training, Materiel, Leadership and
education, Personnel, Facilities and Policy (DOTMLPF-P) Analysis; conducting high-
level Business Process Re-engineering (BPR) through the Functional Sponsor's
assessment of the desired outcomes; proposing a solution mix (either solely non-
materiel, or a mix of materiel and non-materiel); and defining / validating High-Level
Outcomes (HLOs) and measures that scope it. These activities result in a Problem
Statement, the main output of the BCD Phase.
Once the Component has approved the Problem Statement, the Functional Sponsor will
forward it to the Investment Review Board (IRB) for review and IRB Chair approval. If
approved, the IRB Chair will subsequently direct the development of Analysis of
Alternatives (AoA) Study Guidance to allow the Component to develop an AoA Study
Plan for approved Problem Statements that contain a materiel component.
  TIP: It is critical that functional users, who not only have an understanding of the
  problem but are also invested in its outcome, are involved in BCD Phase analysis to
  ensure the problem is well-understood and outcomes are developed correctly from
  the outset.
In the Business Capability Lifecycle (BCL), the Business Capability Definition (BCD)
Phase consists of rigorous analysis activities, the output of which is the Problem
Statement (sections 1-3 of the Business Case Template, available on the Office of the
Deputy Chief Management Officer (DCMO)'s BCL webpage ). The final Problem
Statement can be developed either step-by-step as BCD Phase analysis activities are
completed or after all analysis has been completed. The activities involved in
developing the Problem Statement are depicted in Figure 12.1.3.F1 .
 Figure 12.1.3.F1 - Decomposition of "Develop Problem Statement" Process Step
TIP: When conducting the BCD Phase analysis, consider the following questions:
During Business Capability Definition (BCD) Phase "As-Is" Analysis, users / functional
experts take the problem that has been identified, put it into appropriate context
(organizational, environmental, etc.), and conduct both a Root Cause Analysis and a
Doctrine, Organization, Training, Materiel, Leadership and education, Personnel, Facilities and
Policy (DOTMLPF-P) Analysis (or, a DOTMLPF-P Constraints Assessment).
The business need is analyzed by users and functional subject matter experts (SME)
who understand the business processes and environment in which the business need
exists. This analysis results in a concise definition of the problem.
The analysis must take into consideration the functional scope and organizational span
of the problem - who is affected by it and where (i.e., Component-only or Enterprise-
wide). It also includes describing the problem in further detail and providing context and
boundaries (i.e., functional scope and organizational span), which expresses the
problem in a manner that is specific, testable, and quantitative in nature.
It is important to bound this analysis to ensure it does not reach into the territory of
specific, potential solutions ("a system will fix this" or "an Oracle ERP is the answer")
and focuses simply on defining the problem.
The following is a summary of the Determine Problem to be Solved activity along with
an example:
    2. The Functional Sponsor directs a team to analyze the problem that was
       identified.
Analyze and Validate. The Problem: It takes too long to get a security clearance.
The problem is decomposed and validated with anecdotes such as: the end-to-end
processing of an initial top secret clearance took 311 days; GAO has reported concerns
about the quality of investigative and adjudicative work in processing clearances; and it
is impossible for facility security officers to get clearance information in a timely manner.
This information, which is intended to provide a better understanding of the problem, will
feed Root Cause Analysis.
Root Cause Analysis. One of the issues the Department faces with successfully
fielding information technology (IT) business capabilities is making the leap from
problem to solution too quickly, resulting in a solution that doesn't meet the fundamental
business need but rather provides temporary "band-aids" for its symptoms. The
tendency to "do something now!" must be appropriately balanced with a process that
mitigates the risk of fixing a symptom vs. its root cause(s). Within BCL, the expectation
is that functional SMEs will analyze the problem to ensure complete understanding of
its true or root causes.
  TIP: There are many definitions of a "root cause". The United States Air Force Air
  War College defines a "root cause" as "the fundamental breakdown or failure of a
  process which, when resolved, prevents a recurrence of the problem."
  http://www.au.af.mil/au/awc/awcgate/nasa/root_cause_analysis.pdf
There is no single methodology for performing Root Cause Analysis and various
approaches (such as brainstorming, "5-Whys" analysis, and Cause and Effect
[Fishbone] diagrams ) can yield satisfactory results. A good option is to consult your
Component's Lean Six Sigma point-of-contact for guidance.
For example, the "5-Whys" analysis starts by asking "Why did the problem occur?"
and then takes the answer and asks that same question of the answer. This question-
and-answer cycle is repeated until you reach the fundamental process element that
failed. Regardless of the methodology used, it is imperative that the Root Cause
Analysis be thorough to ensure that resources are focused on the right item.
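The question-and-answer cycle just described can be sketched in a few lines. The function names and the security-clearance answers below are illustrative assumptions, not an official method or official findings.

```python
# Illustrative sketch of a "5-Whys" chain: each successive answer explains
# the previous level, and the deepest answer is treated as the candidate
# root cause. Names and example content are hypothetical.
def five_whys(problem, answers):
    """Build the chain of 'Why?' answers, starting from the problem."""
    chain = [{"level": 0, "statement": problem}]
    for i, answer in enumerate(answers, start=1):
        chain.append({"level": i, "statement": answer})
    return chain

def candidate_root_cause(chain):
    # The deepest answer in the chain is the candidate root cause.
    return chain[-1]["statement"]

# Hypothetical chain mirroring the security-clearance example in the text:
chain = five_whys(
    "It takes too long to get a security clearance.",
    [
        "Investigations wait in a processing queue.",
        "Case data must be re-entered between agencies.",
        "Data and processes are not standardized across agencies.",
    ],
)
print(candidate_root_cause(chain))
```

In practice the number of "whys" is not fixed at five; the cycle stops when a fundamental, fixable process element is reached.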
The following is a summary of the Root Cause Analysis activity along with an example:
    1. If a team has not yet been formed, the Functional Sponsor at this point should
       establish a team to conduct Root Cause Analysis. SMEs may be involved at
       multiple levels of the security clearance process.
    2. The team examines the problem and its context, and identifies what may be
       symptoms or root causes.
    3. Based on expert judgment, the team analyzes the information to separate root
       causes from symptoms. Some of the root causes on security clearances
       were:
Root Causes. Data and processes are not standardized across agencies; there are
difficulties obtaining information from some national, state and local record providers;
and, not enough resources are available to handle the number of security clearance
requests.
             Figure 12.1.3.1.F3 - DOTMLPF-P Constraints Assessment Context
These causal factors are referred to as "DOTMLPF-P constraints" and help determine
whether the problem can be solved by eliminating DOTmLPF-P constraints. For
example, "changing the Component-level policy will achieve the desired outcomes". It is
important to understand the impacts and consequences of implementing non-materiel
changes just as much as materiel changes; revising policy or enhancing training
programs, for example, can have obvious benefits but may also add cost and risk that
must be mitigated. It is highly likely that DOTmLPF-P factors or underlying business
processes are contributing to the problem, and will in fact contribute to the solution.
  TIP: It is possible that the solution can be entirely DOTmLPF-P, consisting of, for
  example, a combination of policy, Component-level guidance, BPR, and re-training
  solutions. Completing this analysis thoroughly may help determine that a materiel
  solution is not needed at all, or that the problem has many more factors that need
  to be solved (i.e., is more complex) than originally thought.
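One way to sketch the constraints assessment is to map each root cause to a DOTMLPF-P category and flag whether any fall outside the non-materiel set. The category labels, the mapping, and the function below are illustrative assumptions, not a prescribed analysis method.

```python
# Illustrative sketch: classify root causes by DOTMLPF-P category and flag
# whether any require a materiel solution. The non-materiel set here
# excludes "Materiel"; the example mapping is hypothetical.
NON_MATERIEL = {"Doctrine", "Organization", "Training", "Leadership",
                "Personnel", "Facilities", "Policy"}

def needs_materiel_solution(cause_categories):
    """True if any root cause falls outside the non-materiel categories."""
    return any(cat not in NON_MATERIEL for cat in cause_categories)

# Hypothetical classification of the security-clearance root causes:
causes = {
    "Data and processes not standardized across agencies": "Materiel",
    "Not enough resources to handle clearance requests": "Personnel",
}
print(needs_materiel_solution(causes.values()))
```

If every root cause mapped to a non-materiel category, the solution could be entirely DOTmLPF-P, as the TIP above notes.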
Analysis Activity along with an example:
The "To-Be" Analysis consists of the identification of High-Level Outcomes (HLOs) and
measures, Business Process Re-engineering (BPR), and a "To-Be" Doctrine,
Organization, Training, Materiel, Leadership and education, Personnel, Facilities, and
Policy (DOTMLPF-P) Analysis.
Identify High-Level Outcomes (HLOs) and Measures. The Functional Sponsor (who
represents the needs of the user[s] that originally identified the problem) is ultimately
responsible for declaring whether the needed capability has been delivered. Therefore,
measurable High-Level Outcomes (HLOs) must be identified up-front so all
stakeholders know what constitutes success. Too often, programs begin without a clear
understanding of what the end-state should be, and subsequently development (and
corresponding costs) becomes endless.
More information on the outcome development process that begins in the BCD Phase
and extends throughout BCL can be found in section 12.5.3 , Outcomes and Measures
Development .
   TIP: Major updates to a Business Case once it has been approved require review
   and re-approval by appropriate authorities, which may ultimately cause a delay in
   capability delivery. Therefore, it is imperative that the analysis of "what good looks
   like" is performed as thoroughly as possible, and HLOs and their associated
   measures provide a definitive baseline for testing during the Execution Phase. The
   Problem Statement, once approved, should not change.
The following is a summary of the Identify HLOs and Measures activity along with an
example:
Risk identification started during this activity begins a continuous process of risk
management throughout the lifecycle of the program. This is a good point to establish a
formal process of identifying, documenting, managing, and mitigating risks. An example
of risk documentation is shown in section 12.2.3.2 , Define Risks and Risk Mitigation,
Table 12.2.3.2.T2 .
Strategic linkage: Enhance the DoD Civilian Workforce (DoD SMP Business Goal #4).
    3. Based on the determined HLO, the Functional Sponsor works with the team to
       determine a quantitative metric for the success of the HLO:
              o    (metric) Days from application to clearance granted.
              o    (measure) Current: 444 days; Target: 60 days.
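Purely for illustration, the metric/measure pair above can be expressed as a small structure that computes the improvement the target implies. The class and field names are hypothetical, not part of BCL guidance:

```python
from dataclasses import dataclass

@dataclass
class HLOMeasure:
    """A measurable High-Level Outcome (HLO) metric with baseline and target."""
    metric: str
    current: float
    target: float

    def required_improvement(self) -> float:
        """Fraction of the current value that must be eliminated to reach the target."""
        return (self.current - self.target) / self.current

# The security-clearance example from the text: 444 days today, 60-day target.
clearance_time = HLOMeasure("Days from application to clearance granted", 444.0, 60.0)
print(f"{clearance_time.required_improvement():.1%}")  # → 86.5%
```

Quantifying the gap this way gives all stakeholders an unambiguous definition of success before development begins.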
There must be an understanding of the "As-Is" processes for BPR to be effective so that
defects and issues can be identified and eliminated in the eventual "To-Be" state. The
ideal business process is defined during the initial BPR; specific, actionable
business outcomes will be developed based on the HLOs, and potential courses of
action will emerge (more information on the outcome development process is located in
section 12.5.3 , Outcomes and Measures Development ).
For more information on BPR, see the Office of the Deputy Chief Management Officer
(DCMO)'s BPR webpage.
  TIP: Content in the Business Case's Problem Statement should not be replicated in
  a BPR Assessment Form. Refer to the original content through reference or
  hyperlink.
End-to-End (E2E) Process Alignment . For BCL, a primary input to conducting BPR is
aligning HLOs to the Department's End-to-end (E2E) Business Process Flows, which
are mapped to the Business Enterprise Architecture (BEA) . Business outcomes and
activities developed during initial BPR should align to an E2E Business Process and
corresponding E2E Flows to show which E2Es will be affected if the "To-Be" state is
realized.
The DoD has currently identified 15 E2E Business Flows, which represent a combination
of mature, industry best practices and DoD-specific business functions. Each E2E
Business Flow is a value chain that represents a set of integrated business functions
that fulfill a need identified by the organization. E2Es are cross-functional, cutting across
organizational boundaries. By streamlining business processes using an end-to-end
approach, organizations can create consistent data models; eliminate data
redundancies, duplicate data entry, and manual reconciliations between DBS; and
reduce the total life-cycle costs of the organization's DBS.
The Functional Sponsor will identify which E2E(s) will be affected by matching HLOs
and business outcomes to the E2E Flows. More information on the E2Es, including
instructions on how to allocate Business Processes to Business Flows and document
them in the Business Case can be found in section 12.5.4 , BEA and BCL. Additional
information and reference material are available on the DCMO's BEA webpage .
         out of E2E business flows. Leveraging the BEA through the E2Es, the team
         aligns, maps, and decomposes processes. Refer to section 12.5.4 , BEA and
         BCL and the DCMO's BEA webpage .
The results of the BPR and the E2E alignment are summarized in the Business Case by
showing the decomposition of the HLOs into subordinate business outcomes. The
important characteristics of each business outcome are also defined, specifically:
measures, benefits, risks, assumptions, dependencies, and constraints.
    1. Determine the business outcome(s) that support achieving each HLO applicable
       to the security clearances problem:
Business Outcome Definition: Will provide Security Officers with a listing of security
clearance determinations, eliminating the unnecessary processing of clearance
applications for applicants with prior clearance investigations or adjudicative
determinations.
    2. Align the business outcomes to the BEA's E2E Business Process Flows and then
       drill-down to underlying Business Processes and Business Capabilities
       applicable to the security clearances problem:
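As an illustrative sketch only, the decomposition and alignment described above could be captured in a record like the following; the field names mirror the characteristics listed in the text (measures, benefits, risks, assumptions, dependencies, constraints), but the class layout and the example E2E flow value are assumptions, not a mandated schema:

```python
from dataclasses import dataclass, field

@dataclass
class BusinessOutcome:
    """One business outcome decomposed from an HLO, with its characteristics
    and its alignment to the BEA's E2E Business Process Flows."""
    hlo: str                  # parent High-Level Outcome
    definition: str
    e2e_flows: list = field(default_factory=list)   # affected E2E Business Flows
    measures: list = field(default_factory=list)
    benefits: list = field(default_factory=list)
    risks: list = field(default_factory=list)
    assumptions: list = field(default_factory=list)
    dependencies: list = field(default_factory=list)
    constraints: list = field(default_factory=list)

# Hypothetical instance based on the security-clearance example in the text.
outcome = BusinessOutcome(
    hlo="Reduce time from clearance application to determination",
    definition=("Provide Security Officers with a listing of prior clearance "
                "determinations to eliminate unnecessary re-processing"),
    e2e_flows=["Hire-to-Retire"],  # assumed alignment, for illustration
    measures=["Days from application to clearance granted"],
)
print(outcome.hlo)
```

Keeping each outcome in one structured record makes the Business Case summary a straightforward roll-up rather than a reconstruction.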
                 Figure 12.1.3.2.F3 - DOTMLPF-P Impact Assessment Context
DOTMLPF-P Impact Assessment Example. Table 12.1.3.2.T1 is an example summary
output of a "To-Be" DOTMLPF-P analysis (Note: more detailed information does not
appear in the Business Case and is kept as "working papers"):
        Figure 12.1.3.3.F1 - Determine Recommended Course of Action Context
No specific solutions are recommended at this point, but based on the analysis
conducted, sufficient information is available to inform decision makers that a Materiel
Development Decision (MDD) may be required to move forward in the process and, if
so, enable an Analysis of Alternatives (AoA) to explore specific materiel solution
options.
The Functional Sponsor's recommendation(s) is one of the factors that the Investment
Review Board (IRB) Chair will consider when reviewing and determining whether to
approve the Problem Statement. The Functional Sponsor's recommendation should
also include any DOTMLPF-P impacts.
"developed when a quick estimate is needed and few details are available. Usually
based on historical ratio information, it is typically developed to support what-if analyses
and can be developed for a particular phase or portion of an estimate to the entire cost
estimate...."
It is important to provide a best guess for the ROM as it will be an indicator of the level
of oversight required after Problem Statement approval.
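To make the ratio-based approach concrete, here is a minimal sketch of a ROM estimate built from historical cost ratios. Every ratio and dollar figure below is invented for illustration and carries no official weight:

```python
# Each cost element is estimated as a fixed ratio of a known driver (here,
# software development cost). These ratios are hypothetical placeholders.
historical_ratios = {
    "program_management": 0.10,  # 10% of development cost
    "testing": 0.20,
    "training": 0.05,
    "sustainment": 0.40,
}

def rom_estimate(development_cost: float) -> float:
    """Total ROM = development cost plus all ratio-derived cost elements."""
    return development_cost * (1 + sum(historical_ratios.values()))

# A notional $10M development effort yields a $17.5M total ROM.
print(rom_estimate(10_000_000))  # → 17500000.0
```

Even a coarse figure like this is enough to signal whether the effort will fall at, above, or below MAIS thresholds, which drives the level of oversight after Problem Statement approval.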
Review. Considerations for this activity are outlined in Figure 12.1.3.3.F4 .
  When compiling the Problem Statement, and before presentation to the IRB, the
  Functional Sponsor should determine whether the following questions have been
  adequately answered / addressed:
       1. Does the Problem Statement concisely and convincingly demonstrate that the
           business need exists and merits solving?
       2. Have comprehensive Root Cause and DOTMLPF-P analyses been
          performed?
       3. Has this business need / problem already been solved in the Department, as
          discovered through initial research?
       4. Have specific and measurable success factors been defined and agreed
          upon among the functional and stakeholder community?
       5. Do initial BPR efforts result in enough streamlining and efficiencies to warrant
          further analysis and continued investment?
       6. Is it clear what the Functional Sponsor is seeking from the decision maker,
          and what steps / activities will take place after the decision?
activity:
Once the Functional Sponsor determines that the Problem Statement is ready for
Investment Review Board (IRB) review, it is submitted to the IRB Support Staff.
Procedures for IRB submittal can be located in the "Defense Business Systems
Investment Management Process Guidance", June 2012 . The package must include
Sections 1-3 of the Business Case signed by the Functional Sponsor and a summary
slide containing a short description of the problem, the desired outcome or "what good
looks like", the ROM Cost Estimate, and the proposed business value to the
Department / end-user.
At the IRB, a review is conducted that will generally address items #1-6 of Figure
12.1.3.3.F4 , in addition to the Enterprise and portfolio implications of the need and the
Functional Sponsor's recommendation(s) for the way ahead. The IRB Chair will approve
or disapprove the Problem Statement or may send it back to the Component for
additional work. For an IRB Chair-approved Problem Statement, one of the following
occurs:
12.1.3.5. Materiel Development Decision (MDD) Preparation
Information on developing the AoA Study Plan and sample AoA Study Plan outlines can
be found in DAG Chapter 3, Section 3.3, "Analysis of Alternatives" . The AoA Study
Plan should take into account the HLOs and corresponding measures developed during
BCD as well as the results of the initial BPR, as these will provide valuable input to how
each alternative will be evaluated.
The Study Guidance and Study Plan, along with the approved Problem Statement, will
be reviewed by the MDA at the MDD. Based on the ROM:
    •    Functional sponsors of MAIS-level initiatives will submit the Study Plan through
         the IRB Chair to the MDA.
    •    Functional sponsors of below MAIS-level initiatives will submit the Study Plan
         through appropriate governance channels at the Component level.
The BCD Phase ends with approval of the Problem Statement by the IRB Chair and
submission of the AoA materials to the IRB Chair (or appropriate Component-level
governance forum). If a Problem Statement is solely non-materiel, the BCD Phase ends
when the Problem Statement is approved, since no AoA will be required.
Problem Statement. It is an iterative process that will result in a strategy and plan that
can be executed to field useful capability.
    •    A completed AoA that enables the Functional Sponsor and program manager
         (PM) to recommend a preferred solution for solving the business need;
    •    A well-defined business and technical management approach that describes how
         the effort will achieve its objectives using the preferred solution-set (the
         Business Case is the summary-level document for those functional plans and
         strategies);
    •    A Program Charter defining roles and responsibilities for the potential program;
         and
    •    Certification of funds to proceed through the next BCL phase.
At the Materiel Development Decision (MDD) the Milestone Decision Authority (MDA),
in most cases, will approve entry into the Investment Management (IM) Phase.
However, it is possible that the MDA may specify a different entry point (phase) into
BCL if the technology supporting the materiel solution has been demonstrated and is
well-understood, and the potential program is defined well enough to begin in a later
phase. If this is the case, the program shall proceed to the designated entry point and
perform the appropriate activities as specified in that phase and the MDD Acquisition
Decision Memorandum (ADM).
During the IM Phase a Program Manager (PM) is typically assigned early on and will
work with the Functional Sponsor to begin managing the materiel portion of IM Phase
activities.
The IM Phase begins with an analysis to describe the requirements for the materiel
solution and an Analysis of Alternatives (AoA) to select a preferred solution. Based on
the preferred solution, the Functional Sponsor and PM will conduct activities necessary
to define a program and develop a well-documented plan to deliver the outcomes
defined in the IRB Chair approved Problem Statement. Planning documents are
developed as appropriate (e.g., systems engineering, test & evaluation) for the program
and are expected to evolve as the program matures; prior to Milestone (MS) A, some
plans are merely strategies to be refined into plans when more facts are known.
The results of IM Phase activities are summarized in a Business Case that provides
decision makers with an overview of the proposed solution, including the acquisition and
contracting approach. The Program Charter, which outlines the managerial methods and
standards for governing the program, is also developed during this phase.
Also, prior to the end of the IM Phase, the PM must schedule an independent risk
assessment approximately 120 days prior to the MS A review; an Enterprise Risk
Assessment Methodology (ERAM) assessment is required for all MAIS DBS prior to a
MS A or B review. The program manager will collaborate with the risk
assessment team to incorporate findings and recommendations into the program's risk
mitigation plan. No additional documentation is created by the program for a risk
assessment, as it is based on existing program documentation. Detailed risk
assessment findings will be provided to the Functional Sponsor, PM, Investment Review
Board (IRB) Chair, and MDA. Summary ERAM findings are presented at the IRB.
The IM Phase ends when the phase activities are complete and summarized in the
Business case, and the PM compiles and submits the MS A acquisition decision
package to the IRB for review and the IRB Chair forwards a MS A recommendation to
the MDA .
plan based on the results of this analysis. These activities are depicted in more detail in
Figure 12.2.3.F1 . The goal of IM Phase activities is to develop an efficient and effective
plan to fulfill the business need documented in the Problem Statement. The results of
IM Phase analysis and activities are summarized in a Business Case and a Program
Charter, the two key documents used by decision-makers throughout the Business
Capability Lifecycle (BCL). The Business Case Template and Program Charter
Template are available on the Office of the Deputy Chief Management Officer (DCMO)'s
BCL webpage .
Conducting a Materiel Solution Analysis enables the Functional Sponsor to describe the
needed requirements to achieve the high-level outcomes (HLOs) and business
outcomes defined in the Problem Statement. Activities completed during the Materiel
Solution Analysis include: conducting an analysis on each of the selected alternatives
per the Analysis of Alternatives (AoA) Study Guidance along with their associated
Doctrine, Organization, Training, Leadership and education, Personnel, Facilities, and
Policy (DOT_LPF-P) impacts and risks; comparing each alternative against how well it
will address the HLOs, business outcomes and their corresponding measures to solve
the business need; selecting a preferred solution based on criteria outlined in the AoA
Study Guidance and Plan; and, developing and defining program outcomes. These
activities are depicted in further detail in Figure 12.2.3.1.F1 .
The AoA is an analytical study that is intended to compare the business capability,
performance potential, operational effectiveness, cost, and risks of a number of
potential alternative solutions to address the problem identified in the Problem
Statement. Detailed information about conducting an AoA - including how to develop an
AoA Study Plan - can be found in DAG Chapter 3, Section 3.3, "Analysis of
Alternatives" .
                                Figure 12.2.3.1.F2 - Conduct AoA Context
Whereas JCIDS uses the Initial Capabilities Document (ICD) to guide the AoA, the AoA
conducted during the IM Phase utilizes information from the Problem Statement and is
directed by the AoA Study Guidance and Plan. It is critical that the results of the
DOTMLPF-P Impact Assessment conducted during the BCD Phase are leveraged
during the AoA.
During the AoA, the Functional Sponsor will leverage a team to assess each defined
alternative and determine which will best solve the problem. Each alternative must be
evaluated in terms of how well it addresses the HLOs, business outcomes, and
measures in the Problem Statement and how well it fits into the "To-Be" state as defined
by the initial Business Process Re-engineering (BPR). Any potential solution must also
have the ability to become Business Enterprise Architecture (BEA)-compliant. A cost
analysis (total life-cycle or total ownership, as appropriate) and a cost-effectiveness
analysis are conducted on each alternative, in addition to market research. The summary
of these results is provided in the Business Case. The summary must include, at a
minimum:
 TIP: The HLOs and associated measures developed during the BCD phase should
 have been written to be independent of a particular solution (i.e., solution agnostic).
 Avoid the following scenario, found during a GAO audit: DoD and service
 officials responsible for conducting AoAs indicated that often proposed
 capability requirements are so specific that they effectively eliminate all but the
 service sponsor's preferred concepts instead of considering other alternatives ( GAO-
 09-655, September 2009 ).
    •    Inputs. AoA Study Plan (based on the approved Problem Statement and AoA
         Study Guidance), initial BPR results, HLOs and business outcomes, and
         corresponding measures.
    •    Process. The Functional Sponsor coordinates an AoA study team/working group
         and assesses each alternative using the approved AoA Study Guidance and AoA
         Study Plan. Together, the Functional Sponsor and team will, at a minimum,
         conduct market research, perform cost analysis and provide a summary of the
         alternatives (and the preferred solution) in the Business Case.
    •    Outputs. Solution options (AoA results), AoA summary documented in the
         Business Case.
Once all alternatives have been analyzed according to the AoA Study Plan, the
Functional Sponsor selects the best-value solution for solving the problem as defined in
the Problem Statement, considering cost, fit for providing the desired business
capability, performance, support, and other factors. The selection process takes into
consideration impacts of potential tradeoffs, and the principles of Better Buying Power .
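A best-value selection weighing cost alongside the other factors might be sketched as a simple weighted-scoring comparison. The weights, the 1-5 scores, and the alternative names below are invented for illustration and do not reflect any actual evaluation:

```python
# Hypothetical weighted-scoring sketch: cost is one evaluation factor among
# several, not the sole discriminator. All numbers are illustrative.
weights = {"capability_fit": 0.35, "performance": 0.25, "cost": 0.25, "risk": 0.15}

alternatives = {
    "COTS (Oracle Financials)": {"capability_fit": 4, "performance": 4, "cost": 4, "risk": 3},
    "Hybrid (COTS + custom)":   {"capability_fit": 5, "performance": 4, "cost": 2, "risk": 2},
    "Full custom development":  {"capability_fit": 5, "performance": 3, "cost": 1, "risk": 1},
}

def weighted_score(scores: dict) -> float:
    """Sum of factor scores (1-5 scale) multiplied by their weights."""
    return sum(weights[factor] * s for factor, s in scores.items())

best = max(alternatives, key=lambda name: weighted_score(alternatives[name]))
print(best)  # → COTS (Oracle Financials)
```

With these notional numbers the COTS alternative wins despite not topping capability fit, which mirrors the point that "best value" balances cost, performance, risk, and user needs rather than optimizing any single factor.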
                    Figure 12.2.3.1.F3 - Preferred Solution Selection Context
  TIP: The "best value" alternative does not always mean the least expensive.
  According to DoD ESI Best Value Toolkit , "best value" is defined as "the expected
  outcome of an acquisition that, in the Government's estimation, provides the
  greatest overall benefit in response to the requirement." Thus, selecting a preferred
  solution should take into account factors other than cost alone, such as performance
  or time, and most fundamentally, meeting the needs of the user.
The following is a summary of the Select Preferred Solution activity along with an
example:
    1. The Functional Sponsor and technical team review the following AoA results
       presented in a table format:
Solution Option 2: Hybrid solution (i.e., Oracle FM software and custom development)
    3. After careful consideration using best-value determination, the team selects the
       COTS alternative (Oracle Financials software) as their preferred solution. The
       team determined that it is feasible that the policy can be changed (added risk),
       and adopting the commercial business processes will help deliver the desired
       outcomes but will require additional training.
After a solution has been selected as a result of the AoA, the Functional Sponsor along
with the Program Manager performs solution-specific BPR. This activity includes
updating the "To-Be" process based on the business process inherent to the solution in
addition to defining and prioritizing program outcomes based on the decomposition of
HLOs and business outcomes first defined during the BCD Phase. The connected top-
down framework of outcomes from a strategic to a more detailed level ensures
continuity between the HLOs and program outcomes, and provides the basis for
developing more specific system-level requirements to be tested against during
Execution.
Program outcomes defined during the IM Phase should be specific enough to allow the
association of functional requirements and non-functional requirements (i.e.,
DOT_LPF-P) during and beyond the Prototyping Phase. For example, if a program outcome
identified during IM is: "Administration capability for role-based authorization", then an
associated functional requirement may be: "The system shall enable a user with the role
of System Administrator to assign one or more roles to a user of the system".
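The functional requirement quoted above could be realized, in a deliberately minimal sketch, as follows; the class, role names, and permission model are assumptions for illustration only:

```python
class AccessControl:
    """Minimal sketch of role-based administration: only a System
    Administrator may assign one or more roles to another user."""

    def __init__(self):
        self.user_roles: dict[str, set[str]] = {}

    def assign_roles(self, actor: str, user: str, roles: set[str]) -> None:
        # Enforce the requirement: the acting user must hold the
        # System Administrator role to assign roles.
        if "SystemAdministrator" not in self.user_roles.get(actor, set()):
            raise PermissionError("only a System Administrator may assign roles")
        self.user_roles.setdefault(user, set()).update(roles)

ac = AccessControl()
ac.user_roles["alice"] = {"SystemAdministrator"}
ac.assign_roles("alice", "bob", {"Approver", "Auditor"})
print(sorted(ac.user_roles["bob"]))  # → ['Approver', 'Auditor']
```

Note how the program outcome ("administration capability for role-based authorization") remains solution-agnostic while the requirement, and any code that implements it, becomes specific enough to test against during Execution.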
The following is a summary of the Define Program Outcomes activity along with an
example:
    •    Inputs. Preferred solution, initial BPR results, HLOs and business outcomes and
         their corresponding measures, benefits, risks, assumptions, constraints, and
         dependencies.
    •    Process. The Functional Sponsor decomposes the HLOs and business
         outcomes into more specific program outcomes (e.g., what specific functions the
         potential program will perform) based on the initial BPR results, the preferred
         solution, and any previous requirements tradeoffs. Up to now, the business
         outcomes/Capabilities have been driven by the BEA-defined end-to-end (E2E)
         business flows, business processes, and capabilities. Now that a preferred
         solution has been selected the "To-Be" business process may have to be revised
         to accommodate the preferred solution and any tradeoffs made during the
          analysis. If the updated business process ("To-Be") causes gaps between it and
          the BEA, a determination will have to be made regarding issuing a waiver or filling
          the gap in the BEA for its next release. Program outcomes must also have
          associated measures, benefits, risks, assumptions, constraints, and
         dependencies. A summary of this information is then documented in the
         Business Case as appropriate.
    •    Outputs. Program outcomes and their measures, benefits, risks, assumptions,
         constraints, and dependencies; and, the updated business process.
    1. The Functional Sponsor determined the program outcome that aligns to the
       HLOs and business outcome(s):
HLO: Accurate, useful, reliable and timely financial data and management information
Program Outcome Definition: Ensure financial controls and internal controls are
embedded in the financial solution to prevent material weaknesses, and ensure
budgetary integrity by establishing financial control over funds, obligations, assets, and
liabilities.
HLO: Accurate, useful, reliable and timely financial data and management information
trace activities to original documents and verify account balances.
Once the Materiel Solution Analysis is complete, the program manager and Functional
Sponsor must define and describe the potential program in preparation for future
reviews and decisions by the Investment Review Board (IRB) and / or Milestone
Decision Authority (MDA) prior to and during program execution. This includes: defining
a properly scoped Concept of Operations with assumptions; updating the Doctrine,
Organization, Training, Materiel, Leadership and education, Personnel, Facilities and
Policy (DOTMLPF-P) Assessment; identifying any additional risks and developing a risk
mitigation plan; identifying Critical Success Factors (CSFs); conducting a financial and
sensitivity analysis; developing a funding profile and a capability delivery schedule; and,
preparing the necessary information to be summarized in the Business Case and
Program Charter. The order of activities conducted during Investment Management (IM)
is based on what makes sense for a particular program; in fact, many of the activities
may be conducted simultaneously. It's important to note that during Program Definition
various program-level documents will start to be developed to capture key information
that will inform program planning. These activities are depicted in Figure 12.2.3.2.F1 .
                              Figure 12.2.3.2.F2 - Define CONOPS Context
  TIP: Remember that the CONOPS provides decision makers with a general
  overview of the potential program based on the preferred solution. While creating
  the CONOPS, it may be helpful to consider what, as a decision maker, you would
  like to see presented that would give you the best overall picture of how the potential
  program will be structured.
The following is a summary of the Define CONOPS activity along with an example OV-1
depiction from program documents:
CONOPS Example. An example of an OV-1 diagram, shown in Figure 12.2.3.2.F3 , is
from the Defense Agencies Initiative (DAI) Program:
The purpose of this analysis is to understand the effects on any DOTMLPF-P elements
now that the Functional Sponsor has selected the preferred materiel solution. The
results may differ from the "To-Be" DOTMLPF-P assessment performed during the
Business Capability Definition (BCD) Phase, particularly if a COTS product is chosen
and the "To-Be" process needs to be changed to minimize customization. In summary,
the process is meant to identify which non-materiel elements must be addressed to
deliver the capability as intended. The Functional Sponsor is responsible and
accountable for implementing non-materiel components of the solution.
                 Figure 12.2.3.2.F4 - DOTMLPF-P Impact Assessment Context
The Functional Sponsor and PM lead the effort to determine which changes, if any,
need to be made to the previous "To-Be" assessment performed in the BCD Phase.
This DOTMLPF-P Impact Assessment is based on emerging information resulting from
the preferred solution, updated Business Process Re-engineering (BPR), and the
program outcomes and corresponding measures defined in the previous activity.
The following is a summary of the DOTMLPF-P Impact Assessment activity along with
an example of the summary output that may appear in the Business Case ( Note : more
detailed information does not need to appear in the Business Case, and may be kept as
"working papers"):
Assessment is based on using a commercial off-the-shelf (COTS) package per the
revised "To-Be" business process, as summarized in the Business Case:
Eliminating all risk is not feasible. Identifying risks early and continuously throughout the
lifecycle and developing a plan to mitigate them is part of successful program
management. Most business executives ask the following questions: what problem are
we trying to solve; what's the benefit; how much will it cost; and what are the risks?
Having an effective risk mitigation strategy will go a long way towards gaining buy-in
from senior leadership and provide the program manager with information to plan for
risk management.
                 Figure 12.2.3.2.F5 - Define Risks and Risk Mitigation Context
For information on identifying, mitigating and tracking risk, refer to sections 3 and 5
respectively of the Risk Management Guide for DoD Acquisition, August 2006 .
  TIP: Generally, the Risk Management Guide refers to outputs from activities
  occurring later in the process (i.e., developing a work breakdown structure (WBS),
  earned value management (EVM), testing); however, based on lessons-learned, the
  chances of success are dramatically increased by effective risk mitigation. Therefore,
  BCL encourages risk identification and mitigation early and continuously throughout
  the life of the program.
The following is a summary of the Defining Risks and Risk Mitigation activity along with
an example:
         register or risk log at the program execution level, and an individual is assigned
         responsibility for each risk. For additional information on risk management, see
         the Risk Management Guide for DoD Acquisition, August 2006 .
    •    Outputs. Prioritized Risks, Risk Mitigation Strategy.
A risk register or risk log is a repository of identified risks for the program. For each risk,
additional information is included such as the risk's probability of occurrence, impact of
occurrence, planned counter-measures (or risk mitigation), risk owner, and other
pertinent information. An example of a risk register or risk log is shown in Table
12.2.3.2.T2 .
Table 12.2.3.2.T2 - Example of the Output of Risk and Risk Mitigation Activity
   Risk: System interface from a required feeder system is not ready on-time for
   implementation.
   Probability of Occurrence: High
   Impact of Occurrence: High - may cause delay or additional resources
   Risk Mitigation:
    •    Sign an MOU/MOA with the PM of each system interface owner.
    •    Include technical reps in design discussions and publish a formal interface
         design specification.
    •    Develop a contingency plan in case a suitable workaround becomes necessary.
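The risk register structure above can be sketched as a simple data model. The field names, the 1/3/5 scoring scale, and the second (hypothetical) risk entry below are illustrative assumptions, not a BCL-prescribed format:

```python
# Illustrative risk register sketch. The 1/3/5 scoring scale and the
# second risk entry are assumptions for demonstration only.
SCALE = {"Low": 1, "Moderate": 3, "High": 5}

risk_register = [
    {
        "risk": "System interface from a required feeder system is not "
                "ready on-time for implementation.",
        "probability": "High",
        "impact": "High",
        "owner": "PM",
        "mitigation": "Sign an MOU/MOA with the PM of each system "
                      "interface owner.",
    },
    {   # Hypothetical second entry, added to show prioritization.
        "risk": "Key functional SMEs unavailable during requirements "
                "definition.",
        "probability": "Moderate",
        "impact": "High",
        "owner": "Functional Sponsor",
        "mitigation": "Negotiate dedicated SME support agreements.",
    },
]

def exposure(entry):
    """Score a risk as probability x impact for prioritization."""
    return SCALE[entry["probability"]] * SCALE[entry["impact"]]

# Highest-exposure risks first, mirroring the "Prioritized Risks" output.
prioritized = sorted(risk_register, key=exposure, reverse=True)
for entry in prioritized:
    print(exposure(entry), entry["risk"][:40])
```

Sorting by a probability-times-impact score is one common convention; programs may instead use the risk matrix in the Risk Management Guide.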
Critical Success Factors (CSFs) inform stakeholders of those elements that are deemed
must-haves for the potential program to succeed and identify the factors that
stakeholders agree must be implemented to achieve Initial Operational Capability (IOC).
The advantage of identifying CSFs is that they are simple to understand and they help
focus attention on major concerns. This will influence requirements tradeoff analysis
conducted during the Prototyping Phase as the Functional Sponsor and program
manager define and scope each increment of capability delivery.
                                Figure 12.2.3.2.F6 - Identify CSFs Context
The following is a summary of the Identify CSFs activity along with an example list of
CSFs summarized in the Business Case:
    •    Inputs. High impact risks and risk mitigation strategy, preferred solution, program
         outcomes and DOTMLPF-P impacts from the "To-Be" assessment.
    •    Process. From the inputs listed above, the program manager (PM), Functional
         Sponsor and SMEs from the functional and technical teams develop a list of
         prioritized CSFs. CSFs are the elements deemed as must-haves for the program
         to achieve the desired outcomes and may include factors outside the program
         manager's control. The CSFs become candidates for subsequent program
         management planning. For example: a CSF is proposed for the use of a
         requirements management tool. The program manager decides the tool is both
         essential and cost effective, so plans are made to acquire, install, and operate
         the tool.
    •    Outputs. List of CSFs summarized in the Business Case.
  TIP: CSFs should not be confused with Key Performance Indicators (KPIs), which
  are measures that quantify management objectives, along with a target or
  threshold, and enable the measurement of strategic performance.
    •    Team knowledgeable in implementing large-scale ERP
    •    User involvement
A financial analysis evaluates the cost and benefit of the proposed program in relation
to the current "as-is" operation in order to define the planned investment and obtain
greater efficiency and productivity in defense spending (i.e., Better Buying Power). For a
MAIS program, an Economic Analysis (EA) is also conducted to evaluate alternatives
for meeting objectives based on the present value of life-cycle costs and financial
benefits.
More detailed guidance for developing and preparing EA and Life-Cycle Cost (LCC)
estimates can be found in DAG Chapter 3, Section 3.6, "Major Automated Information
Systems Economic Analysis" and Section 3.7 "Principles for Life-Cycle Cost (LCC)
Estimates" . The LCC estimate and EA are summarized in the Business Case for the
MS A and MS B reviews.
Below are basic inputs, processes, and outputs of the Conducting a Financial Analysis
activity:
         Materiel Solution Analysis and the CONOPS, they will conduct the following
         activities:
             o Develop the program's Work Breakdown Structure (WBS) for all work
                 necessary to: 1) meet requirements; 2) manage risks; and 3) obtain
                 greater efficiency and productivity in defense spending (i.e., Better Buying
                 Power).
             o Based on the WBS, prepare the LCC Estimate using planned estimating
                 techniques.
             o Estimate the financial benefits based on the qualitative benefits
                 documented in the Business Case.
             o Prepare a high-level, resource-loaded milestone schedule.
             o For MAIS programs, conduct an EA, including a calculation of return on
                 investment (ROI).
    •    Outputs. WBS, Schedule, LCC Estimate, and financial benefits estimate; an EA
         for MAIS programs.
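The WBS roll-up, LCC estimate, and Economic Analysis arithmetic described above can be sketched as follows. The WBS elements, yearly costs and benefits, and the discount rate are illustrative assumptions, not program data:

```python
# Sketch of a WBS cost roll-up into a life-cycle cost (LCC) estimate, plus
# a simple present-value and ROI calculation as in an Economic Analysis.
# All figures and WBS elements below are illustrative assumptions.
wbs = {
    "1.1 Program Management": [2.0, 2.0, 1.5, 1.5, 1.5],  # $M per year
    "1.2 COTS Licenses":      [4.0, 1.0, 1.0, 1.0, 1.0],
    "1.3 Integration":        [6.0, 3.0, 0.5, 0.0, 0.0],
    "1.4 Training":           [0.5, 1.0, 0.5, 0.25, 0.25],
}
benefits = [0.0, 5.0, 12.0, 12.0, 12.0]  # $M/yr, estimated financial benefits
rate = 0.03                              # assumed real discount rate

# Life-cycle cost per year is the sum across WBS elements.
lcc_by_year = [sum(costs) for costs in zip(*wbs.values())]

def present_value(cash_flows, rate):
    """Discount a list of yearly cash flows back to year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

pv_cost = present_value(lcc_by_year, rate)
pv_benefit = present_value(benefits, rate)
roi = (pv_benefit - pv_cost) / pv_cost   # return on investment
```

Actual EA structure, discount rates, and benefit categories are governed by the guidance in DAG Chapter 3, Sections 3.6 and 3.7.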
                   Figure 12.2.3.2.F8 - Conduct Sensitivity Analysis Context
The result may be refinement of time, cost and performance boundaries of the program
and a corresponding increase in the degree of confidence in the LCC Estimate. This is
the initial step in which the PM defines what can be done in the first increment and
what must be done in follow-on increments.
More information and guidance can be found in DAG Chapter 3, Section 3.7.2.4,
"Assess Risk and Sensitivity" .
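A minimal one-way sensitivity sketch, under the assumption that total cost is the sum of a few independent cost drivers (the driver names, values, and swing percentage are illustrative):

```python
# One-way sensitivity sketch: vary one cost driver at a time and observe
# the effect on the total estimate. Drivers and ranges are illustrative.
baseline = {
    "software_licenses": 5.0,   # $M
    "integration_labor": 8.0,
    "sustainment": 4.0,
}

def total_cost(drivers):
    return sum(drivers.values())

def one_way_sensitivity(drivers, swing=0.20):
    """Return the total-cost range when each driver varies by +/- swing."""
    results = {}
    for name in drivers:
        low = dict(drivers)
        low[name] = drivers[name] * (1 - swing)
        high = dict(drivers)
        high[name] = drivers[name] * (1 + swing)
        results[name] = (total_cost(low), total_cost(high))
    return results

ranges = one_way_sensitivity(baseline)
# Drivers with the widest range dominate the uncertainty in the estimate
# and are candidates for refined time, cost, and performance boundaries.
```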
Define Funding Profile.
Once the Financial and Sensitivity Analyses are complete, and in conjunction with
resource management activities of program management, the program manager will
prepare a Funding Profile that documents the proposed overall strategy for funding the
program. Defining a Funding Profile is essential for ensuring program stability over its
planned lifecycle and for providing a disciplined approach for program managers to
execute their programs within cost and available funding. The Functional Sponsor is
ultimately responsible for ensuring that funding is identified and obtained.
The Define Funding Profile activity includes considerations from every other activity
conducted during the Investment Management (IM) Phase. During the Define Funding
Profile activity, the Functional Sponsor and program manager review the Business
Case and determine whether the Funding Profile adequately supports the program being
planned. This should include a review of:
    •    Requirements (i.e., the planned high-level outcomes (HLOs) and business and
         program outcomes) to ensure the program is funded in order to meet those
         requirements;
    •    Planned deliverables of the program to ensure they are adequately funded - for
         example preparing and implementing the Test Plan; and
    •    Other aspects of the program to verify adequate funding, such as: the Acquisition
         Approach, potential risks as a result of the Sensitivity Analysis, and the Financial
         Risks considered as a part of the Risk Management process. This includes
         verification of adequate funding for the costs associated with planned risk
         mitigation activities.
  TIP: Generally, not all risks are avoidable, so Funding Profile development should
  also include verification that planned costs are reasonable in order to manage the
  issues that result from risks that materialize.
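The funding-adequacy review described above amounts to comparing the proposed profile against the estimated cost in each fiscal year. A minimal sketch, with illustrative figures:

```python
# Sketch of a funding-adequacy check: compare the proposed funding profile
# to the estimated cost per fiscal year. All figures are illustrative.
estimate = {"FY1": 12.5, "FY2": 7.0, "FY3": 3.5}   # $M, from the LCC estimate
funding  = {"FY1": 13.0, "FY2": 6.5, "FY3": 4.0}   # $M, proposed profile

def funding_shortfalls(estimate, funding):
    """Return fiscal years where planned funding falls short of cost."""
    return {fy: estimate[fy] - funding.get(fy, 0.0)
            for fy in estimate
            if funding.get(fy, 0.0) < estimate[fy]}

shortfalls = funding_shortfalls(estimate, funding)
# Here FY2 is underfunded by $0.5M; the Functional Sponsor would need to
# resolve it before the Funding Profile can be said to support the program.
```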
The following is a summary of the Define a Funding Profile activity along with an
example Funding Chart:
Funding Chart Example . Example of the latest Funding Chart used at a Defense
Acquisition Board (DAB) . (**Note: Requires login with password or Common
Access Card)
                            Figure 12.2.3.2.F10 - Example Funding Profile
               Figure 12.2.3.2.F11 - Plan Program Capability Delivery Context
    •    Inputs. Life-Cycle Cost Estimate, estimated schedule and performance from the
         Financial Benefit Analysis.
    •    Process. The program manager and Functional Sponsor review the cost,
         schedule, and performance planned for the potential program during the Program
         Definition section of the IM Phase (i.e., the Financial Benefit Analysis). They plan
         the program capability delivery approach based on prioritized requirements.
         Increments are defined to support the program capability delivery approach. The
         Program Capability Delivery Plan can be best depicted in a graphic summarized
         in the Business Case.
    •    Outputs. Capability Delivery Plan summarized in the Business Case.
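The increment-definition step in the process above can be sketched as a priority-ordered allocation of requirements under a per-increment cost ceiling. The requirements, priorities, costs, and ceiling below are illustrative assumptions:

```python
# Sketch: allocate prioritized requirements to delivery increments under a
# per-increment cost ceiling. Requirements and figures are illustrative.
requirements = [                      # (name, priority, est. cost $M)
    ("Core GL posting", 1, 6.0),
    ("Vendor pay interface", 2, 4.0),
    ("Standard reports", 3, 3.0),
    ("Ad hoc analytics", 4, 5.0),
]

def plan_increments(reqs, ceiling):
    """Fill each increment in priority order up to the cost ceiling."""
    increments, current, spent = [], [], 0.0
    for name, _prio, cost in sorted(reqs, key=lambda r: r[1]):
        if spent + cost > ceiling and current:
            increments.append(current)      # close the full increment
            current, spent = [], 0.0
        current.append(name)
        spent += cost
    if current:
        increments.append(current)
    return increments

plan = plan_increments(requirements, ceiling=10.0)
# Increment 1 delivers the highest-priority capability first; later
# increments carry lower-priority requirements, mirroring the tradeoff
# analysis between the Functional Sponsor and program manager.
```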
Program Planning activities begin after the Materiel Solution Analysis has been
completed and in conjunction with Program Definition activities. They include, but are not
limited to, developing plans for the following: test and evaluation, systems engineering,
lifecycle sustainment, data migration and management, information assurance (IA), and
interface design and management. This activity is demonstrated in Figure 12.2.3.3.F1 .
(template or document) that is the critical factor, not the creation and completion of a
document. In general, the Business Case integrates information requirements from
traditional stand-alone templates to provide a summary-level integrated view of the
program for decision making, reducing coordination time at the OSD-level. It is not
expected or reasonable, however, that the Business Case will suffice as the only
program documentation.
The Acquisition Approach, the components of which are described in more detail in the
following paragraphs, is the main output of the Conduct Program Planning activity.
to incentivize contractor productivity and innovation.
Market research conducted in the Business Capability Definition (BCD) phase and
Investment Management (IM) Phase helps: establish an understanding of commercially
available solutions; identify potential suppliers; identify small business capabilities; and,
initiate development of strategies to promote competitive best value acquisitions. The
results of completed market research and plans for future market research are
summarized in the Acquisition Approach of the Business Case for review at MS A.
Market research, tailored to program needs, should continue throughout the acquisition
process and into Operations and Support (O&S). Use of COTS products requires
periodic monitoring of the commercial marketplace through market research activities
and alignment, when appropriate, of affected business and technical processes. This
may impose additional cost, schedule, and performance risks that need to be assessed
in the program's risk management plan (RMP).
    •    Systems Engineering Planning. Effective systems engineering planning is
         essential to the success of a program. One important measurement of that
         success is a Technical Performance Measure (TPM) for Reliability, so that
         should be included as one of the critical success factors of the Business Case .
         The summary of systems engineering planning in the Business Case is
         approved, at MS A, Pre-ED, and MS C, by the Director, Systems Engineering .
Defined content for the summary of systems engineering planning includes architecture
and interface definition and management.
Summarize the Technical Data Rights Strategy for meeting data rights requirements
and supporting the overall competition strategy, including:
    •    Analysis of the data required to prototype, develop, deploy and sustain the
         system (e.g., conceptual data model (DIV-1), logical data model (DIV-2));
    •    Approach to provide for rights, access, or delivery of technical data the
         government requires for the system's total life-cycle sustainment;
    •    Approach for using open systems architectures and acquiring technical data
         rights;
    •    Approach to including a priced contract option for the future delivery of technical
         data and intellectual property rights not acquired upon initial contract award; and
    •    Analysis of the risk that the contractor may assert limitations on the government's
         use and release of data.
For additional information on this subject, refer to DAG Chapter 2, Section 2.2.14 "Data
Management Strategy and Technical Data Rights" ; and DAG Chapter 5, Section
5.1.6.3, "Contracting for Technical Data" and 5.1.6.4, "Data Management Strategy" .
During the IM phase, the Test Plan establishes test and evaluation from a strategic
level, for government and contractors, in support of the MS A review. It supports the
overall program resource requirements, schedule, and performance requirements
planned in the Business Case. It identifies the approach for integrated Developmental
Test and Evaluation (DT&E) and Operational Test and Evaluation (OT&E). It is guided
by the testing roles, responsibilities, standards and methods to be specified in the
Program Charter.
After the IM Phase, the Test Plan matures during Prototyping from a strategy to a plan
in support of the review at MS B. This transformation includes the identification of
evaluation criteria for testers and more detailed documentation at the program level, and
ensures that the test plan is useful, executable, and outcome-based.
12.2.3.4. Complete Business Case and Program Charter
Business Case. The business case is the one location where all relevant facts are
documented and linked together into a cohesive story. It is an executive-level document
used by decision-makers for investment and acquisition decisions.
The Functional Sponsor and the program manager summarize the results of the IM
Phase activities in the Business Case and document the managerial methods and
standards for executing the potential program in the Program Charter. This
summarization should provide decision makers with the essential information about the
potential program to make an informed decision supporting a MS A review.
            Figure 12.2.3.4.F1 - Complete Business Case and Program Charter
While developing the Business Case, the program manager and the Functional Sponsor
should ensure that its content presents the required information at a level appropriately
suited for executive-level decision makers, keeping in mind that detailed working papers
or detailed analysis should be kept at the program level and available if requested.
Before submitting the Business Case for formal review, the program manager and
Functional Sponsor may consider the following:
    •    Is the Executive Summary clear, concise and focused on the problem and its
         solution?
    •    Does the Business Case contain the appropriate level of relevant information to
         enable a decision and inform decision makers?
    •    Are the value and the risks inherent in the proposed program clearly explained?
    •    Is the Functional Sponsor with the capability and authority to deliver the benefits
         fully committed to the investment?
    •    Can all HLOs be quantified so their achievement can be tracked?
    •    Is the Business Case tailored to the size and risk of the proposed solution?
    •    Does the Business Case focus on the business capabilities and impact, rather
         than on technical aspects? (Remember, the Business Case is not a technical
         proposal.);
    •    Does the Business Case contain clearly relevant and logical contents which are
         simple to understand?
    •    Does the Business Case justify critical elements in a transparent manner?
    •    Is there clear accountability for and commitment to the delivery of the capability
         and the management of costs?
The following is a summary of the Business Case portion of the Complete Business
Case and Program Charter activity:
  TIP: Before deeming the Business Case complete, the Functional Sponsor should
  review the Problem Statement to ensure that any discovery during IM does not
  significantly affect the approved Problem Statement. If significant changes are
  evident (such as, the fundamental problem has changed or the scope has
  broadened), it will warrant re-approval by the IRB Chair. The IRB Support staff
  should be consulted in this circumstance to determine the appropriate way ahead.
Program Charter. The purpose of a Program Charter is to define the manner in which
the program will be managed and the governance surrounding the program. It is an
agreement between the program team (including contractors), Functional Sponsor and
PM and identifies the roles and responsibilities and assigns accountability to each of
these individuals or groups. It also contains references to: detailed project plans such as
schedules; work breakdown structure; complete risk assessments; etc.
The Program Charter describes the managerial methods and standards for the program
and is a tool that helps enable the outcomes described in the Business Case. It is first
developed as part of program management planning activity that spans the IM Phase,
though it is updated in later phases of BCL as the program matures.
BCL does not prescribe the tools and techniques for performing program management,
but does require the preparation of the Program Charter for review and approval by the
Component Acquisition Executive (CAE) and for inclusion in the MS A package.
         increments within the program; program and increment initiation, closure, and
         decision documentation.
    •    Scope Management. Aligns the activities that identify the deliverables and
         establish the relationship between product scope and program scope, while
         setting standards for clear achievable objectives to the Business Case,
         establishes change management, plans for delivery of program benefits, defines
         the program deliverables, and the approach to requirements management.
    •    Schedule Management. Provides the plan for schedule tracking, controlling and
         performance reporting. Applicable earned value management is described, if
         utilized. A summary schedule of major deliverables and events is provided.
    •    Risk Management. Implements the program's risk management plan which
         defines the approach to risk identification, analysis, and mitigation, in addition to
         the process for conducting risk reviews and how to escalate risks.
    •    Procurement Management. Refers to the acquisition approach (Business Case)
         and outlines planning for managing acquisition and procurement activities.
    •    Financial Management. Establishes a plan for developing and managing
         program costs; budgeting; and defining the monitoring, forecasting, change
         controls and performance reporting. Applicable earned value management is
         described, if utilized.
    •    Stakeholder Management. Defines the plan for stakeholder identification,
         analysis, and management, as well as relationship management.
    •    Communications Management. Based on stakeholder management
         information, outlines a communications management plan, to include format,
         content, frequency, approval, recipients, and distribution.
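The earned value management referenced under Schedule Management and Financial Management rests on a few standard indices, which can be sketched as follows. The dollar figures are illustrative assumptions:

```python
# Earned value sketch: the standard cost and schedule performance indices.
# All dollar figures below are illustrative.
pv = 10.0   # planned value: budgeted cost of work scheduled ($M)
ev = 8.0    # earned value: budgeted cost of work performed ($M)
ac = 9.0    # actual cost of work performed ($M)

cpi = ev / ac   # cost performance index; < 1.0 means over cost
spi = ev / pv   # schedule performance index; < 1.0 means behind schedule

bac = 40.0       # budget at completion ($M)
eac = bac / cpi  # a common estimate-at-completion formula
```

Here CPI below 1.0 signals cost overrun and SPI below 1.0 signals schedule slip; the EAC formula shown is one common variant among several.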
The following is a summary of the Program Charter portion of the Complete Business
Case and Program Charter activity:
12.2.3.5. MS A Preparation
When the Functional Sponsor determines that the proposed investment has reached a
level of detail sufficient for a MS A review, the program manager compiles a MS A
Package that includes the following documents:
    •    A Business Case;
    •    A Program Charter;
    •    The Defense Business Systems Management Committee (DBSMC) Chair
         approval memorandum to obligate funds;
    •    The Component Acquisition Executive (CAE) Memorandum (Compliance and
         Recommendation), for MAIS; and
    •    Any additional requirements as outlined in previous acquisition decision
         memorandums (ADMs).
  TIP: Prior to submitting the Business Case as part of the MS A package, the
  program manager and Functional Sponsor should verify whether the following
  aspects have been adequately addressed:
        1. The investment has value to the enterprise and aligns with enterprise
           priorities;
        2. There is proper management by and support from senior officials for the
           proposed solution;
        3. The scope for the proposed solution has been adequately defined and
           measures for desired outcomes have been appropriately defined;
        4. There is clear evidence that Business Process Re-engineering (BPR) has
           been or is being conducted;
        5. There is clear evidence that the component has the ability to deliver the
           benefits of the proposed solution within the timelines specified; and
        6. There is clear evidence that dedicated resources are working on the highest
           value opportunities.
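The six TIP checks above can be treated as a simple pre-submission checklist. The sketch below is purely illustrative; the check names paraphrase the TIP items, and the data structure is an assumption, not a prescribed format.

```python
# Illustrative pre-submission checklist for the Business Case, paraphrasing
# the six TIP items; names and structure are hypothetical.

MS_A_CHECKS = [
    "investment_has_enterprise_value",
    "senior_official_support",
    "scope_and_outcome_measures_defined",
    "bpr_conducted_or_underway",
    "benefits_deliverable_within_timelines",
    "resources_on_highest_value_opportunities",
]

def ms_a_ready(status: dict) -> list:
    """Return the checks not yet adequately addressed (empty list = ready)."""
    return [c for c in MS_A_CHECKS if not status.get(c, False)]

status = {c: True for c in MS_A_CHECKS}
status["bpr_conducted_or_underway"] = False  # hypothetical open item
print(ms_a_ready(status))  # ['bpr_conducted_or_underway']
```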
For non-MAIS efforts, MS A materials are submitted for review and approval in
accordance with Component processes and procedures.
12.3. Execution
Execution is the third segment of BCL and comprises the following Phases: Prototyping,
Engineering Development, Limited Fielding, Full Deployment, and Operations & Support
(O&S).
The purpose of Execution is to design, develop, test, deploy, and sustain each
increment of capability (materiel and non-materiel solution) by satisfying the specific
outcomes defined in the Business Case.
    •    Determine the most cost-effective technical and design approach that will satisfy
         user capability requirements; and
    •    Conduct risk-reduction activities by: identifying use cases that determine the
         specific capabilities to be developed during the increment; the technologies to be
         used; and the approximate schedule for deploying the materiel solution.
  TIP: Knowledge gained during this phase may result in: changes to the order in
  which required capabilities are satisfied; technology trades; and/or movement of
  requirements to follow-on increments.
The outputs and outcomes of the Prototyping Phase are:
  Figure 12.3.2.2.F2 - Prototyping Phase High Level Process Flow (Increment 2-n)
The Prototyping Phase begins when the MDA approves entry into the phase after
conducting either a MS A review (for the initial DBS increment) or an Authorization to
Proceed (ATP) review (for DBS increments 2-n). In other words, there is one MS A for
each DBS and an ATP for each follow-on increment. Note: multiple increments can be
executed concurrently. The ATP is the starting point for obligation of funds for the
increment.
During the Prototyping Phase, the Functional Sponsor and PM will perform the
necessary activities to install and configure the solution in a relevant environment,
perform detailed design and requirements trade-offs, summarize updated plans in the
Business Case, and develop a draft RFP for a Pre-Engineering Development (Pre-ED)
Review. After the Pre-ED Review a draft APB will be developed for the increment and
submitted for the MS B review. During this phase, the PM leads all activities pertaining
to the materiel portion of the solution while the Functional Sponsor leads activities for
the non-materiel or DOT_LPF-P portion of the solution, as outlined in the Program
Charter. An independent risk assessment (Enterprise Risk Assessment Methodology
(ERAM) for MAIS) will also need to be conducted and a risk mitigation plan developed
prior to MS B. The PM and the Functional Sponsor must work collectively to ensure the
DOTMLPF-P requirements are integrated and will deliver a holistic solution.
Prototyping Phase activities end for each increment after the Investment Review Board
(IRB) (or Component equivalent) has reviewed the necessary MS B information and the
IRB Chair has sent a milestone recommendation to the MDA.
12.3.2.3. Prototyping Phase Activities
Prototyping Phase activities are conducted in accordance with the approved Business
Case, the MS A ADM, and the solution-specific implementation methodology being
employed. While conducting Prototyping Phase activities, the PM must work in close
collaboration with the Functional Sponsor, functional users, and appropriate Systems
Engineering (SE) and Test and Evaluation (T&E) communities as functional,
organizational, and user-related activities (such as requirements refinement and
implementation of change management / Business Process Re-engineering (BPR) /
DOT_LPF-P considerations) occur in tandem with traditional PM responsibilities.
The PM should collaborate with appropriate communities early on to plan the types
and number of design and T&E reviews necessary to facilitate development and
validation of the outcomes for the capability to be delivered. More information on
conducting test and engineering reviews is located in DAG Chapter 4, Section 4.2,
"Systems Engineering Activities in the System Life Cycle" .
  TIP: It is imperative that the Functional Sponsor and the program manager have a
  collective understanding of their roles and responsibilities (and have documented
  them in the Program Charter) in order to smoothly execute the program going
  forward.
As mentioned previously, Prototyping Phase activities are continuous and will iterate
until the required level of maturity is achieved, prototypes of the system or key system
elements are produced, and there is confidence that the scope defined for the current
increment and the cost, schedule, and performance baselines can be maintained
throughout the remainder of the increment's planned implementation and sustainment
of capability.
Install and Configure Software. This activity involves planning for the installation and
configuration of: infrastructure, as necessary (e.g., servers, utilities, services,
databases); commercial off-the-shelf (COTS) product(s); backup and recovery
procedures; and the initiation of support services necessary to sustain the Technical
Baseline. The Technical Baseline includes both the reference assets (e.g.,
documentation) and technical assets (e.g., software) of the system and acts as a formal
baseline for defining subsequent change.
                  Figure 12.3.2.3.F2 - Install and Configure Software Context
As a result of this activity, the program manager will initiate Configuration Management
(CM) to establish and maintain the integrity of work products and to track and control
changes. More information on CM can be found in DAG Chapter 4, Section 4.3.7,
"Configuration Management".
Perform Gap and Tradeoff Analysis. The purpose of this activity is to: evaluate the
installed software's actual performance against the planned capabilities ("To-Be"
process, outcomes) defined in the Problem Statement and Business Case; determine
the variance between the capabilities and the software's performance, or "gaps"; and
develop alternatives for filling the gaps.
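At its simplest, the gap determination described above is a set comparison between the planned capabilities and what the installed software actually delivers. A minimal illustrative sketch, with hypothetical capability names:

```python
# Illustrative gap determination: planned "To-Be" capabilities from the
# Business Case versus what the installed COTS software delivers.
# Capability names are hypothetical.

planned = {"order_entry", "invoice_matching", "asset_tracking", "audit_trail"}
delivered = {"order_entry", "invoice_matching", "vendor_portal"}

gaps = planned - delivered               # capabilities the software does not meet
out_of_box_extras = delivered - planned  # unplanned out-of-the-box capabilities

print(sorted(gaps))               # ['asset_tracking', 'audit_trail']
print(sorted(out_of_box_extras))  # ['vendor_portal']
```

The `gaps` set drives the alternatives analysis, while `out_of_box_extras` corresponds to the case noted below where the software offers capabilities superior to those originally planned.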
Cost, schedule, and performance may be traded off within the "trade space" between
thresholds and objectives documented in the measurement criteria of the Business
Case.
The tradeoff analysis may also consider potential exchanges in the trade space, such
as re-sequencing capabilities across increments to accelerate high-value capabilities
into the early increments, or continued BPR to modify ways of doing business to more
closely align with suitable processes that already exist in the COTS products.
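A tradeoff stays within the trade space when each measure remains between its threshold and objective values from the Business Case. The measures, bounds, and function below are hypothetical, shown only to make the threshold/objective check concrete:

```python
# Illustrative trade-space check. Measure names and values are hypothetical;
# thresholds and objectives would come from the Business Case measurement
# criteria.

trade_space = {
    # measure: (threshold, objective) as lower/upper acceptable bounds
    "transactions_per_hour": (5_000, 10_000),
    "availability_pct": (98.0, 99.9),
}

def within_trade_space(proposed: dict) -> bool:
    """True if every proposed measure falls between threshold and objective."""
    return all(
        trade_space[m][0] <= v <= trade_space[m][1]
        for m, v in proposed.items()
    )

print(within_trade_space({"transactions_per_hour": 7_500, "availability_pct": 99.0}))  # True
print(within_trade_space({"transactions_per_hour": 4_000, "availability_pct": 99.0}))  # False
```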
The PM and Functional Sponsor engage SMEs, including SE and T&E, who work to
determine the gap(s) between the desired business outcomes defined in the Business
Case and what the software can deliver, and develop an executable plan for filling the
gaps. One goal of this process is to minimize customization of COTS software.
Depending on complexity of the materiel solution, adjustment of the "To-Be" business
process may be necessary in order to accommodate the way the software was
designed to work. Or, it may be discovered that the software has out-of-the-box
capabilities superior to those originally planned. Trade-off decisions typically include
modifying the way work is traditionally conducted (business processes) versus incurring
the cost and risk of customizing the COTS software. This is why BPR is a continuous
refinement effort, in which better ways of conducting business are evaluated against
the inherent business capabilities of the selected COTS products.
The following is a summary of the Perform Gap and Tradeoff Analysis activity:
Develop Detailed Design. The purpose of this activity is to translate planned business
capabilities documented in the Business Case into a design specification. To support
this, the updated Technical Baseline combined with the initial RTM is used to develop a
design to meet the planned business capabilities. As a result, the gaps in the solution
will now be closed through design activities by systems integration and software
engineering SMEs. More information can be found in DAG Chapter 4, Section 4.3.18,
"Systems Engineering Design Considerations" .
                       Figure 12.3.2.3.F4 - Develop Detailed Design Context
The result of these design activities is a detailed design, which will undergo planned
functional and technical reviews (by the PM and Functional Sponsor) to manage the
identification and resolution of design issues.
Update Test, Engineering & Other Plans. The purpose of this activity is to combine the
knowledge gained from preceding activities with the strategies and plans developed
during the Investment Management (IM) Phase, refining that information during the
Prototyping Phase into detailed plans for subsequent activities.
Guided by the analyses conducted during the IM Phase combined with the knowledge
gained from assessment of the current solution-set now available in the updated
Technical Baseline, the program manager and technical teams (e.g., SE, T&E, Systems
Integrators) collaborate on integration and engineering needs. Their plans for design,
development, test and evaluation, and performance measurement will be refined to
yield a comprehensive suite of updated planning information for the remaining phases
of Execution.
The following is a summary of the Update Test, Engineering & Other Plans activity:
    •    Inputs . Strategies developed during the IM Phase (T&E strategy for the Test
         Plan, Acquisition Approach).
    •    Process. Update the strategy-level information developed during the IM Phase
         with knowledge gained from assessment of the Technical Baseline and design
         activities to refine to a plan-level of detail that will support subsequent
         development, test and evaluation, and implementation.
    •    Outputs . Updated plans summarized in the Business Case.
Define Capability Delivery Schedule. Based on the high-level Capability Delivery Plan
in the Business Case (developed during the IM Phase), the program manager and
Functional Sponsor segment business capabilities into manageable increments based
on priority, dependencies, risks, and implementation strategy and prepare a detailed
Capability Delivery Schedule for the next planned increment. Schedule changes will
likely drive changes to cost estimates and should be reviewed for their cost impact.
The Capability Delivery Schedule includes major reviews and milestone events and
depicts releases, etc., as applicable. It is expected that the schedule will be presented
at a high level (summary) in the Business Case, but must be maintained and available
at a detail level (WBS) for program and stakeholder use, since it is critical for effective
program management. Decision-makers may request the detail-level information for
review.
  TIP: In the case of an organization-by-organization release of capability, the PM and
  the users may adjust the capability delivery schedule to accommodate
  functional/user needs or priorities, consistent with the APB. Such adjustments,
  however, should not alter the FDD criteria approved by the MDA and documented in
  the MS B ADM.
    •    Inputs . Capability Delivery Plan from the Business Case (developed during
         the IM Phase) and the detailed design.
    •    Process. Partition business capabilities across multiple increments according to:
         Functional Sponsor priorities, dependencies, implementation strategy, and other
         priorities as necessary. Develop a capability delivery schedule. As needed,
         revise the Work Breakdown Structure (WBS) to indicate consequences of
         allocating business capabilities across multiple Increments. Verify estimated
         labor resource requirements, estimated task durations and dependencies based
         on the new distribution of business capabilities into multiple Increments and
         modify the Capability Delivery Plan as required.
    •    Outputs . Capability Delivery Schedule.
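The partitioning step above (priority, dependencies, capacity) can be sketched as a dependency-aware ordering. The capability names, priorities, and per-increment capacity below are hypothetical; this is one possible approach, not a prescribed method:

```python
# Illustrative partitioning of business capabilities into increments: honor
# dependencies, then fill each increment by Functional Sponsor priority.
# Names, priorities, and capacity are hypothetical.
from graphlib import TopologicalSorter

deps = {  # capability -> set of capabilities it depends on
    "core_ledger": set(),
    "user_admin": set(),
    "reporting": {"core_ledger"},
    "analytics": {"reporting"},
}
priority = {"core_ledger": 1, "user_admin": 2, "reporting": 3, "analytics": 4}
per_increment = 2  # illustrative capacity per increment

ts = TopologicalSorter(deps)
ts.prepare()
schedule, current = [], []
while ts.is_active():
    # Among capabilities whose dependencies are satisfied, take highest priority first.
    for cap in sorted(ts.get_ready(), key=priority.get):
        current.append(cap)
        ts.done(cap)
        if len(current) == per_increment:
            schedule.append(current)
            current = []
if current:
    schedule.append(current)

print(schedule)  # [['core_ledger', 'user_admin'], ['reporting', 'analytics']]
```

Re-running the sketch with revised priorities or dependencies mirrors the text's note that the WBS and Capability Delivery Plan may need revision when capabilities are reallocated across increments.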
Update Business Case and Program Charter. Once Prototyping Phase activities are
complete, the program manager and Functional Sponsor must update the Business
Case and Program Charter as necessary to inform subsequent decision-making.
  TIP: While updating the Business Case, the Functional Sponsor and PM must
  ensure that the materiel solution is still focused on solving the originally identified
  business need (from the Problem Statement). If it is not, it may mean that the
  problem has changed or the right solution has not been chosen.
The following is a summary of the Update Business Case and Program Charter activity:
Develop Draft RFP. The RFP begins the process of establishing government-industry
relationships for acquiring products and services to implement the defined program. The
RFP must give industry a clear understanding of the government's needs, and the
resulting proposals help the government understand industry capabilities in its pursuit
of "best value".
Based on the Acquisition Approach in the Business Case and the Program Charter's
roles, responsibilities and standards for procurement, the program manager initiates the
draft RFP in collaboration with a Contracting Officer.
                            Figure 12.3.2.3.F8 - Develop Draft RFP Context
Contracting specialists provide insight into contract types and language for the RFP,
including such things as data rights and terms and conditions. The content in the RFP
is organized in such a manner to clearly define the scope of products and services for
the Increment and allow the Government to effectively evaluate proposals; the Business
Case and Program Charter are the primary sources for RFP content. The completed
draft RFP becomes part of the Pre-ED review package.
Independent Risk Assessment. No additional documentation needs to be
created by the program office for the assessment, as it is based on existing program
documentation. The PM will work with the assessment team to develop a risk mitigation
strategy once the assessment is complete.
Pre-ED Review. The purpose of the Pre-ED Review is to have the MDA review and
approve the Business Case and authorize the release of the RFP so source selection
can begin while the remaining Prototyping activities are being completed (i.e., APB
development). The goal is to complete all of the activities necessary to award a contract
or task order immediately after MS B is approved.
In preparation for the Pre-ED Review, the PM compiles a Pre-ED Review Package that
includes:
    •    The updated Business Case signed by the Functional Sponsor, PM, Component
         Acquisition Executive (CAE), and appropriate sections signed by Deputy
         Assistant Secretary of Defense, Systems Engineering (DASD(SE)), Deputy
         Assistant Secretary of Defense, Developmental Test and Evaluation
         (DASD(DT&E)) and Director, Operational Test & Evaluation (DOT&E);
            o DBS below the MAIS threshold will have a Business Case approved
                through comparable Component processes and authorities
    •    Component Acquisition Executive (CAE)-approved Program Charter (MAIS only);
    •    A presentation that outlines the elements of the Draft RFP;
    •    A CAE Memorandum (Compliance and Recommendation); and
    •    Any additional information and/or requirements as outlined in previous ADMs.
The package is submitted to the IRB for review and the IRB Chair will provide a
recommendation to the MDA.
The output of the Pre-ED Review is an MDA-approved Business Case and approval to
release the RFP and begin source selection.
MS B Preparation. When the draft APB and independent risk assessment are
complete, the PM compiles a MS B Package that includes the following documents:
The package is submitted to the MDA, who may request an IRB MS B recommendation.
For those acquisitions that are not MAIS or designated "special interest" programs,
information requirements for a MS B review are submitted for review and approval in
accordance with the Component's process and procedures.
12.3.3. Engineering Development
The purpose of the Engineering Development Phase is to ensure that the materiel
solution for the increment has been designed, configured, and developmentally tested in
a manner consistent with the approved Business Case and the Program Charter and
that it is prepared for limited deployment.
12.3.3.2. Engineering Development Phase Process
During the Engineering Development Phase, the program manager (PM) leads the
effort to integrate the solution (increment) in a development / test environment and
partners with the Functional Sponsor / user community to ensure functional
requirements remain in alignment. The Functional Sponsor continues to solve or
implement the Doctrine, Organization, Training, Leadership and education, Personnel,
Facilities, and Policy (DOT_LPF-P) aspects of the solution that will accompany the
materiel portion.
Engineering Development Phase activities include, but are not limited to:
The Engineering Development Phase ends when the Functional Sponsor and PM are
satisfied that the capability is ready for use and request a MS C / Limited Fielding
decision.
MS C Preparation. When the Functional Sponsor and PM have determined that the
solution achieves the capabilities described in the Business Case for the increment and
the acquisition program baseline (APB), a MS C Package is prepared to request
approval to begin Limited Fielding at the MS C decision. Some of the considerations at
this phase include:
    •    User concurrence that the capability satisfies the outcomes specified in the
         Business Case;
    •    Associated DOT_LPF-P capability is deployment-ready;
    •    Performance during developmental and operational testing is acceptable and
         consistent with the Business Case;
    •    Interoperability Certifications, Information Assurance Assessment and
         Authorization (IA A&A), BEA compliance, and any other required certifications
         have been obtained;
    •    Life-cycle support is ready to implement; and
    •    Clinger-Cohen Act (CCA) certification is still valid.
The MS C Package includes:
    •    The Business Case including any updates resulting from Engineering Phase
         activities;
    •    Any additional information / documentation required as directed in previous
         ADMs; and
    •    If necessary:
             o An updated APB;
             o A DBSMC Chair Certification approval memorandum; and
             o An updated Program Charter.
The purpose of the Limited Fielding Phase is to limit risk by having a limited number of
users verify that the capability works in an operational environment and to have the Test
and Evaluation (T&E) community evaluate it against the outcomes (Business Case) and
the Test Plan (Business Case).
The Limited Fielding Phase begins at MS C when the Milestone Decision Authority
(MDA) reviews the updated Business Case and any other required information per
previous acquisition decision memorandum (ADMs) and approves the fielding of the
capability into an operational environment in accordance with the schedule outlined in
the APB. The MDA's decision is documented in the MS C ADM.
During this Phase, the PM continues to manage the materiel effort and will work closely
with users and T&E to resolve issues and execute scheduled initial operational test and
evaluation (IOT&E). The Functional Sponsor continues to solve or implement the non-
materiel aspects of the solution that will accompany full deployment of the materiel
solution. In addition, informed by T&E results and the outcomes defined in the Business
Case, the Functional Sponsor will determine whether to declare IOC. For MAIS
programs, there are specific test events that may be required to occur; these events and
their subsequent processes are discussed in Chapter 9, Section 9.5.8.2 "System
Readiness for IOT&E".
At the end of Limited Fielding, the Functional Sponsor will issue a written declaration of
IOC and notify the Investment Review Board (IRB). The Functional Sponsor will also
determine whether the DOT_LPF-P dependencies have been sufficiently met to
allow the program to proceed to Full Deployment.
During Limited Fielding the PM will verify that the materiel solution will support the
outcomes (i.e., high-level, business, and program) described in the Business Case and
APB for the increment. The operational effectiveness and suitability of the capability is
assessed by engaging appropriate T&E communities and collecting end-user feedback.
The PM will manage fielding the system to the defined users and manage feedback
from users to: identify issues; work with the Functional Sponsor to prioritize issues; and
manage priority issues to closure. The PM and Functional Sponsor will identify lessons
learned for each increment to help plan subsequent increments.
The PM will also limit risk by working with end-users to ensure they are appropriately
trained in using the capability and that issues are identified and addressed expediently.
The Functional Sponsor ensures that DOT_LPF-P aspects of the solution have been
integrated as necessary and appropriate and that users have received a complete,
usable capability.
The Business Case updates may include successive refinement as discovery continues
throughout the program life-cycle. At this point updates should include summaries of the
revised Life-Cycle Sustainment Plan (LCSP) and Test Plan based on actual IOT&E
results.
12.3.5.2. Full Deployment Phase Process
The Full Deployment Phase begins when the Milestone Decision Authority (MDA)
approves the Full Deployment Decision (FDD). Criteria for this decision include:
         system is operationally effective, suitable, and survivable. The final IOT&E report
         is not a requirement for this decision point;
    •    Necessary compliance and certifications for deployment have been obtained /
         achieved;
    •    Lifecycle support is ready to implement; and
    •    Insignificant risk remains associated with the technical aspects of the capability,
         although acceptable fielding risk may remain. The software component of the
         capability should essentially be risk-free. Deployment risks will exist until they are
         addressed in the user's operational environment.
During this phase, the increment of capability that was successfully deployed and tested
during Limited Fielding is deployed to the rest of the user community defined in the
business case. The activities end when Full Deployment (FD) is declared by the
Functional Sponsor and a Close Out Review is scheduled with the Investment Review
Board (IRB).
The Functional Sponsor is responsible for ensuring the non-materiel components of the
DOTMLPF-P solution for the increment have been satisfied and meet the defined
outcomes outlined in the Business Case and the cost, schedule, and performance
parameters in the APB.
The program manager and Functional Sponsor must pay close attention to the
performance measures during fielding, as these should be an indicator of potential
issues. Failure to implement a single aspect of the DOTMLPF-P solution may skew one
or more performance measures. For example, if an organization fails to change an
underlying business process (to align with a capability’s fundamental business process)
and instead continues to use its current business process, the timeliness and
efficiencies expected with automation will be diminished. These types of issues will be
made visible through performance measures.
Full Deployment activities end when the Functional Sponsor declares Full Deployment
(FD) and schedules a Close Out Review.
The purpose of Operations & Support (O&S) is to maintain materiel readiness, provide
operational support (e.g., help desk), monitor performance, and sustain the capability in
the most cost-effective manner possible over its total lifecycle. The end of this phase is
reached with the disposal of the capability when it has reached the end of its useful life.
    •    Updated Systems Engineering planning.
O&S begins when the first increment of a DBS is fully deployed and requires lifecycle
sustainment to support users.
While O&S begins with the initial fielding of the capability, the capability fully enters
lifecycle sustainment when the Functional Sponsor declares Full Deployment (FD),
signifying that the capability has been deployed to the full user community and is
performing in accordance with the criteria in the Business Case. Additionally, during O&S, a
Close Out Review (which also constitutes the Post-Implementation Review (PIR),
traditionally part of Clinger-Cohen) is conducted.
12.3.6.3. O&S Activities
One of the major activities during O&S is the evaluation & feedback of the fielded
capability by the users, the PM, and the Functional Sponsor. After considering results
against the desired outcome for the capability, resourcing, remaining service life,
Business Enterprise Architecture (BEA) compliance and technology improvements, this
feedback may lead to changes in the software, the product support package, Business
Process Re-engineering (BPR), and/or requirements for the next increment.
Sustainment is covered by the program's Lifecycle Sustainment Planning and
Execution, which seamlessly spans a system's entire life-cycle, from the Analysis of
Alternatives (AoA) to disposal. It translates business capability and performance
requirements into tailored product support to achieve specified and evolving life cycle
product support availability, maintainability, sustainability, scalability, reliability, and
affordability parameters. It is flexible and performance-oriented, reflects an evolutionary
approach, and accommodates modifications, upgrades, and re-procurement.
DAG Chapter 5, Section 5.1.2, "Life-Cycle Sustainment and the DoDI 5000.02
Acquisition Environment," goes into significant depth on sustainment planning. For
software development, the following additional guidance is provided:
    •    Whenever possible, do not customize COTS software. If the software is not
         customized, the vendor/developer maintains complete version control of their
         products, including interoperability, maintainability, and security. This enables
         modernization of the capability with newer technology without new
         integration/configuration efforts.
    •    When COTS products are used in the same manner in the government
         environment as in the commercial environment (with added physical, information,
         and operational security), they should already be covered by organizational
         standing operating procedures (SOPs). When developing the sustainment plans,
         use these existing SOPs, rather than creating new plans, for both modernization
         and disposal.
    •    Due to the nature of DBS capabilities, whenever possible, develop and leverage
         portfolio sustainment planning (i.e., for those capabilities that utilize the same
         infrastructure and have similar life-cycle sustainment requirements), rather than
         developing a separate plan for each capability.
         support, throughout the product's life-cycle? This responsibility information is
         normally summarized in the Program Charter.
    •    Life-Cycle Approach. In many cases, support will be provided by multiple
         entities, each specialized in a different area. This includes COTS vendors,
         enterprise services, software hosting facilities, and other logistics support.
    •    Supportability Test and Evaluation (T&E) Concept. Ensure that T&E for
         supportability is in place prior to initial fielding, and describe how issues will be
         mitigated. Re-use (and reference) similar, proven concepts, and modify them to
         fit this specific acquisition, rather than developing new concepts.
    •    Integrated Logistics Support Planning. A large portion of the life-cycle
         sustainment planning will be addressed in this section. However, most of the
         information is contained in other program planning documents. The objective is
         to ensure the information requirements are addressed, not to repeat this
         information in another plan.
              o Design Interfaces. Should be integrated with SE and Information Support
                  Planning / interface documentation. Normally, it is in the form of DoD
                  Architecture Framework (DoDAF) products at the detailed level.
              o HAZMAT, Human Systems Integration (HSI), & ESOH. Generally will not
                  need to be addressed, as COTS capabilities will be utilized as in a similar
                  commercial environment.
              o Quality Assurance (QA). Address who is responsible for implementing the
                  QA plan and who provides oversight. Responsibilities are to be covered in
                  the Program Charter.
              o Reliability and Maintainability (R&M). Reliability is a system's ability to
                   operate and perform its intended function for a specified interval under
                  stated conditions. Maintainability is the ease and rapidity with which a
                  system or equipment can be restored to operational status following a
                  failure.
DBS are normally COTS-based. This approach provides a reliable and stable
foundation that is easily maintainable for the customer-developed application. The use
of COTS hardware and software products means commercial sources are available to
provide maintenance and support. The program management office may maintain
hardware and software maintenance contracts with the appropriate vendors to provide
support to the development, test, and operation of systems.
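The R&M definitions above are qualitative. One common quantitative pairing, assumed here for illustration rather than drawn from the DAG, expresses reliability as Mean Time Between Failures (MTBF) and maintainability as Mean Time To Repair (MTTR), which combine into inherent availability:

```python
# Illustrative sketch only: the MTBF/MTTR framing and the numbers below are
# assumptions for this example, not values or methods prescribed by the DAG.

def inherent_availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Fraction of time a system is expected to be operational:
    Ai = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# A hypothetical COTS-based DBS with a 720-hour MTBF and an 8-hour MTTR:
ai = inherent_availability(720.0, 8.0)
print(f"Inherent availability: {ai:.4f}")  # about 0.9890
```

A figure like this can help frame sustainment discussions with COTS vendors, since commercial maintenance contracts often commit to response times that effectively bound MTTR.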
    •    Failure Modes Effects and Criticality Analysis (FMECA) (or a similar Component
         process). Is generally addressed during Information Assurance (IA) planning and
         should cover what is considered a critical failure, time limits for fixing critical
         failures, and who is responsible.
    •    Damage Modes and Effects Analysis (DMEA) (or similar Component process).
         Damage modes are errors or defects in a process, design, or item, especially
         those that affect the intended function of the capability and/or the process (can
         be potential or actual). Effects analysis refers to studying the consequences of
         those failures.
DMEA is a procedure used in the life-cycle for analysis of potential failure modes within
a system to classify the severity and likelihood of the failures. A successful DMEA
activity helps a team to identify potential failure modes based on past experience with
similar products or processes, enabling the team to design those failures out of the
system with the minimum of effort and resource expenditure, thereby reducing
development time and costs.
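The severity-and-likelihood classification described above can be sketched as a simple criticality ranking. The scales, scores, and failure modes below are invented for this illustration; Components will have their own FMECA/DMEA conventions:

```python
# Hypothetical illustration: a minimal criticality ranking for failure modes.
# The 1-5 scales and the example failure modes are assumptions for this
# sketch, not taken from the DAG or any Component process.

FAILURE_MODES = [
    # (description, severity 1-5, likelihood 1-5)
    ("Interface outage to enterprise service", 4, 3),
    ("Report generation timeout", 2, 4),
    ("Data corruption during batch load", 5, 2),
]

def criticality(severity: int, likelihood: int) -> int:
    """Simple risk score: higher scores should be addressed first."""
    return severity * likelihood

# Rank failure modes from most to least critical.
ranked = sorted(FAILURE_MODES, key=lambda m: criticality(m[1], m[2]), reverse=True)
for desc, sev, like in ranked:
    print(f"{criticality(sev, like):2d}  {desc}")
```

Ranking modes this way gives the team an ordered worklist, so the highest-criticality failures are designed out of the system first.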
    •    Manpower and Personnel. Addressed in accordance with the Doctrine,
         Organization, Training, Leadership and education, Personnel, Facilities, and
         Policy (DOT_LPF-P) analyses and as part of delivering the comprehensive
         Doctrine, Organization, Training, Materiel Leadership and education, Personnel,
         Facilities, and Policy (DOTMLPF-P) capability. There is a high likelihood that this
         will be affected by the Business Process Re-engineering (BPR), but may fall
         outside the scope of the PM and/or the Functional Sponsor.
    •    Training and Training Devices. Ensure training requirements, strategy,
         responsibilities, and methods that will be employed to deliver the best possible
         instruction are covered, in accordance with the DOTMLPF-P capability delivery.
    •    Supply Support. Generally will not apply to DBS. Supplies will primarily be
         satisfied through hardware and software concepts described above. If not, then
         ensure they are described.
    •    Support and Test Equipment. For COTS, maximum use of commercial
         warranties and service contracts is normally made. No special tools, General
         Purpose Electronic Test Equipment, or Special Purpose Electronic Test
         Equipment are normally required for support. If special tools are needed, ensure
         they are described.
    •    Technical Data. See DAG Chapter 5, Section 5.1.6, "Data, Software, and
         Intellectual Property Rights."
    •    Packaging, Handling, Storage, and Transportation. DBS capabilities generally
         will not have any unique facility, special packaging, handling, or transportation
         needs. If they do, ensure they are addressed.
    •    Facilities and Installation. Generally, no new facilities are required for DBS. If
         they are, then they should be addressed.
    •    Support Transition Planning. Sustainment Transition Plans are developed to
         support a program (or increment) after the initial operational capability is
         achieved and the system moves into full operational capability, where it requires
         operational support. Since DBS uses an evolutionary approach and is developed
         in Increments with capability releases, the system is constantly and rapidly
         evolving to provide new capabilities and functional enhancements. As a result,
         the operation and sustainment period is very brief as the system transitions from
         release to release. The activities normally associated with O&S of a system
         (security and performance management, patches, bug fixes, COTS hardware
         and software updates, usability improvements, and interoperability updates) are
         addressed during the development of successive increments or as data updates
         to fielded versions as required.
         and organizational O&M costs.
    •    Configuration Management. Although CM is normally captured in a program's
         CM plan (and considered to be part of the SE discipline) it is crucial for life-cycle
         support, as different support options may be needed for the different versions of
         the software or technologies that have been released.
    •    Demilitarization, Reutilization and Disposal (DR&D) Strategy. At the end of
         its useful life, a DBS or one or more increments of a DBS is disposed of in
         accordance with all statutory and regulatory requirements and policy relating to
         safety, security, and the environment. During the design process, the program
         manager should estimate and plan for safe disposal. Hardware no longer needed
         should be disposed of according to each organization's equipment disposal
         procedures or be transferred to another program for reutilization. Software
         produced or purchased will be maintained in the configuration management
         library and will be available for reutilization as needed.
In O&S, the program manager and Functional Sponsor conduct a Close Out Review,
which also serves as the Post Implementation Review (PIR) and fulfills the requirements
of such, with the Investment Review Board (IRB), to determine whether the
delivered capability achieved the outcomes defined in the Business Case. The Close Out
Review is an important vehicle for lessons learned, since it incorporates user feedback
and enables understanding of how well a recently completed increment meets the
needs of users before the requirements for a subsequent increment are finalized. It also
informs the IRB of how well an investment performed against expectations and how
future increments of capability can be expected to perform.
The reasons for Time-Limited Phases in the Business Capability Lifecycle (BCL) are
that:
         additional capability tend to increase.
The key to successfully fielding IT is to quickly get capability into the hands of users.
Too often, users and developers spend years trying to specify requirements in this
dynamic environment and never field any capability. Instead, the functional stakeholders
should define the business outcomes they want to achieve, how they are going to
measure achievement, and acknowledge that things will change.
    •    Business Capability Definition (BCD) Phase. Although this phase is not time
         limited, it is critical to the success of all the other phases; it defines the problem
         to be solved and frames the scope of the acquisition. In this Phase, the
         Functional Sponsor (representing the needs of the end user) defines:
             o What the need / gap / problem is (the Problem Statement);
             o How they will know when the problem is fixed (the high-level outcome(s));
                and,
             o How they are going to measure progress towards those outcomes.
Although there are other relevant tasks to be accomplished (see a complete discussion
of the Business Capability Definition (BCD) Phase in Section 12.1 , Business Capability
Definition (BCD) Phase ), these have the highest impact on time-limited development (if
these are not adequately articulated and agreed to by the Functional Sponsor, it will be
unclear what constitutes success, and the definitions will likely be revisited / redefined).
                technologies). Ensure assumptions, scope, boundaries and constraints
                 are well-articulated (you may not be able to solve the entire problem with
                 the selected technical solution today; tomorrow brings new technologies,
                 new versions of more-capable software, and updated requirements).
             o Understanding assumptions, scope, boundaries and constraints.
                Determine which are statutory and which are "artificial" (an AoA is
                mandatory, but a subordinate 75-day sub-process may be artificial.)
                Agreements made with Functional Sponsor(s) and/or the MDA can
                mitigate artificial constraints.
             o Keep planning at the strategic level. Do not over-plan, because in this
                 environment the plan is going to change. At this stage, the user's desired
                 outcomes will be at a very high level (e.g., "full auditability"). There is no
                way to address all of the issues that are encountered this early in the
                process and, even so, requirements would change before completing the
                associated plans. The result would be an endless do-loop of changing
                your plans to fulfill new requirements. A key to success is to be able to
                clearly articulate the approach to solving the problem using the preferred
                 materiel solution (and over time, the preferred materiel solution is going to
                 change).
    •    Prototyping Phase. BCL mandates completion of Prototyping within 12 months
         or less of contract/option award. This time limit also applies to each subsequent
         Increment (after Authorization to Proceed (ATP) is granted). The key to
         completing Prototyping within 12 months is to limit the scope of activities to those
         absolutely necessary, such as the following:
             o It is critical to obtain the preferred materiel solution as rapidly as possible.
             o Ensure the Functional Sponsor has prioritized requirements (business
                 outcomes) before the MS A decision. Expect this prioritized list to contain
                 more requirements than can be fulfilled in 18 months. Part of this phase's
                activities is scoping the requirements for the Increment to achieve a useful
                capability that can be delivered in less than 18 months. Multiple prototype
                or pilot demonstrations may be necessary to reach agreement on an
                acceptable, operationally useful, and affordable increment of capability
                that can be delivered within this timeframe.
             o Significant emphasis must be placed on leveraging enterprise services
                and existing infrastructures. These are known entities and will significantly
                reduce engineering development and testing risks in the next phase.
             o Develop a reasonable plan to build and deploy. Also, set user
                expectations. The objective is not to maximize the number of requirements
                 that can be incorporated into an 18-month schedule; rather, it is to get the
                useful capabilities out to the user as rapidly as possible. When the
                program manager and the Functional Sponsor have agreed which
                requirements are going to be satisfied and the technologies to be used for
                the Increment, the Business Case must be updated and the Functional
                Sponsor must provide a description of what will constitute IOC.
    •    Engineering Development (ED) and Limited Fielding Phases. After the
         Engineering Development Phase contract is awarded (post-MS B), a MAIS
         DBS program has 18 months to obtain a Full Deployment Decision (FDD) to
         include achieving Initial Operating Capability (IOC). This should be the focus of
         all of the program's efforts. When achieving IOC appears imminent, focus can be
         shifted to the next goal(s) - the Full-Deployment Decision (FDD) and possibly the
         next increment.
    •    Full Deployment Phase and O&S. These phases are not time-limited.
The timelines for the phases of BCL must be taken into consideration during program
planning, scoping, and Business Case development. Violations of these timelines
require re-validation of the Business Case by the Investment Review Board (IRB) (and
the MDA, as required), and can potentially slow down the delivery of capability to the
user. Table 12.4.1.T1 outlines BCL timelines.
    •    *IOC is a Functional Sponsor written declaration; though the MDA will generally
         not grant a MS A if it is not clear that IOC is achievable within 5 years of MS A.
    •    **Authority to Proceed (ATP) is for follow-on increments. There is one MS A for
         the overall program, but there may be multiple ATPs (if there are multiple
         increments). ATP will "kick off" an increment.
    •    ***If activities can be conducted in time periods much less than the maximum
         time allotted, reflecting this in schedules and plans promotes visibility, and rapid
         capability delivery.
The Business Capability Lifecycle (BCL) aligns defense business system (DBS)
requirements, investment, and acquisition processes into a single, tiered integrated
decision-making framework that provides oversight commensurate with program
complexity and risk. The Functional Sponsor is a key focal point at the Component level
in the earliest stages of the capability and partners with the program manager (PM)
throughout the process as the capability matures. This integrated model of governance
is depicted in Figure 12.4.2.F1 :
                                         Figure 12.4.2.F1 - Governance
Each Investment Review Board (IRB) assesses investments in its portfolio relative to
their functional needs, as well as the impact on end-to-end business process
improvements as guided by the Business Enterprise Architecture (BEA) , articulated in
the DoD Enterprise Transition Plan (ETP), the DoD Strategic Management Plan (SMP),
and/or described in Component architectures and transition plans. These products
provide both the end-state and the roadmap to deliver more robust business
capabilities.
For MAIS, the IRBs review requirement changes and technical configuration changes
during the development process that have the potential to result in cost and schedule
impacts to the program. Such changes will generally be disapproved or deferred to
future increments and will not be approved unless funds are identified and schedule
impacts mitigated.
The MDA is responsible for making DBS acquisition decisions and relies on the IRB's
advice in its role as OIPT and information provided by the Component to include:
functional requirements; the Business Case; appropriate Business Process Re-
engineering (BPR) and BEA compliance (as determined by the Pre-Certifying Authority
(PCA)); and a DBSMC-approved investment decision.
DoD officials and organizations have specific investment and acquisition-related roles
and responsibilities throughout the Business Capability Lifecycle (BCL), as outlined in
Tables 12.4.3.T1 and 12.4.3.T2.
Functional Sponsor: Represents the end-user / user community. Responsible for the
activities of the BCD Phase; for defining the business need (problem / gap), desired
outcomes, and acceptance criteria; for remaining actively engaged in the program
throughout its lifecycle in order to achieve the complete Doctrine, Organization,
Training, Materiel, Leadership and education, Personnel, Facilities, and Policy
(DOTMLPF-P) solution; and for declaring IOC and the criteria for declaring Full
Deployment (FD). Works with the program manager to complete the Program
Charter. Generally, the Functional Sponsor establishes and continues a strong
working relationship with the program manager throughout the lifecycle of the DBS,
beginning early in IM.

Program Manager (PM): Designated early during IM and accountable for the
successful development and deployment of the DBS to deliver on the outcomes
defined by the Functional Sponsor. The program manager develops the APB for
each increment and manages the program to meet cost, schedule, and performance
objectives. Program managers shall have requisite experience and competency in
delivering IT solutions, including the ability to build and manage multi-disciplinary
integrated teams and to identify and mitigate risk. The program manager also
establishes and continues a strong working relationship with the Functional Sponsor
throughout the lifecycle of the DBS, beginning early in IM.
Defense Business Systems Management Committee (DBSMC): Per sections 186
and 2222 of Title 10, U.S.C., provides investment oversight for DBS and guides the
transformation activities of the business areas of the DoD. The DBSMC approves
IRB Certifications and CMO/DCMO BPR determinations, and is the final authority
for DBS requirements.

Director, Cost Assessment and Program Evaluation (DCAPE): Provides independent
analysis and advice to inform decision-making. Responsible for developing and
approving Analysis of Alternatives (AoA) Study Guidance for MAIS DBS. May also
review, assess, and / or conduct independent cost estimates, cost analyses, and
economic analyses, as appropriate.

Deputy Assistant Secretary of Defense, Developmental Test and Evaluation
(DASD(DT&E)): Ensures that developmental test and evaluation is effectively
addressed throughout the entire lifecycle of the DBS. DASD(DT&E) works in
partnership with the DOT&E to review and approve the Test Plan section(s) for
MAIS described in the Business Case and to collaborate on an integrated testing
approach.

Director, Operational Test and Evaluation (DOT&E): Responsible for the test and
evaluation of each DBS. Works with the Functional Sponsor and program manager
to ensure that roles and responsibilities, along with required test resources, are
adequately addressed with mutual agreement early in the testing process. The
DOT&E also works with the DASD(DT&E) to approve the Test Plan section(s) for
MAIS described in the Business Case and to collaborate on an integrated testing
approach.

Deputy Assistant Secretary of Defense, Systems Engineering (DASD(SE)): Reviews
and approves the systems engineering sections of the Business Case for MAIS.

DoD Chief Information Officer (CIO): Works with DoD Components, the IRBs, the
DBSMC, and other stakeholders to ensure that DBSs develop in compliance with
applicable statute (i.e., the Clinger-Cohen Act (CCA)) and regulations, and in
accordance with DoD policy on architecture, design, interoperability, security, and
information assurance (IA).

Deputy Chief Management Officer (DCMO): Responsible for determining that BPR
efforts have been undertaken as appropriate and for determining BEA compliance
for non-military department and joint DBS. The DCMO may also hold delegated
MDA authority for certain DBS and may serve as the Chair of governance forums
for review and decision-making purposes.
Enterprise Risk Assessment Methodology (ERAM) Team: Conducts independent assessments to identify risk, recommend risk mitigations to the program manager, and provide insight to decision-makers as part of BCL.

Investment Review Board (IRB): Advises the IRB Chair and the MDA and provides cross-functional expertise and oversight for DBS. The IRBs serve as the OIPT for the MDA for DBS. The IRBs review Problem Statements (for all potential DBS), Business Cases, and requirements changes / technical configuration changes for MAIS in development that have the potential to impact program cost and schedule. The IRBs also work to ensure that investments are aligned with the BEA so that DBS support enterprise priorities.

IRB Chair: In addition to reviewing all information mentioned above as a member of the IRB, the IRB Chair has decision authority and will therefore decide on Problem Statement approvals, make acquisition-related recommendations to the MDA, serve as the validation authority for DBS requirements, and hold specific duties regarding IRB Certification actions.

Milestone Decision Authority (MDA): Responsible for making DBS acquisition decisions as well as determining the appropriate BCL entry / acquisition phases and the extent to which regulatory and other non-statutory documentation can be tailored. The MDA is advised by the IRB Chairs during the review process.
12.5.1.3. Evaluation
performed for a point in time. It is a brief, high-level document, with input from the
Functional Sponsor and program manager, which outlines the program/increment and
typically does not provide the detail data that the program office produces.
The Business Case provides a compelling, defendable and credible justification for the
DOTMLPF-P solution to the defined problem, with corresponding outcomes and
performance measures for use throughout the capability's lifecycle. It is an evolving
document, with the intent of providing an overview of the current program status in a
condensed format. It is structured to best support the program and does not have a
mandatory format, but it must cover all of the statutory and regulatory information
requirements.
The Business Case provides a template to ensure that a problem, its root cause, and DOTMLPF-P issues are thoroughly analyzed; that all options have been considered; that risks are identified; that risk mitigation plans are sufficient; and that there is a high degree of confidence that the expenditure of resources and funds is justified and value-added. It provides leadership with sufficient information to make informed investment decisions within the context of enterprise priorities and available resources.
Components are responsible for the development and maintenance of the Business
Case.
12.5.1.1. Content Updates
Additionally, the PM must update and/or revise the Business Case if changes occur to
the problem scope, context or the requirements for additional modernization funding.
If the Business Case is deemed no longer valid, but the capability is still needed, the
Functional Sponsor, along with the CAE, must notify the IRB and the MDA immediately
to determine how to proceed. If the Business Case is deemed no longer valid and the
capability no longer needed, the Functional Sponsor, along with the CAE, must
immediately notify the IRB and the Milestone Decision Authority (MDA) of their intention
to discontinue the program.
    •    The Business Case is not a technical proposal, though it will contain technical
         information.
    •    Utilize tables to summarize information as much as logically possible.
    •    Up front, explain the decision or action being sought (e.g., seeking a
         Milestone A decision in order to do X, Y, and Z).
    •    Do not write the Executive Summary until the rest of the content is finished. The
         Executive Summary should be updated each time a decision is being sought
         (Note: after the first Executive Summary is written, it may only require cursory
         updates in the future).
    •    The Executive Summary should be concise and focused on the issue at hand.
         Do not discuss "general knowledge" data or information (such as, the history of
         ERPs in the Department, the largesse of the DoD, the challenges of the DoD IT
         environment, easily "Google-able" information, etc.).
    •    Clarify between usage of increments or releases, what operational business
         capability will be delivered in a specific increment or release, and how each
         works toward achieving the overall outcome from a measures perspective (is it
         25% of the overall capability?)
    •    Measures should not focus on compliance itself; rather, they should state
         what compliance with a Law, Regulation, or Policy will enable (e.g.,
         compliance with SFIS requirements will enable ______) or, conversely,
         whether required compliance with an L, R, or P introduces risk or
         additional requirements.
    •    As a general guideline, the complete Problem Statement analysis section for a
         (projected) MAIS solution should be less than 7 pages in total length and,
         depending on the number of alternatives considered, the total Business Case
         may vary from 15 to 40 pages.
    •    A Problem Statement should never be more than 3-4 sentences in length.
    •    The Business Case will always be judged on the quality of information it contains,
         not on the length of the content.
12.5.1.3. Evaluation
    •    The investment has value to the enterprise and aligns with enterprise priorities;
    •    A materiel solution has not been selected too early in the process, and there
         is evidence that robust analysis has been conducted;
    •    The proposed solution is properly managed and supported by senior officials;
    •    The scope of the proposed solution and measurable desired outcomes are
         defined;
    •    There is clear evidence that BPR has been done or is being completed;
    •    The Component has the ability to deliver the benefits; and
    •    Dedicated resources are working on the highest-value opportunities.
Doctrine.
Organization.
    •    Where is the problem occurring? In which organizations is it occurring?
    •    What are the primary and secondary mission / management focuses of those
         organizations?
    •    What are the organizational values and priorities?
    •    Is the organization properly staffed and funded to deal with the issue?
    •    Are commanding officers / senior management aware of the issues?
    •    Is the issue already in some type of organizational issue list?
    •    If so, why isn't the issue being resolved?
    •    Who exactly is aware of / being impacted by the issue?
Training.
Materiel.
    •    Has leadership properly assessed the level of criticality, threat, urgency, risk, etc.
         of the operational impact(s) of the issue?
    •    Is leadership aware of the drivers and barriers to resolving the issue within her /
         his own organization?
    •    Has leadership identified inter-service / agency cultural drivers and barriers which
         hinder issue resolution?
Personnel.
Facilities.
Policy.
    •    Is there existing policy that addresses or relates to the business need? Is it Joint?
         Service? Agency?
    •    If no policy exists which pertains to the defined need, does new policy need
         to be developed and implemented that will provide a total or partial solution
         to the need?
             o Can policy be developed and signed at the Component level? Will policy
                require OSD-level sponsorship, coordination and / or signature?
Outcomes and measures development begins during the Business Capability Definition
Phase (BCD), helping to scope the effort and identify outcomes that will be used at a
future point for testing. During subsequent Business Capability Lifecycle (BCL)
activities, outcomes and measures are refined and the Functional Sponsor works
closely with the acquisition and testing communities in order to ensure the information is
appropriate and relevant to the program at applicable lifecycle points.
The outcome should explicitly state the business value of the resources to be invested
and allow management to prioritize and weigh investments. The outcome provides
strategic alignment and a clear criterion against which to evaluate potential approaches.
It always starts with the desired functional result and is used to focus behaviors and
results by answering the "what's in it for me?" question. Corresponding measures must
be specific, actionable, measurable, relevant, and timely operational capabilities that
can be achieved against their corresponding outcomes.
                                   Figure 12.5.3.F2 - Outcome Hierarchy
High-Level Outcomes (HLOs). HLOs are developed during BCD as part of the "To-Be"
Analysis, support one or more Strategic Management Plan (SMP) goals/objectives, and
constrain business outcomes. They address the strategic alignment principle - programs
must enable effective portfolio management by aligning individual investments to SMP
goals and objectives - that is central to BCL.
HLO measures are developed at the same strategic level as HLOs. They define
measurements for strategic purpose and priority and address how the investment will
meet enterprise-level expectations in finite terms.
Business Outcomes . Business outcomes are developed during BCD when the "To-
Be" Analysis is refined as a result of BPR. Business outcomes should align to specific
end-to-end (E2E) business processes defined in the Business Enterprise Architecture
(BEA) and describe the functional user's intended result of fulfilling an identified
business capability gap. They are the HLOs decomposed into observable and
measurable business results or changes in business performance.
Business outcome measures, like HLO measures, should address how the investment
will meet enterprise-level expectations. They should also add an increasing level of
detail to determine how the investment will meet the business results outlined in the
business outcomes.
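As an illustration of the hierarchy in Figure 12.5.3.F2, the decomposition from HLOs into business outcomes, each carrying its own measures, can be sketched as a simple data structure. This is a hypothetical sketch only; the outcome and measure text below is invented for illustration and is not drawn from the BEA or any Strategic Management Plan:

```python
# Minimal sketch of the outcome hierarchy; all names/values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Measure:
    description: str   # what is measured
    target: str        # finite, enterprise-level expectation

@dataclass
class Outcome:
    name: str
    level: str                                      # "HLO", "Business", or "Program"
    measures: list = field(default_factory=list)
    children: list = field(default_factory=list)    # decomposed lower-level outcomes

# High-Level Outcome aligned to a (hypothetical) SMP objective
hlo = Outcome(
    name="Improve asset visibility across the supply chain",
    level="HLO",
    measures=[Measure("Percent of assets trackable in near real time", "95%")],
)

# Business outcome decomposing the HLO into a measurable business result
biz = Outcome(
    name="Reduce time to locate in-transit materiel",
    level="Business",
    measures=[Measure("Average query-to-location time", "Under 4 hours")],
)
hlo.children.append(biz)

def flatten(outcome):
    """Walk the hierarchy top-down, yielding (level, name) pairs."""
    yield (outcome.level, outcome.name)
    for child in outcome.children:
        yield from flatten(child)

print(list(flatten(hlo)))
```

Walking the structure top-down mirrors how measures gain detail at each level: the HLO measure speaks to enterprise expectations, while the child business outcome measure speaks to a specific business result.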
Program Outcomes. Program outcomes are developed during IM based on the preferred solution. They are scoped to
the preferred solution and should include specific business rules that explicitly define
the "To-Be" state and should address expectations of how the preferred solution will
address business outcomes and HLOs.
Program outcome measures must demonstrate the value of the preferred solution to the
Department and should provide cost and period of performance expectations for each
business capability to be delivered as part of the preferred solution, taking into account
the Better Buying Power affordability target.
  TIP: System-level requirements are generally NOT included in the Business Case,
  but are critical for the operation of the program.
System requirements measures, like their outcome counterparts, gain increasing levels
of detail as a chosen solution matures through building, testing, and deployment.
The Business Enterprise Architecture (BEA) is the enterprise architecture for all DoD
defense business systems (DBS) and capabilities and reflects the DoD business
transformation priorities; the business capabilities required to support those priorities;
and the combinations of enterprise systems and initiatives that enable those
capabilities. It also supports use of this information within an end-to-end (E2E)
Framework. BCL investment decisions require this Departmental perspective provided
by the E2Es to compare investment opportunities across the Department and to allow
effective portfolio management of DBS. Access to the BEA's contents is provided on the
DCMO's BEA webpage.
The E2E Business Flows that comprise the Department's E2E Framework play a critical
role in how DoD builds business capabilities, as business processes actually span,
rather than operate within, functional areas. Each E2E Business Flow represents the
life-cycle of business processes that are executed in order to fulfill a business
requirement/need of organizations throughout DoD. In order to achieve business
process optimization, specific DoD organizations need to identify and decompose the
E2E Business Flows across the functional silos of the organization.
Decomposing the E2E flows first requires each organization to identify those E2E
Business Flows that apply to them. Next, the organization can break down the E2E
Business Flow reference model into a representation of the specific Business
Processes they perform, identify Business Process inefficiencies both within and across
functional silos, and optimize the organization's E2E Business Flows accordingly.
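The two decomposition steps above can be pictured with a short sketch. This is illustrative only, not a DoD tool: the flow names, functional areas, and process names below are hypothetical, and the authoritative E2E Business Flows are those defined in the BEA:

```python
# Illustrative sketch of E2E Business Flow decomposition; all names are hypothetical.

# Functional areas this (hypothetical) organization operates in
ORG_AREAS = {"logistics", "financial management"}

# Step 1: identify the E2E Business Flows that apply to the organization,
# based on which functional areas each flow spans
E2E_FLOWS = {
    "Procure-to-Pay": {"logistics", "financial management"},
    "Hire-to-Retire": {"human resources"},
    "Order-to-Cash": {"financial management"},
}
applicable = [flow for flow, areas in E2E_FLOWS.items() if areas & ORG_AREAS]

# Step 2: break each applicable flow down into the specific business
# processes the organization performs, so that inefficiencies can be
# identified both within and across functional silos
decomposition = {
    "Procure-to-Pay": ["create requisition", "award contract",
                       "receive goods", "pay invoice"],
    "Order-to-Cash": ["accept order", "deliver", "bill", "collect"],
}

for flow in applicable:
    print(flow, "->", decomposition.get(flow, []))
```

The point of the sketch is the ordering: an organization first filters the enterprise-level flow catalog down to what applies to it, and only then maps each surviving flow onto its own cross-silo processes for optimization.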
For more detailed information on BEA Compliance, decomposing E2E Business Flows,
and more general information on both the BEA and the E2Es, refer to the DCMO's BEA
webpage.
DEFENSE ACQUISITION GUIDEBOOK
Chapter 13 - Program Protection
13.0. Overview
13.7. Countermeasures
13.11. Compromises
13.12. Costs
13.13. Contracting
13.0. Overview
13.0.1. Purpose
13.0.2. Contents
13.0. Overview
Program Protection is the integrating process for mitigating and managing risks to
advanced technology and mission-critical system functionality from foreign collection,
design vulnerability or supply chain exploitation/insertion, battlefield loss, and
unauthorized or inadvertent disclosure throughout the acquisition lifecycle.
At its core, Program Protection protects technology, components, and information from
compromise through the cost-effective application of countermeasures to mitigate risks
posed by threats and vulnerabilities. In a simple sense, Program Protection seeks to
defend warfighting capability by keeping secret things from getting out and keeping
malicious things from getting in. Where the capability is derived from advanced or
leading-edge technology, Program Protection mitigates the risk that the technology will
be lost to an adversary; where the capability is derived from integration of commercially
available or developed components, Program Protection mitigates the risk that design
vulnerabilities or supply chains will be exploited to degrade system performance. The
Program Protection Plan (PPP) is the milestone acquisition document that describes the
plan, responsibilities, and decisions for all Program Protection activities.
13.0.1. Purpose
This chapter provides guidance and expectations for the major activities associated with
Program Protection.
13.0.2. Contents
Vulnerability Assessment
Risk Assessment
Countermeasures
Horizontal Protection
Foreign Involvement
Managing and Implementing Program Protection Plans (PPP)
Compromises
Costs
Contracting
Program Protection is an iterative risk management process within system design and
acquisition, composed of the following activities:
    •    Systems Engineering Technical Reviews (SETR) (see Section 13.10.2 for further
         elaboration on specific Systems Engineering Technical Reviews event
         expectations), starting Pre-Milestone A with the Alternative Systems Review
         (ASR)
    •    Systems Engineering (SE) analyses that support preparation for each Acquisition
         Milestone (see Sections 13.7.6 and 13.14 for further elaboration on how this
         process is tied to lifecycle phase-related Systems Security Engineering (SSE))
    •    Development and release of each Request for Proposal (RFP) (see Section
         13.13.1 for further details on what should be incorporated in the Request for
         Proposal (RFP) package)
At each of these points, the process is iterated several times to achieve comprehensive
results that are integrated into the system design and acquisition. This process applies
to all programs and projects regardless of acquisition category (ACAT) or status (e.g.,
Quick Reaction Capability (QRC), Request for Information (RFI), Joint Capability
Technology Demonstration (JCTD), Science and Technology (S&T), or Authority to
Operate (ATO)), and regardless of whether the technology is meant for Government
and/or military use.
Program Protection is the Department's holistic approach for delivering trusted systems
and ensures that programs adequately protect their technology, components, and
information. The purpose of the Program Protection Plan (PPP) is to ensure that
programs adequately protect their technology, components, and information throughout
the acquisition process during design, development, delivery and sustainment. The
scope of information includes information that alone might not be damaging and might
be unclassified, but that in combination with other information could allow an adversary
to clone, counter, compromise or defeat warfighting capability.
The process of preparing a PPP is intended to help program offices consciously think
through what needs to be protected and to develop a plan to provide that protection.
Once a PPP is in place, it should guide program office security measures and be
updated as threats and vulnerabilities change or are better understood.
It is important that an end-to-end system view be taken when developing and executing
the PPP. External, interdependent, or government-furnished components that may be
outside a program manager's control must be considered.
The PPP is the focal point for documentation of the program protection analysis, plans
and implementation within the program for understanding and managing the full
spectrum of the program throughout the acquisition lifecycle. The PPP is a plan, not a
treatise; it should contain the information someone working on the program needs to
carry out his or her Program Protection responsibilities and it should be generated as
part of the program planning process.
The Program Protection Plan Outline and Guidance , established as expected business
practice through a July 18, 2011 Principal Deputy Under Secretary of Defense for
Acquisition, Technology, and Logistics (USD(AT&L)) policy memo, can be found at:
http://www.acq.osd.mil/se/docs/PDUSD-ATLMemo-Expected-Bus-Practice-PPP-18Jul11.pdf.
Critical Program Information (CPI) and mission-critical functions and components are
the foundations of Program Protection. They are the technology, components, and
information that provide mission-essential capability to our defense acquisition
programs, and Program Protection is the process of managing the risks that they will be
compromised.
CPI may include classified military information that is considered a national security
asset that will be protected and shared with foreign governments only when there is a
clearly defined benefit to the United States (see DoD Instruction 5200.39 ). It may also
include Controlled Unclassified Information (CUI), which is official unclassified
information that has been determined by designated officials to be exempt from public
disclosure, and to which access or distribution limitations have been applied in
accordance with national laws and regulations such as the International Traffic in Arms
Regulations for U.S. Munitions List items and the Export Administration Regulations for
commerce controlled dual-use items. In some cases (dependent on the PM's
determination) a commercial-off-the-shelf (COTS) technology can be designated CPI if
the COTS element is determined to fulfill a critical function within the system and the
risk of manipulation needs mitigation.
CPI identified during research and development or Science and Technology should be
safeguarded to sustain or advance the DoD technological lead in the warfighter's
battlespace or joint operational arena.
The CPI, if compromised, will significantly alter program direction; result in unauthorized
or inadvertent disclosure of the program or system capabilities; shorten the combat
effective life of the system; or require additional research, development, test, and
evaluation resources to counter the impact of its loss.
Information that may be restricted and protected is identified, marked, and controlled in
accordance with DoD Directives 5230.24 and 5230.25 or applicable national-level policy
and is limited to the following:
CPI determination is done with decision aids and Subject Matter Experts (SMEs). As
general guidance, PMs should identify an element or component as CPI if:
    •    The component / element has been improved or has been adapted for a new
         application
    •    The component / element contains a unique attribute that provides a clear
         warfighting advantage (e.g., automation, decreased response time, a force
         multiplier)
    •    The component / element involves a unique method, technique, or application
         that cannot be achieved using alternate methods and techniques
    •    The component / element's performance depends on a specific production
         process or procedure
    •    The component / element affords significant operational savings and/or lower
         operational risks over prior doctrine, organization, training, materiel, leadership
         and education, personnel, and facilities (DOTMLPF) methods
    •    The Technology Protection and/or Systems Engineering (SE) Team recommends
         that the component / element be identified as CPI
    •    The component / element will be exported through Foreign Military Sales
         (FMS)/Direct Commercial Sales (DCS) or International Cooperation
PMs should contact their Component research and development acquisition protection
community for assistance in identifying CPI.
Mission-critical functions are those functions of the system being acquired that, if
corrupted or disabled, would likely lead to mission failure or degradation. Mission-critical
components are primarily the elements of the system (hardware, software, and
firmware) that implement critical functions. In addition, the system components which
implement protections of those inherently critical components, and other components
with unmediated access to those inherently critical components, may themselves be
mission-critical.
Efforts to identify mission-critical functions and components and their protection must
begin early in the lifecycle and be revised as system designs evolve and mature.
13.3.2.1. Criticality Analysis (CA)
Also included are components that defend or have unmediated access to mission-
critical components.
The identified functions and components are assigned levels of criticality commensurate
with the consequence of their failure on the system’s ability to perform its mission, as
shown in Table 13.3.2.1.T1 .
Level I (Total Mission Failure): Program protection failure that results in total
compromise of mission capability.

Level II (Significant/Unacceptable Degradation): Program protection failure that results
in unacceptable compromise of mission capability or significant mission degradation.

Level III (Partial/Acceptable): Program protection failure that results in partial
compromise of mission capability or partial mission degradation.

Level IV (Negligible): Program protection failure that results in little or no compromise
of mission capability.
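One minimal way to picture how the levels in Table 13.3.2.1.T1 are applied is a worksheet that maps each identified function or component to the criticality level matching its assessed failure consequence. This is a hypothetical sketch, not a DoD tool; the component names and assessed consequences below are invented for illustration:

```python
# Illustrative sketch of assigning criticality levels per Table 13.3.2.1.T1.
# Component names and assessed consequences are hypothetical.

# Criticality levels from the table, keyed by consequence of failure
LEVELS = {
    "total": "Level I (Total Mission Failure)",
    "unacceptable": "Level II (Significant/Unacceptable Degradation)",
    "partial": "Level III (Partial/Acceptable)",
    "negligible": "Level IV (Negligible)",
}

# A notional CA worksheet: component -> assessed consequence of its failure
assessment = {
    "mission computer firmware": "total",
    "navigation module": "unacceptable",
    "maintenance logging service": "partial",
    "cabin lighting controller": "negligible",
}

def assign_levels(assessment):
    """Return each component paired with its criticality level per the table."""
    return {comp: LEVELS[consequence] for comp, consequence in assessment.items()}

for comp, level in sorted(assign_levels(assessment).items()):
    print(f"{comp}: {level}")
```

Because the CA is iterative, a worksheet like this would be revisited at each pass as system maturity grows and threat and vulnerability data are updated, with components added, removed, or re-leveled accordingly.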
When to perform a CA? The CA is an iterative process. To be effective, many CAs must
be executed across the acquisition lifecycle, building on the growing system maturity,
knowledge gained from prior CAs, updated risk assessment information, and updated
threat and vulnerability data.
At each key decision point, system design changes may result in adding or removing
specific items from the list of critical system functions and components. Guidance for
the iterative performance of CAs includes:
         identify all critical system components and subcomponents.
Who performs the CAs? The Government program office should perform an initial CA
early in the lifecycle (pre-Milestone A). When contracts are awarded, the DoD
contracting office should develop Requests for Proposals (RFPs) that require
contractors to perform updated CAs periodically, based on earlier CAs (see Section
13.13.1.2 for guidance on what to include in the Statement of Work (SOW)
requirements).
How is a CA performed? What is the process? While the Government should perform
an initial CA during the Materiel Solution Analysis (MSA) phase, realize that it may only
be possible to execute some of the steps in the CA process given below and/or to
execute them at a high level. As noted previously, to be effective, CAs must be
executed iteratively across the acquisition lifecycle, building on growing system
maturity, knowledge gained from prior CAs, updated risk assessment information, and
updated threat and vulnerability data.
For example, the first pass through the CA process, together with assessments of
vulnerabilities, threats, risks, and countermeasures, might take just a few days and
provide a 30% solution. This CA might only involve Subject Matter Expert (SME) input
during several work sessions (to address system and architecture), as opposed to
detailed information collected from numerous program documents. For an early
iteration, precision is not possible, as it takes several iterations to complete the initial
Criticality Analysis (CA).
Concept of Operations
2. If possible or necessary, group the mission capabilities by relative importance.
Training or reporting functions may not be as important as core mission capabilities.
   Sources: Operational Representative; Subject Matter Expertise (Integration Experts,
   Chief Engineers)

3. Identify the system's mission-critical functions based on mission threads and the
likelihood of mission failure if the function is corrupted or disabled. (Mission-critical
functions may include navigating, targeting, fire control, etc.)
   Sources: Activity Diagrams; Use Cases; Functional Decomposition
5. Assign levels of criticality (I, II, III, IV) to the identified Configuration Items or
components. Factors or criteria may include:
    •    Frequency of component use across mission threads
    •    Presence of redundancy (triple-redundant designs can indicate critical
         functions)
    •    Subject matter expertise
   Sources: Subject Matter Expertise (Systems Engineer, Operator's Representative,
   Program Office)
9. Identify the system functions or components required to support operations in the
intended environment. This may include propulsion (the system has to roll, float, fly,
etc.), thermal regulation (keeping warm in space, cool in other environments, etc.), or
other environmentally relevant subsystems that must be operational before the system
can perform its missions.
   Source: Architecture Diagrams

10. Identify the Information and Communications Technologies (ICT) implementing
those system functions and any vulnerabilities associated with the design and
implementation of that ICT.
Critical Suppliers

11. Identify suppliers of critical configuration items or Information and
Communications Technologies (ICT) components.
   Source: Manufacturing Lead
Note: Repeat this process as the system architecture is refined or modified, such as at
Systems Engineering Technical Reviews (SETRs) and major acquisition milestone
decision points.
              o    Residual vulnerability risk assessments to inform follow-up CAs
The identification of critical functions and components and the assessment of system
impact if compromised are documented in the Program Protection Plan (PPP), as
discussed in Appendix C (Table C-1) of the PPP Outline.
The prioritization of Level I and Level II components for expending resources and
attention will be documented in the PPP as discussed in Appendix C (Table C-2) of the
PPP Outline.
Why is the CA performed? The Level I and selected Level II components from the CA
are used as inputs to the threat assessment, vulnerability assessment, risk
assessment, and countermeasure selection. The following sections describe these
activities.
13.5. Vulnerability Assessment
DoD has designated the Defense Intelligence Agency (DIA) to be the DoD enterprise
focal point for threat assessments needed by the DoD acquisition community to assess
supplier risks. DIA established the Threat Assessment Center (TAC) for this purpose.
The Threat Assessment Center (TAC) provides the enterprise management and
interface to resources within the National Counterintelligence Executive (NCIX), and
coordinates with the Defense Intelligence and Defense Counterintelligence Components
to provide standardized all-source intelligence assessments to support acquisition risk
management efforts. This enterprise integration role of the DoD Threat Assessment
Center (TAC) is designed to achieve comprehensive and consistent engagement
across the United States Government (USG), addressing the supplier threat
assessment needs of all Military Departments (MILDEPs) and Defense Agencies, and
to ensure efficient and coherent use of the results provided to the acquisition
community.
Defense Intelligence Agency (DIA) Threat Assessments provide specific and timely
threat characterization of the identified suppliers to inform program management. Threat
Assessment Center (TAC) reports are used by the Program Manager and the
engineering team to assist in selecting supplier and/or architecture alternatives and
developing appropriate mitigations for supply chain risks. For the policy and procedures
regarding the request, receipt, and handling of Threat Assessment Center (TAC)
reports, refer to DoD Instruction O-5240.24.
Supplier threat assessment requests are developed based on the criticality analysis. An
annotated work breakdown structure (WBS) or system breakdown structure (SBS) that
identifies the suppliers of the critical functions and components may be used to assist
with the creation of the Threat Assessment Center (TAC) requests. Supplier threat
assessment requests may be submitted as soon as sources of critical capability are
identifiable. Near the end of the Materiel Solution Analysis (MSA) Phase, as some
threat information is available from the capstone threat assessment (CTA) and
technologies and potential suppliers are identified, Supply Chain Risk Management
(SCRM) Threat Assessments may be used to assist in defining lowest risk
architectures, based on suppliers for particular architecture alternatives. Note that early
in the system lifecycle the threat requests may be more focused on suppliers in general
technology areas to inform architecture choices, while later in the system lifecycle they
may be more focused on critical components defined in the criticality analysis.
The criticality analysis begins early in the system acquisition lifecycle and continues to
be updated and enhanced through Milestone C, becoming more specific as architecture
decisions are made and the system boundaries are fully defined. The engineering team
may at any point, beginning prior to Milestone A, identify technology elements and
potential manufacturers and request supplier threat assessments. It is expected that the
number of supplier threat assessment requests will grow as the criticality analysis
becomes more specific and the system architecture and boundaries are fully specified,
i.e., the greatest number of Threat Assessment Center (TAC) requests will typically
occur between Milestones B and C (i.e., Preliminary Design Review (PDR) and Critical
Design Review (CDR)). See Section 13.3.2 for more information.
[This section will be updated to reflect implementation guidance for DoD Instruction O-
5240.24, but content was not ready by the submission deadline for this major update.]
The CI analytical product that results from the analysis will provide the PM with an
evaluation of foreign collection threats to specific program or project technologies, the
impact if that technology is compromised, and the identification of related foreign
technologies that could impact program or project success. The CI analytical product is
updated as necessary (usually prior to each major milestone decision) throughout the
acquisition process. Changes are briefed to the program office or PM within 60 days.
The program manager approves the Program Protection Plan only after the final CI
analysis of Critical Program Information (CPI) has been received from the applicable
DoD Component CI and/or intelligence support activity. Normally, the CI analysis of CPI
is returned to the requesting program office within 180 days of the CI and/or intelligence
organization receiving the request.
    •    Which foreign interests might be targeting the CPI and why?
    •    What capabilities does each foreign interest have to collect information on the
         CPI at each location identified by the program office?
    •    Does evidence exist to indicate that a program CPI has been targeted?
    •    Has any CPI been compromised?
Potential malicious activities that could interfere with a system's operation should be
considered throughout a system's design, development, testing, production, and
maintenance. Vulnerabilities identified early in a system's design can often be
eliminated with simple design changes at lower cost. Vulnerabilities found later may
require add-on protection measures or operating constraints that may be less effective
and more expensive. In particular, the vulnerability assessment should consider:
    •    Access paths within the supply chain that would allow threats to introduce
         components that could cause the system to fail at some later time (components
         here include hardware, software, and firmware); and
    •    Access paths that would allow threats to trigger a component malfunction or
         failure at a time of their choosing.
Supply chain here means any point in a system's design, engineering and
manufacturing development, production, configuration in the field, updates, and
maintenance. Access opportunities may be for extended or brief periods, but either
may be exploitable.
Two design processes that have proven effective in identifying vulnerabilities are Fault
Tree Analysis (FTA) and Failure Modes, Effects, and Criticality Analysis (FMECA). An
important twist in applying these techniques is that the potential sources of failures are
malicious actors, not random device failures. Malicious actors invalidate many
assumptions made about randomness and event independence in reliability analysis.
Both FTA and FMECA assume hypothetical system or mission failures have occurred,
and trace back through the system to determine contributing component malfunctions or
failures. For a vulnerability assessment, the possible access paths and opportunities a
threat would have to exercise to introduce the vulnerability or trigger the failure must
also be considered.
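The backward tracing that FTA performs can be illustrated with a toy fault tree. The gate structure, component names, and failure events below are hypothetical; a real analysis would derive them from the system architecture:

```python
# Minimal fault-tree evaluation: determine whether a hypothetical top-level
# mission failure is reachable from a given set of component compromises.
# Gates and event names are illustrative only.
FAULT_TREE = {
    # OR gate: the mission fails if either subsystem fails
    "mission_failure": ("OR", ["navigation_failure", "fire_control_failure"]),
    # AND gate: navigation fails only if both redundant units fail
    "navigation_failure": ("AND", ["gps_unit_a_fails", "gps_unit_b_fails"]),
    # OR gate: fire control fails if its processor or software is compromised
    "fire_control_failure": ("OR", ["processor_x_compromised", "sw_module_y_compromised"]),
}

def event_occurs(event: str, compromised: set[str]) -> bool:
    """Evaluate whether an event occurs given a set of basic-event compromises."""
    if event not in FAULT_TREE:          # basic event (leaf node)
        return event in compromised
    gate, children = FAULT_TREE[event]
    results = [event_occurs(child, compromised) for child in children]
    return all(results) if gate == "AND" else any(results)
```

Note the difference from classical reliability analysis: a malicious actor can deliberately compromise both redundant units at once, so the AND gate offers less protection than random-failure statistics would suggest.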
For software, a number of software tools are available that will identify common
vulnerabilities. These tools apply different criteria and often find different flaws. It is
therefore beneficial to run code through multiple tools.
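The value of running multiple tools can be seen by treating each tool's findings as a set. The outputs below are hypothetical findings reduced to (file, line, weakness) tuples; real tools report in their own formats and would require normalization first:

```python
# Hypothetical findings from two static-analysis tools run on the same code base.
tool_a_findings = {
    ("auth.c", 42, "CWE-121 stack buffer overflow"),
    ("parse.c", 108, "CWE-20 improper input validation"),
}
tool_b_findings = {
    ("parse.c", 108, "CWE-20 improper input validation"),
    ("crypto.c", 77, "CWE-327 broken crypto algorithm"),
}

# The union is what running both tools surfaces; the symmetric difference is
# what each tool would have missed on its own.
all_findings = tool_a_findings | tool_b_findings
missed_by_one = tool_a_findings ^ tool_b_findings
```

Here each tool misses a flaw the other catches, which is the motivation for running code through multiple tools.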
Controls on access to software during development and in the field are critical to limiting
opportunities for exploitation. One approach to testing access controls and software
vulnerabilities in general is Red Teaming. Red teams typically subject a system under
test to a series of attacks, simulating the tactics of an actual threat. (See further
discussion of software tools and access controls in Section 13.7.3, Software
Assurance.)
Additional factors that should be rated include the ease or difficulty of exploiting a
vulnerability, the developer's or maintainer's ability to detect access used to introduce
or trigger a vulnerability, and any other deterrents to threats, such as the
consequences of being caught. A summary table of the vulnerability assessment is
illustrated in Table 13.5.2.T1.
Table 13.5.2.T1. Example Vulnerability Assessment Summary

Critical Component       Identified          Exploitability   System Impact      Exposure
(Hardware, Software,     Vulnerabilities                      (I, II, III, IV)
Firmware)

Processor X              Vulnerability 1     Low              II                 Low
                         Vulnerability 4     Medium

SW Module Y              Vulnerability 1     High             I                  High
                         Vulnerability 2     Medium
                         Vulnerability 3     Low
                         Vulnerability 6     High

SW Algorithm A           None                Very Low         II                 Very Low

FPGA 123                 Vulnerability 1     Low              I                  Low
                         Vulnerability 23
Investigation of vulnerabilities may indicate the need to raise or at least reconsider the
criticality levels of functions and components identified in earlier criticality analyses.
Investigation of vulnerabilities may also identify additional threats, or opportunities for
threats, that were not considered risks in earlier vulnerability assessments.
Vulnerabilities inform the risk assessment and the countermeasure cost-risk-benefit
trade-off.
Discovery of a potentially malicious source from the threat assessment may warrant
additional checks for vulnerabilities in other (less-critical) products procured from that
source. Therefore, threat assessments can inform vulnerability assessments.
In the Program Protection Plan (PPP), the vulnerability assessment process should be
documented at a high level, along with the person responsible for the process. The
date of the vulnerability assessment, its results, and the planned dates or period of
future vulnerability assessments are also recorded in the PPP.
For each Level I and Level II critical function or component, the program performs a
risk assessment. Figure 13.6.F1 shows the overall risk assessment methodology.
The system impact level from the criticality analysis is used to determine the risk
consequence. The risk likelihood is based upon the vulnerability assessment and the
knowledge or suspicion of threats within the supply chain and potential vulnerabilities
within supplied hardware, software, and firmware products. Each Service and program
may have specific guidance on how to use the threat assessment and vulnerability
assessment to develop the risk likelihood. A basic method which may be used in the
absence of program or service specific guidance is described in this section.
One way to translate the threat assessments and vulnerability assessments into risk
likelihood or probability is to develop specific questions for supply chain and software
assurance. The following paragraphs list two sets of sample Yes/No vulnerability
questions that a program can use to establish the risk likelihood. The first set of
vulnerability questions applies to supply chain considerations.
The second set of sample Yes/No questions applies to software/firmware assurance
considerations.
              o    Common Vulnerabilities and Exposures (CVE)
              o    Common Attack Pattern Enumeration and Classification (CAPEC)
    •    Are static analysis tools used to identify and mitigate vulnerabilities?
    •    Does the software contain Fault Detection/Fault Isolation (FDFI) and
         tracking or logging of faults?
    •    Do the software interfaces contain input checking and validation?
    •    Is access to the development environment controlled with limited
         authorities and does it enable tracing all code changes to specific
         individuals?
    •    Are specific code test-coverage metrics used to ensure adequate testing?
    •    Are regression tests routinely run following changes to code?
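One possible way to turn such Yes/No responses into a risk likelihood is to score the fraction of "No" answers. The thresholds and question wording below are illustrative assumptions, not values prescribed by the DAG:

```python
# Map the fraction of "No" answers on an assurance questionnaire to a
# risk-likelihood bucket. The 0.3/0.6 thresholds are illustrative assumptions.
def risk_likelihood(answers: dict[str, bool]) -> str:
    """answers maps each question to True (Yes) or False (No)."""
    if not answers:
        return "Unrated"
    no_fraction = sum(1 for yes in answers.values() if not yes) / len(answers)
    if no_fraction >= 0.6:
        return "High"
    if no_fraction >= 0.3:
        return "Medium"
    return "Low"

# Hypothetical responses for one component's software assurance questions.
sample = {
    "Static analysis tools used?": False,
    "FDFI with fault logging?": False,
    "Input checking and validation?": True,
    "Development environment access controlled?": True,
    "Code test-coverage metrics used?": False,
    "Regression tests routinely run?": True,
}
```

The "No" answers also point directly at candidate countermeasures, as discussed with Table 13.6.T3 below.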
Table 13.6.T2 provides an example of a table that summarizes the vulnerability and
threat assessment results used to develop the risk likelihood. A table similar to this is
beneficial to the program in understanding the rationale and should be documented in
the Risk section of the Program Protection Plan (PPP). The overall likelihood is derived
from the supply chain risk likelihood, the software assurance risk likelihood and the
threat assessment. The Overall Risk Likelihood may be derived by using a weighted
average of the three inputs or using the highest risk. In the example shown in Table
13.6.T2 , the overall risk likelihood of High was derived by applying equal weights for
the Supply Chain and Software Assurance Risk Likelihood and the Threat Assessment
Risk. The program or service may develop their own specific weightings based upon
their program and domain specific knowledge.
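The equal-weighting example above can be sketched numerically. The 1-5 category scale, the label set, and the rounding rule are illustrative assumptions, since the DAG leaves the specific method to each program or Service:

```python
# Combine supply chain, software assurance, and threat-assessment likelihoods
# into an overall risk likelihood, by weighted average or by worst case.
SCALE = {"Not Likely": 1, "Low": 2, "Medium": 3, "High": 4, "Near Certainty": 5}
LABELS = {value: label for label, value in SCALE.items()}

def overall_likelihood(supply_chain: str, sw_assurance: str, threat: str,
                       weights=(1/3, 1/3, 1/3), use_max=False) -> str:
    """Derive the overall risk likelihood from the three input likelihoods."""
    values = [SCALE[supply_chain], SCALE[sw_assurance], SCALE[threat]]
    if use_max:                       # conservative option: take the highest risk
        return LABELS[max(values)]
    score = sum(w * v for w, v in zip(weights, values))
    return LABELS[round(score)]
```

A program could substitute its own weights (or the `use_max` worst-case rule) based on domain-specific knowledge, as the text above allows.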
Table 13.6.T2. Risk Likelihood Derived From Vulnerability and Threat Assessments

Example rationale entries (supply chain): no supply chain visibility; no supplier
qualification process; no receiving verification; no trusted suppliers.

Example rationale entries (software assurance): no secure design standard; no static
analysis; no Common Vulnerabilities and Exposures (CVE), Common Weakness
Enumeration (CWE), or Common Attack Pattern Enumeration and Classification
(CAPEC) usage; no input validation; no regression test; low test coverage.

Component 2: System Impact II; assessed likelihoods of Low, Not Likely, and Medium;
Overall Risk Likelihood: Low.
The No responses to the questions help to determine the possible countermeasures to
be considered for risk mitigation. A similar table may be created which records the
countermeasures planned and the new risk probability as a result of the planned
mitigations. Table 13.6.T3 provides an example worksheet for planning the
countermeasures and the resulting Risk Likelihood.
Table 13.6.T3. Risk Likelihood After Mitigations

Example mitigations: regression testing added; test coverage increased to 60%;
penetration testing added.
The risk is then incorporated into the program technical risks. The risk entry may look
similar to the following example:
Software Assurance Technical Risks

R1. Field-programmable gate array (FPGA) 123 has high exposure to software
vulnerabilities with potential foreign influence.
   Mitigation Activities: Establish a wrapper to implement secure design standards and
   fault logging, static analysis, increased test coverage, and penetration testing.

Technical Issues

1. May impact performance, cost, and schedule.

Opportunities

O1. Low investment, great benefit for the program and overall for Missile Programs.
   (Low cost, benefit for program and command.)
Ensure that the top program protection risks (very high and high) have a risk cube and
mitigation plans.
13.7. Countermeasures
This section describes the guidance and expectations for Program Protection
countermeasures. Countermeasures are cost-effective activities and attributes to
manage risks to Critical Program Information (CPI) and critical functions and
components. They vary from process activities (e.g., using a blind buying strategy to
obscure end use of a critical component) to design attributes (e.g., Anti-Tamper design
to defeat Critical Program Information (CPI) reverse engineering attempts) and should
be selected to mitigate a particular threat. For each countermeasure being
implemented, the program should identify someone responsible for its execution and a
time- or event-phased plan for implementation.
13.7.1. Anti-Tamper
Anti-Tamper (AT) comprises the systems engineering activities intended to deter
and/or delay exploitation of critical technologies in U.S. defense systems in order to
impede countermeasure development, unintended technology transfer, or alteration of
a system (DoDI 5200.39). Properly fielded Anti-Tamper (AT) should:
A subset of Critical Program Information (CPI) that specifically resides within a weapon
system, its training equipment, or its support equipment must be considered for
protection by Anti-Tamper (AT) techniques to delay or prevent Reverse Engineering
(RE). Critical Technologies can be found in system hardware, embedded software,
application software, and data. Critical Technologies should not be confused or
associated with Critical Technology Elements (CTE); in other words, Critical
Technologies (CTs), as they apply to Anti-Tamper (AT), are not a matter of maturity or
integration level.
         capability development, acquisition and Planning, Programming, Budgeting and
         Execution (PPBE) process planning cycles.
    •    Anti-Tamper (AT) requirements may affect other aspects of a program, such as
         associated maintenance and training devices, and should include end item
         assessment of cost, schedule and performance if not considered at program
         onset.
    •    Anti-Tamper (AT) requirements should be included in (but not limited to) the
         following documents: Request for Proposal (RFP), Statement of Objectives
         (SOO), Statement of Work (SOW), Initial Capabilities Document (ICD),
         Capability Development Document (CDD), Capability Production Document
         (CPD), Acquisition Strategy (AS), Work Breakdown Structure (WBS), Test and
         Evaluation Master Plan (TEMP), Information Assurance Strategy (IAS),
         Systems Engineering Plan (SEP), and Systems Engineering Management Plan
         (SEMP). Anti-Tamper (AT) should also be included in the DoD acquisition
         systems review process (i.e., Systems Engineering Technical Reviews
         (SETRs)). Refer to the DoD Anti-Tamper (AT) Desk Reference for sample
         language and further guidance.
    •    Anti-Tamper (AT) is also applicable to DoD systems during Pre-Planned Product
         Improvement (P 3 I) upgrades as new Critical Technologies (CT) may be added
         to the system. Additionally, Anti-Tamper (AT) should be specifically addressed in
         export sales (direct commercial sales, foreign military sales) and international
         cooperative programs if those systems have Critical Technologies (CT) to
         protect.
    •    Anti-Tamper (AT) also involves risk management. The level of Anti-Tamper (AT)
         should be based on the risk of loss of U.S. control over the asset containing
         Critical Program Information (CPI) (level of exposure) and the operational impact
         (criticality and consequence) if the Critical Program Information (CPI) is lost or
         compromised. Refer to the DoD Anti-Tamper (AT) Guidelines for further guidance.
The DoD Anti-Tamper Executive Agent (ATEA) provides support to the PM by helping
to determine whether or not to implement Anti-Tamper (AT), per DODI 5200.39. The
decision to use or not to use Anti-Tamper (AT) will be documented in a classified annex
to the Program Protection Plan (PPP), referred to as the Anti-Tamper (AT) Plan. The
Anti-Tamper (AT) Plan includes, but is not limited to, the following information:
         component level;
    •    Determination of how long Anti-Tamper (AT) is intended to delay hostile or
         foreign exploitation or reverse-engineering efforts;
    •    The effect that compromise would have on the acquisition program if Anti-
         Tamper (AT) were not implemented;
    •    The estimated time and cost required for system or component redesign if a
         compromise occurs; and
    •    Key Management Plan.
13.7.1.3.1. Process
(Consult the DoD Anti-Tamper (AT) Desk reference for further guidance.)
Note: It is highly recommended that the program contact the Component Anti-Tamper
(AT) Office of Primary Responsibility (OPR) to obtain the name of a Verification and
Validation (V&V) lead. The V&V lead and team will follow the progress of the Anti-
Tamper (AT) Plan implementation and provide consultation. The V&V lead will also
determine whether the Anti-Tamper (AT) Plan meets the required protection level and
will recommend whether the Component Anti-Tamper (AT) OPR should concur or
non-concur with the plan. In addition, the V&V lead will witness the actual testing of the
Anti-Tamper (AT) Plan and provide a memo to the program stating whether the Anti-
Tamper (AT) testing was completed. The V&V lead and team are provided to the
program at no cost, but only as consultants; they will not develop the Anti-Tamper (AT)
Plan, which remains the responsibility of the program office/contractor.
13.7.1.3.2. Sustainment
Note: The U.S. Government and U.S. industry should be protected against warranty
and performance claims in the event Anti-Tamper (AT) measures are activated by
unauthorized maintenance or other intrusion. Such unauthorized activities are regarded
as hostile attempts to exploit or reverse engineer the system or the Anti-Tamper (AT)
measures.
Note: Programs should also plan and budget for Anti-Tamper (AT) maintenance to
include government/contractor investigations of tamper events.
13.7.1.3.3. Packaging
Anti-Tamper (AT)-protected equipment may need specially designed and approved
shipping containers ready upon delivery. While in the supply chain, the containers
should provide the same level of protection from exploitation as is afforded to the
protected Critical Technologies (CT) within the container, or the Anti-Tamper (AT)
measures should remain active during shipping.
    •    The fact that Anti-Tamper (AT) has been implemented on a specific system is
         classified Unclassified/FOUO (For Official Use Only) unless otherwise directed
         (e.g., specific direction requires that system-level Anti-Tamper (AT) information
         be handled at a higher classification level, or the system security classification
         guide classifies system-level Anti-Tamper (AT) higher).
    •    The fact that Anti-Tamper (AT) has been implemented on a specific sub-system
         or even a component of a sub-system is classified SECRET. Refer to the DoD
         Anti-Tamper (AT) Security Classification Guide (SCG) for further clarification.
The DoD Anti-Tamper Executive Agent (ATEA) is responsible for all Anti-Tamper (AT)
policy, consistent with DoDI 5000.02 and DoDI 5200.39 . The office has established
a network of DoD Component Anti-Tamper (AT) points of contact (POCs) to assist
program managers in responding to Anti-Tamper (AT) technology and/or
implementation questions. Additionally, the Acquisition Security Database (ASDB) has
been developed as a common shared database of Anti-Tamper (AT)-related
information.
Anti-Tamper (AT) implementation is tested and verified during developmental test and
evaluation and operational test and evaluation.
The PM develops the validation plan and provides the necessary funding for the Anti-
Tamper (AT) Verification and Validation (V&V) on actual or representative system
components. The Verification and Validation (V&V) plan, which is developed to support
Milestone C, is reviewed and approved by the DoD Anti-Tamper (AT) Executive Agent ,
or Component Anti-Tamper (AT) Office of Primary Responsibility (OPR), prior to
milestone decision. The program office conducts the Verification and Validation (V&V)
of the implemented Anti-Tamper (AT) plan. The Anti-Tamper (AT) Verification and
Validation (V&V) team witnesses these activities and verifies that the Anti-Tamper (AT)
techniques described in the Anti-Tamper (AT) Plan are implemented in the system
and perform according to the plan. The validation results are reported to the
Milestone Decision Authority.
The DoD Anti-Tamper Executive Agent (ATEA) has published DoD Anti-Tamper (AT)
Plan and Verification and Validation (V&V) Plan templates to assist program managers
and contractors with the content required for approval. The latest templates can be
downloaded by registered users at https://www.at.dod.mil/ .
Domestic cases:
The Anti-Tamper (AT) Plans (Initial and Final) are to be created by the government or
government contractor and approved first by the program manager. They are then
submitted to the Component Anti-Tamper (AT) Office of Primary Responsibility (OPR)
60 days prior to Preliminary Design Review (PDR) for initial plans and 60 days prior to
Critical Design Review (CDR) for final Anti-Tamper (AT) Plans. The same approval
timeline holds for the Verification and Validation (V&V) plans (Initial and Final) if
separated from the Anti-Tamper (AT) Plan.
After the program manager has approved the Anti-Tamper (AT) Plan, the Anti-Tamper
(AT) Executive Agent or Component Anti-Tamper (AT) Office of Primary Responsibility
(OPR), provides an evaluation of the Anti-Tamper (AT) Plan and a letter of concurrence
to the program office and Milestone Decision Authority.
Export cases:
         obtained prior to system export.
Information Assurance (IA) is defined as measures that protect and defend information
and information systems by ensuring their availability, integrity, authentication,
confidentiality, and non-repudiation.
All mission critical functions and components, and information systems storing,
processing, or transmitting Critical Program Information (CPI) must be appropriately
protected, regardless of whether the information systems are owned and controlled by
the Department of Defense or by external entities. Programs with identified Critical
Program Information (CPI) need to ensure that the Critical Program Information (CPI) is
protected in every computing environment that hosts it, or over which it is transmitted.
With the requirement to identify system critical functions and associated components,
programs need to determine the Information Assurance (IA) controls required for their
protection. (See Chapter 7 for further details on Information Assurance (IA)
implementation.)
DoDD 8500.01E and DoDI 8500.2 detail the policy, process, and procedures for
implementing appropriate Information Assurance (IA) into DoD information systems.
They mandate a controls-based approach, which considers a system's assigned Mission
Assurance Category (MAC) and Confidentiality Level (CL) in determining the required
robustness of the Information Assurance (IA) controls to be implemented. DoD information
systems with Critical Program Information (CPI) must be accredited in accordance with
DoDI 8510.01 (DIACAP). The DoD Information Assurance Certification and
Accreditation Process (DIACAP) establishes a standard process, set of activities,
general task descriptions, and a management structure to certify and accredit
information systems throughout the system lifecycle. The DoD Information Assurance
Certification and Accreditation Process (DIACAP) provides an independent validation
process that verifies that appropriate protection measures have been implemented,
tested, and maintained, and that any residual risk is at an acceptable level for system
operation.
weapons system being acquired, which may also contain Critical Program Information
(CPI).
    •    Inventory of Critical Program Information (CPI) (item, site, system hosting, and
         Information Assurance (IA) Point of Contact (POC)) is complete.
    •    Any supplemental Information Assurance (IA) controls specified to protect Critical
         Program Information (CPI) are incorporated into the DoD Information Assurance
         Certification and Accreditation Process (DIACAP) implementation plan or
         equivalent security requirements traceability matrix.
The details of the program's Information Assurance (IA) approach to protecting all
Critical Program Information (CPI) and system critical functions and components should
be documented in the Countermeasures subsections of the Program Protection Plan
(PPP), and should address the content of Sections 13.7.2.1 and 13.7.2.2 , as
applicable.
track software assurance protections throughout the acquisition. The progress toward
achieving the plan is measured by actual accomplishments/results that are reported at
each of the Systems Engineering Technical Reviews (SETRs) and recorded as part of
the Program Protection Plan.
The Program Protection Plan (PPP) Outline and Guidance requires acquisition
programs to address software assurance responsibilities for the planning and
implementation of program protection countermeasures. Such countermeasures
address the anticipated attacks a system may experience from the threats it will face by
eliminating or reducing vulnerabilities. The countermeasures are selected with an
understanding of which parts of the software are the most critical to the success of the
mission. The plan includes a sample Software Assurance Countermeasures Table,
which summarizes the planned and current state of a program's software assurance
activities. The table is also used as part of a vulnerability assessment to identify
operational, developmental, design, COTS, and software tool vulnerabilities that can
be addressed by planning and implementing software assurance countermeasures.
The table in the PPP is divided into three sections that provide different vulnerability and
countermeasure perspectives on software assurance plans and implementation:
13.7.3.1. Development Process
The purpose of this section of the table is to measure and explicitly capture the
assurance activities conducted during software development and the integration of off-
the-shelf components. As appropriate to the risk of compromise and criticality of the
software in question, PMs are to analyze the development activities for:
Not all software will require the same level of software assurance activities and
mitigation planning and implementation. In programs with millions of lines of code, there
may be some functions (perhaps a monthly reporting feature) that are less mission-
critical than others (perhaps a satellite station-keeping module). It may also be difficult to
perform some types of assessment and mitigation activities on COTS software for which
the source code is not available. Note that in such cases software-related risks still
exist and may be unmitigated. The software assurance table in the PPP recognizes
these varying types of software and allows for differing plans/implementation of
assurance as needed.
The establishment and update of secure design and code standards by the program
should address the potential types of attacks the system would face and draw upon
DoD, Government, Federally Funded Research and Development Center (FFRDC),
academia, commercial web site, and industry sources for mitigation approaches and
methods to address those attacks that could impact the system's mission capabilities.
The list of attack patterns captured in the Common Attack Pattern Enumeration and
Classification (CAPEC) collection can be used to help consistently analyze a system for
the potential types of attacks it may face and to bring consistency into the validation
activities when the program is verifying that the design and coding standards are being
followed.
Because of the subtle nature of most weaknesses in code that lead to unreliable,
insecure, and brittle applications easily influenced by attackers, it is important that code
inspections utilizing tools be part of the approach used to minimize these weaknesses.
There are over 700 documented types of weaknesses in code, design, architecture, and
implementation captured in the Common Weakness Enumeration (CWE) catalog, but
not all of them are equal threats to any specific application or system. Programs may
wish to draw upon secure design and coding approaches such as the CERT Top 10
Secure Coding Practices (
https://www.securecoding.cert.org/confluence/display/seccode/Top+10+Secure+Coding
+Practices ) and the Common Weakness Enumeration (CWE)/ SysAdmin, Audit,
Network, Security (SANS) Top 25 Most Dangerous Software Errors (
http://cwe.mitre.org/top25/index.html ) to establish and update their secure design and
coding standards. At a minimum, the code inspection is used to check for conformance
to the secure design and coding standards established for the program.
An important part of the code inspection is to identify the subset of the overall CWE
collection to focus on initially. Alternate approaches to focusing on a subset of the
weaknesses are described in the CWE paragraph below (13.7.3.1.6.) and the CAPEC
paragraph (13.7.3.1.5.) . These approaches can be used independently or in
combination.
Because of the dynamic nature of the threat environment and information about how
systems can be compromised through software weaknesses, the program should have
a methodology to periodically update their secure design and coding standards so that
reviews using them address new types of attacks and types of weaknesses.
The next three sections of this document describe the middle three columns of the PPP
Software Assurance Table, which are meant to capture how the established
vulnerability (CVE), weakness (CWE), and attack pattern (CAPEC) collections are being
used by the project team to identify and mitigate the most dangerous types of
vulnerabilities in the software. These columns are further defined below. The most
critical part of completing these three columns is twofold: the analysis of which CVEs,
CWEs, and CAPECs should be used as the denominator of the percentage
calculations, and the project team's documentation of the rationale and methodology
followed in determining those lists and keeping them current throughout the project as
the system design, development, and testing progress and the threat environment and
other factors change.
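The percentage mechanics described above can be sketched as follows. This is a minimal illustration, not a prescribed implementation: the function name `coverage_percent` and the example CWE identifiers are hypothetical stand-ins for a program's own curated denominator list and its record of what has been analyzed and remediated.

```python
def coverage_percent(applicable, addressed):
    """Percentage of the applicable entries (the denominator list of CVEs,
    CWEs, or CAPECs) that the program has analyzed and acceptably covered."""
    applicable = set(applicable)
    if not applicable:
        return 100.0  # nothing judged applicable; vacuously complete
    covered = applicable & set(addressed)
    return 100.0 * len(covered) / len(applicable)

# Hypothetical example: CWEs judged applicable to the system versus those
# already covered by static analysis or attack-pattern-driven test cases.
applicable_cwes = {"CWE-79", "CWE-89", "CWE-120", "CWE-787"}
covered_cwes = {"CWE-79", "CWE-89", "CWE-787"}
print(coverage_percent(applicable_cwes, covered_cwes))  # 75.0
```

The substantive work is in deciding the denominator (which entries apply) and documenting that rationale; the arithmetic itself is trivial.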
13.7.3.1.4. Common Vulnerabilities and Exposures (CVE)
Common Vulnerabilities and Exposures (CVE) information is used to identify, track, and
coordinate mitigation activities for the publicly known vulnerabilities in commercial
(COTS) and open source software that are often used by threat actors/agents to
attack systems. Programs that incorporate COTS software into their systems should
perform regular searches of the CVE list before purchase and throughout the software
lifecycle to understand vulnerabilities in those COTS software components and assess
the potential threat to mission success.
The CVE list is a compilation of publicly known information about security vulnerabilities
and exposures. The list is international in scope, free for public use, and referenced in
most commercial tools that scan operational systems and networks for vulnerabilities.
The CVE list can be used to identify publicly known software vulnerabilities that could:
CVE is intended for use by security experts, so it assumes a certain level of knowledge.
Programs should use a tool during incremental software testing of their commercial and
open source packages that scans those operational components and matches the
results against the CVE dictionary. Alternately, the CVE list can be reviewed directly for
any publicly known vulnerabilities affecting the software packages being used by a DoD
program. A list of CVE-compatible tools is available at
http://cve.mitre.org/compatible/product.html .
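The matching step can be sketched as a lookup of an inventoried package/version pair against a locally maintained extract of the CVE list. The package names, versions, and CVE identifiers below are hypothetical; an actual program would rely on a CVE-compatible scanning tool rather than a hand-rolled lookup like this.

```python
# Hypothetical local extract of the CVE dictionary, keyed by exact
# (package, version) pairs observed in the program's software inventory.
cve_extract = {
    ("examplelib", "1.2.0"): ["CVE-2013-0001"],
    ("examplelib", "1.2.1"): [],
    ("othertool", "4.0"): ["CVE-2013-0002", "CVE-2013-0003"],
}

def known_cves(package, version):
    """Return the publicly known CVEs recorded for an exact package/version;
    an unknown pair yields an empty list (no recorded vulnerabilities)."""
    return cve_extract.get((package, version), [])

# Match the fielded inventory against the extract.
inventory = [("examplelib", "1.2.0"), ("othertool", "4.0")]
findings = {pkg: known_cves(pkg, ver) for pkg, ver in inventory}
```

Each non-empty entry in `findings` would then feed the remediation and residual-risk analysis the text describes.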
The CVE column in the Program Protection Plan Software Assurance table reports the
planned and actual percentages of software components that incorporate COTS or
open source that have been analyzed and acceptably remediated against the CVEs
from the CVE list that apply to those COTS and open source packages.
Supportive analysis by the project team must record the CVEs found, the remediation
applied, and the residual risk to the mission of any unresolved CVEs. To identify which
CVEs should be included in the analysis, the list of CVEs for each COTS and open
source product should be tracked, and those that were remediated documented as such.
For each COTS and open source package utilized as part of the system, the project
staff should determine whether an explicit vulnerability advisory/alert activity is
provided/offered by the provider/developer of those packages.
For providers that do not offer publicly available advisories/alerts about security issues
that need to be resolved, the project staff should carefully consider the risk they are
inheriting from a developer that does not provide patch information in a manner that
allows CVE identifiers to be assigned. Without CVE identifiers it is much harder to track
and manage the state of deployed software within the DoD’s vulnerability management
practice and the automation tooling deployed within the DoD. 100% of developmental
Critical Program Information (CPI) software and developmental critical-function software
packages, whether COTS or open source, must be evaluated using CVE, to surface
exposures inherited by incorporating open source or COTS libraries or products.
If the selected tool outputs any CVE with a CVSS score above medium (4), programs
should mitigate the highest-priority vulnerability first and then work through the next
highest-priority issue until the residual risk represented by the remaining vulnerabilities
is acceptable to the mission owner. CVEs that are included in any DoD Information
Assurance Vulnerability Management (IAVM) alerts and advisories should be addressed
in accordance with the priorities and timeframe included in the IAVM from DISA.
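The prioritization rule above (work highest CVSS score first, down to the medium threshold) can be sketched as a simple sort-and-filter. The CVE identifiers and scores are hypothetical examples, and real prioritization would also weigh IAVM direction and mission context.

```python
# Hypothetical scanner findings: (CVE identifier, CVSS base score).
findings = [
    ("CVE-2013-1111", 9.3),
    ("CVE-2013-2222", 3.5),
    ("CVE-2013-3333", 6.8),
]

CVSS_THRESHOLD = 4.0  # "medium" per the guidance above

def remediation_queue(findings, threshold=CVSS_THRESHOLD):
    """CVEs scoring above the threshold, ordered highest score first,
    so the highest-priority vulnerability is mitigated first."""
    above = [f for f in findings if f[1] > threshold]
    return sorted(above, key=lambda f: f[1], reverse=True)

queue = remediation_queue(findings)
# queue -> [("CVE-2013-1111", 9.3), ("CVE-2013-3333", 6.8)]
```

Items below the threshold drop out of the queue but would still be documented as accepted residual risk.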
CAPEC is an international-in-scope, free-for-public-use catalog of attack patterns
outlining information such as a comprehensive description of the phases and steps in
attacks, the weaknesses they are effective against (using CWEs), and a classification
taxonomy that can be used for the analysis of common attack patterns. CAPEC attack
patterns cover a wide variety of families of attacks, including: data leakage attacks,
resource depletion attacks, injection attacks, spoofing attacks, time and state attacks,
abuse of functionality attacks, attacks using probabilistic techniques, attacks exploiting
authentication, attacks exploiting privilege/trust, attacks exploiting data structure,
resource manipulation attacks, network reconnaissance, social engineering attacks, as
well as some physical security attacks and supply chain attacks.
Attack patterns may influence the selection of the COTS and open source software
products, programming languages, and design alternatives. By understanding the
attacker's perspective and how a program's software is likely to be attacked, programs
can directly consider these exploit attempt methods and mitigate them with design,
architecture, coding, and deployment choices that will lead to more secure software.
Programs should identify the set of attack patterns that pose the most significant risk
and leverage them at each stage of the Software Development Lifecycle (SDLC). A
discussion of how to use CAPEC in this manner is available on the Engineering for
Attack page on the CWE site ( http://cwe.mitre.org/community/swa/attacks.html ). This
is the same basic methodology described in the new ISO/IEC Technical Report 20004,
"Refining software vulnerability analysis under ISO/IEC 15408 and ISO/IEC 18045" ,
which describes an alternate approach for doing a vulnerability analysis of a software-
based system under the Common Criteria regime. ISO/IEC 15408 and ISO/IEC 18045
are the two standards that guide and describe the Common Criteria evaluation
methodology.
That page describes how an analysis using attack patterns to represent the expected
threat can be used to identify the subset of weaknesses those attacks would be
effective at exploiting. That list can then be used to influence choices about design and
architecture, considering the planned operational use, the creation of security policies
and requirements, and the risks related to the system's intended use. This list of
weaknesses, the ones exploitable by the attack patterns the system's adversaries are
capable of using against the system, can be used to identify the subset of relevant
CWE weaknesses to avoid and to vet for during implementation. The list's associated
CAPECs can be used to guide software testing by identifying high-priority test cases
that should be created for risk-based security testing, penetration testing, and red
teaming. [1]
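The derivation just described, from applicable attack patterns to the CWE subset to vet for, amounts to taking the union of the weaknesses each pattern exploits. The CAPEC-to-CWE mapping fragment below is hypothetical and heavily abbreviated; the authoritative mappings live in the CAPEC catalog entries themselves.

```python
# Hypothetical fragment of a CAPEC-to-CWE mapping (names abbreviated).
capec_to_cwes = {
    "CAPEC-66 (SQL Injection)": {"CWE-89"},
    "CAPEC-63 (Cross-Site Scripting)": {"CWE-79"},
    "CAPEC-100 (Overflow Buffers)": {"CWE-120", "CWE-787"},
}

def relevant_cwes(applicable_capecs, mapping=capec_to_cwes):
    """Union of the CWEs exploitable by the attack patterns judged
    applicable to the system; this is the subset to avoid and vet for."""
    cwes = set()
    for capec in applicable_capecs:
        cwes |= mapping.get(capec, set())
    return cwes

# Attack patterns the threat analysis judged the adversary capable of using.
threat_capecs = ["CAPEC-66 (SQL Injection)", "CAPEC-100 (Overflow Buffers)"]
print(sorted(relevant_cwes(threat_capecs)))  # ['CWE-120', 'CWE-787', 'CWE-89']
```

The resulting CWE subset then drives both the static-analysis configuration and the CAPEC-inspired test cases discussed below.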
The CAPEC column in the Program Protection Plan Software Assurance table reports
the planned and actual percentages of developed software components that have been
evaluated utilizing the attack patterns from the CAPEC list to identify the appropriate
sub-set of CWEs, to consider alternate design and architectures or implementations, or
to drive the creation of appropriate misuse and abuse test cases.
Supportive analysis by the project team must record the CAPECs identified as germane
to the system, the CWEs identified as being susceptible to those CAPECs, and the
remediation applied, along with an understanding of the residual risk to the mission of
any CWEs that were not tested by simulating CAPECs against the system. To identify
which CWEs should be included in the testing analysis based on CAPEC-inspired test
cases, the list of CWEs reviewed by the static analysis tools/services should be tracked,
and those that were identified, covered by the analysis tool/service, and appropriately
remediated should be documented as such. For each CWE that was not covered by a
static analysis tool/service, the project staff should determine whether an appropriate
CAPEC-inspired test case or Red Team activity was conducted without finding an
exploitable CWE.
For those CWEs that were not covered by static analysis or testing, the project staff
should carefully consider the risk to the mission from the potential of those weaknesses
remaining in the system. Without demonstrable evidence that the CWEs an attacker
could exploit are mitigated, there will always be some level of risk, and it is incumbent
on the project staff to document this residual risk for the end user so they can manage
that risk when the system is deployed within the DoD. 100% of developmental Critical
Program Information (CPI) software and developmental critical-function software should
be evaluated against the CAPEC list.
The Common Weakness Enumeration (CWE) is international in scope and free for
public use. CWE provides a unified, measurable set of software weaknesses to enable
more effective discussion, description, selection, and use of software security tools and
services to find weaknesses in source code and operational systems components as
well as to better understand and manage software weaknesses related to architecture
and design.
CWE is targeted to developers and security practitioners. Programs should use CWE-
compatible tools to scan software for CWEs. A list of CWE-compatible products is
available at http://cwe.mitre.org/compatible/product.html .
The CWE column in the table reports the planned and actual percentages of developed
software components that have been evaluated utilizing the weaknesses from the CWE
list to identify the appropriate sub-set of CWEs, to consider alternate design and
architectures or alternate coding constructs.
Supportive analysis by the project team must record the subset of CWEs identified as
being most germane to the secure operation of the system. The subset of CWEs can be
taken from the CWE/SANS Top 25 Most Dangerous Software Errors list or identified by
utilizing the Common Weakness Risk Analysis Framework (CWRAF) to determine the
subset of CWEs that are the most dangerous to the system's mission, given what the
software is doing for the mission. CWRAF allows a project team to create its own list of
the most dangerous CWEs based on the specifics of the system and which failure
modes are the most important to mitigate/prevent.
The CWE/SANS Top 25 Most Dangerous Software Errors list on the CWE and SANS
Web sites provides detailed descriptions of the top 25 programming errors along with
authoritative guidance for mitigating and avoiding them.
The CWRAF methodology is described on the CWE web site and numerous examples
are provided to help a project team learn how to apply the methodology to their system
in combination with the Common Weakness Scoring System (CWSS).
By using CWSS, a program can also encode its specific list of dangerous CWEs into its tools, so that the mission risk assigned to weaknesses found during static analysis, dynamic analysis, or penetration testing reflects the relative importance of those impacts.
The CWE web site is at http://cwe.mitre.org and the CWSS web page is at
http://cwe.mitre.org/cwss/ .
Additionally, the project team should have a documented understanding of the residual mission risk of any CWEs that were neither reviewed by static analysis tools/services nor tested by simulating the CAPECs that would be effective against those CWEs. For CWEs deemed dangerous but not covered by a static analysis tool/service, the project staff should determine whether an appropriate CAPEC-inspired test case or Red Team activity was conducted without finding an exploitable CWE.
For those CWEs that were not covered by static analysis or testing, the project staff should carefully consider the risk to the mission from the potential of those weaknesses remaining in the system. Without demonstrable evidence that the CWEs an attacker could exploit are mitigated, there will always be some level of risk; it is incumbent on the project staff to document this residual risk for the end user so that it can be managed when the system is deployed within the DoD. 100% of developmental Critical Program Information (CPI) software and developmental critical-function software should be evaluated against the identified subset of the CWE list.
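The residual-risk bookkeeping described above can be sketched very simply: for each CWE in the program's identified subset, record whether it was covered by static analysis or by a CAPEC-inspired test, and report the uncovered remainder for documentation. All CWE IDs and coverage data below are hypothetical.

```python
def residual_cwes(subset, static_covered, test_covered):
    """Return CWEs from the identified subset with no demonstrated coverage."""
    return sorted(set(subset) - set(static_covered) - set(test_covered))

subset = ["CWE-78", "CWE-89", "CWE-120", "CWE-306"]   # program-identified subset
static_covered = ["CWE-89", "CWE-120"]                # checked by static analysis
test_covered = ["CWE-78"]                             # exercised by CAPEC-inspired tests
residual = residual_cwes(subset, static_covered, test_covered)
# 'residual' (here, CWE-306) is what must be documented for the end user.
```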
In addition to the above listed MITRE websites, PMs should consider best practices
identified at http://www.safecode.org/index.php .
Programs should report what portion of the system will undergo penetration testing. The
purpose of penetration testing is to subject the system to an attack exercise to raise
awareness of exploitable vulnerabilities in the system and accelerate their remediation.
Also the knowledge that a system will undergo penetration testing increases the
vigilance of the software engineers responsible for architecting, designing,
implementing, and fielding the systems.
The text should support the number with a brief explanation of the penetration testing performed and a reference to any supporting reports generated by that testing.
The units used for planned/actual percentages for this metric are at the discretion of the program. They should be explained in the text, be meaningful, and provide insight into the completeness of the testing. For example, a network that exposes a certain
number of protocols may measure the percentages in the space of protocol states. A
system with an API may measure the number of interface functions probed.
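Whatever unit the program chooses, the coverage percentage itself is straightforward to compute. A minimal sketch, with an illustrative API example (the function counts are hypothetical):

```python
def coverage_percent(probed, total):
    """Percentage of the chosen unit space exercised by penetration testing.

    'probed' and 'total' are counts in whatever unit the program selected
    (protocol states, interface functions, etc.).
    """
    if total <= 0:
        raise ValueError("the coverage unit space must be non-empty")
    return 100.0 * probed / total

# An API exposing 40 interface functions, 30 of which were probed:
actual = coverage_percent(probed=30, total=40)  # 75.0
```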
Programs should report on their planned and actual test coverage. Units and metrics for test coverage are at the discretion of the program, but should be meaningful and yield insight into the completeness of the testing regimen.
This section refers to the software and firmware on the fielded system. The field of software assurance countermeasures is rapidly evolving. Successful assessments, techniques, applications, and example outcomes are frequently published in papers that can be found at DoD, Government, Federally Funded Research and Development Center (FFRDC), and commercial web sites. The Carnegie Mellon Software Engineering Institute (SEI), an FFRDC, and MITRE both have searchable libraries containing information about the approaches to Software Assurance indicated in the Program Protection Plan Outline & Guidance , Table 5.3.3-1 Application of Software Assurance Countermeasures.
Identical code for a failed function will most likely suffer the same failure as the original.
For redundancy in software, therefore, a completely separate implementation of the
function is needed. This independence reduces the probability that the failover code will
be susceptible to the same problem.
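The independence requirement can be sketched as two implementations of the same function that share no code, with failover from one to the other. The checksum algorithms below are purely illustrative stand-ins for a redundant function.

```python
def primary_checksum(data: bytes) -> int:
    """Primary implementation: explicit 16-bit accumulation loop."""
    total = 0
    for b in data:
        total = (total + b) & 0xFFFF
    return total

def backup_checksum(data: bytes) -> int:
    """Independent backup implementation sharing no code with the primary."""
    return sum(data) % 0x10000

def checked_checksum(data: bytes) -> int:
    """Use the primary; on any failure, fail over to the backup."""
    try:
        return primary_checksum(data)
    except Exception:
        return backup_checksum(data)
```

Because the backup was written separately, a defect triggered in the primary implementation is less likely to recur in the failover path.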
Design principles applied to software to isolate faults include functions to trap, log, and otherwise keep element failures from affecting other elements and the larger system.
Logs help trace the sources of operational faults. Logs can also be examined to
determine whether there was a malicious attack.
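The trap-and-log pattern described above can be sketched as a wrapper that contains one element's failure and records the data needed for root-cause analysis. The element and function names are illustrative.

```python
import logging

logger = logging.getLogger("fault_isolation")

def isolated_call(element_name, func, *args, fallback=None):
    """Run one element's function; trap failures, log them, return a fallback."""
    try:
        return func(*args)
    except Exception as exc:
        # Capture the event and the data needed for later root-cause analysis.
        logger.error("element %s failed: %r (args=%r)", element_name, exc, args)
        return fallback

# A failing element is contained instead of taking down the larger system:
value = isolated_call("altitude_filter", lambda x: 1.0 / x, 0, fallback=0.0)
```

The log record preserves the failing element, the exception, and its inputs, which supports both fault tracing and later examination for signs of malicious attack.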
Programs reporting a Yes in the table should be prepared to elaborate with technical detail on how the fault isolation mechanisms were employed in the architecture and design for the particular component or sub-system. Failover or fault isolation is also where logging of the failure event, and capture of the relevant data needed to determine its root cause, is best included.
13.7.3.2.3. Least Privilege
Design principle applied to software that limits the number, size, and privileges of
system elements. Least privilege includes separate user roles, authentication, and
limited access to enable all necessary functions but minimize adverse consequences of
inappropriate actions.
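A minimal sketch of a least-privilege check, in which each role is granted only the operations it needs and anything not explicitly granted is denied. The roles and operations are hypothetical.

```python
# Each role is granted only the operations necessary for its function.
ROLE_PERMISSIONS = {
    "operator":   {"read_status", "start_mission"},
    "maintainer": {"read_status", "read_logs", "update_config"},
    "auditor":    {"read_logs"},
}

def is_permitted(role, operation):
    """Deny by default; permit only operations explicitly granted to the role."""
    return operation in ROLE_PERMISSIONS.get(role, set())
```

The deny-by-default stance is the key design choice: an unknown role or an ungranted operation fails closed, minimizing the consequences of inappropriate actions.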
Programs reporting a Yes in the table should be prepared to elaborate with technical detail on how least privilege principles were employed in the architecture and design for the particular component or sub-system.
Programs reporting a Yes in the table should be prepared to elaborate with technical detail on how system element isolation principles were employed in the architecture and design for the particular component or sub-system.
The degree to which software element inputs are checked and validated according to defined criteria and functionality. Input checking and validation should ensure that out-of-bounds values are handled without causing failures and that invalid input events are logged.
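The check-validate-log pattern can be sketched as follows; the field name and bounds are illustrative.

```python
import logging

logger = logging.getLogger("input_validation")

def validate_heading(value):
    """Accept a heading in [0, 360); log and reject anything out of bounds."""
    if isinstance(value, (int, float)) and 0 <= value < 360:
        return True, float(value)
    # Out-of-bounds or wrong-type input is handled without failure and logged.
    logger.warning("invalid heading input rejected: %r", value)
    return False, None
```

The caller always receives a well-defined result, so a bad input can never propagate as a crash, and the log preserves the rejected value for later review.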
Programs reporting a Yes in the table should be prepared to elaborate on which specific anti-tamper techniques are included in the architecture, design, and implementation of the software component or sub-system and what risks they are intended to mitigate.
Software tools used in the development environment (as opposed to the actual fielded
software) are another source of risk to warfighting capability and should be considered
in the Program Protection Plan (PPP). In particular, a compromised development
environment could be leveraged by an attacker to insert malicious code, exploitable
vulnerabilities, and/or software backdoors into the operational software before it is
fielded.
Programs should tailor the contents of the SW Product column in this section of the table to enumerate the software tools pertinent to the program's development environment(s). For each SW product listed, table entries should address the items enumerated in the following columns.
When source code is available, it becomes easier to answer some questions about the
behavior of the tool and detect potential compromise.
Is source code available for the tool? A simple yes or no should suffice. If further information (e.g., coding language, code size, licensing cost constraints) would provide useful insight, annotate the entry with a note.
Software tools are often updated. These updates are a potential path for an attacker to
compromise the development environment and thus the operational software.
Indicate whether testing for indications of malicious insertion or tool compromise is performed on each update of the tool before that update is incorporated into the
development environment.
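One basic pre-incorporation check is to verify the update file's digest against an independently obtained known-good value before the update enters the development environment. This is a sketch of that single step, not a complete update-vetting process; the file path and digest in the usage are hypothetical.

```python
import hashlib

def update_is_trusted(path, expected_sha256):
    """True only if the update file's SHA-256 matches a known-good digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so arbitrarily large update files can be verified.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256
```

A digest match only shows the file is the one the supplier published; testing for malicious insertion in the update itself remains a separate activity.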
Indicate whether/how any generated code for the system is examined for malicious code or exploitable vulnerabilities potentially inserted by the software tool in question.
In general, the problem of how to effectively inspect generated code for malicious insertion remains an open area of research. From a practical standpoint, it is better to perform some inspection than to ignore the problem entirely; that at least raises the bar for what an attacker needs to do to compromise the system undetected.
Note that in many instances simple sanity checks can be effective in detecting some injected malware. For example, extracting, comparing, and sorting strings might point to a trigger string used to open a backdoor, and decompiling an executable may reveal the presence of opcodes not normally generated by the compiler.
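The string-extraction sanity check mentioned above can be sketched in a few lines: pull printable-ASCII runs out of a binary (as the Unix `strings` utility does) and sort them so an unexpected trigger string stands out during review. The sample bytes and the trigger string are hypothetical.

```python
import re

def extract_strings(blob: bytes, min_len: int = 4):
    """Return sorted printable-ASCII runs of at least min_len bytes."""
    pattern = rb"[\x20-\x7e]{%d,}" % min_len
    return sorted(m.decode("ascii") for m in re.findall(pattern, blob))

# A hypothetical binary with an embedded trigger string:
blob = b"\x00\x01open_backdoor_7f\x02\xffversion=1.2\x03"
strings_found = extract_strings(blob)
# Reviewing the sorted list surfaces the suspicious 'open_backdoor_7f' entry.
```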
Where generated code inspection is deemed of benefit, programs should tailor the form of inspection to the unique aspects of the program and report planned and actual percentages appropriately.
Programs should consider adding columns to this area of the software assurance table, with rationale for the additions, if they judge the additions to significantly reduce the risk of malicious insertion.
Controlling and accounting for printing of technical manuals and other documentation.
[1]
http://capec.mitre.org/documents/An_Introduction_to_Attack_Patterns_as_a_Software_Assurance_Knowledge_Reso
urce.pdf
This section describes how a program can manage supply chain risks to critical program information and critical functions and components with a variety of risk mitigation activities throughout the entire system lifecycle. The Supply Chain Risk Management (SCRM) guidance in this section identifies references that establish a sample of practices for managing supply chain risks. As SCRM techniques and practices continue to evolve, additional guidance will be published to refine the Department's understanding and implementation of SCRM. A variety of existing resources are available to aid in the understanding and implementation of SCRM, including, but not limited to, the following foundational documents:
    •    DTM 09-016 Supply Chain Risk Management (SCRM) to Improve the Integrity of
         Components Used in DoD Systems - Establishes authority for implementing
         Supply Chain Risk Management (SCRM) throughout DoD and for developing
         initial operating capabilities.
    •    DoD Supply Chain Risk Management (SCRM) Key Practices and Implementation
         Guide - Provides a set of practices that organizations acquiring goods and
         services can implement in order to proactively protect the supply chain against
         exploitation, subversion, or sabotage throughout the acquisition lifecycle.
    •    National Defense Industrial Association (NDIA) System Assurance Guidebook -
         Provides guidance on how to build assurance into a system throughout its
         lifecycle, organized around the Under Secretary of Defense for Acquisition,
         Technology, and Logistics (USD(AT&L)) Life Cycle Management Framework.
    •    DoD Instruction O-5240.24, Counterintelligence (CI) Activities Supporting
         Research, Development, and Acquisition (RDA)
Supply chain risk management provides programs with a framework for analyzing all the
risks associated with the supply chain, which enables the determination of what risks
may be mitigated, and what risks may be accepted. This determination will vary based
on the purpose and mission being performed. Applying Supply Chain Risk Management
(SCRM) early in the production lifecycle will allow for earlier mitigations and a more
strategic approach for managing risk.
13.7.4.2. Supply Chain Risk Management (SCRM) Throughout the System
Lifecycle
Figure 13.7.4.2.F1 illustrates how key Supply Chain Risk Management (SCRM)
activities align with the steps in the DoD Acquisition Lifecycle. Activities are organized
by the various roles that should perform the indicated functions/procedures. Due to the
multidisciplinary nature of Supply Chain Risk Management (SCRM), Program Protection
requires careful planning and coordination across multiple stakeholders.
Mitigation of supply chain risks is most effective when identification and implementation occur early in a program's acquisition planning and contracting. Generally, mitigation choices narrow and become more expensive the further into the lifecycle they occur.
Given the amount of information and supply choices that are present in the
marketplace, Operations Security (OPSEC) related to the acquisition process for
programs is vital.
13.7.4.2.1. Criticality Analysis
A cornerstone for identifying supply chain risks and developing supply chain risk management strategies and mitigations for critical components is the criticality analysis. The Work Breakdown Structure (WBS) or System Breakdown Structure (SBS) may be used to annotate the suppliers of the critical components identified by the criticality analysis. A supplier-annotated SCRM WBS or SBS is a helpful tool for tracking and managing supply chain risks. It is a detailed breakdown identifying all system assemblies, subassemblies, and components and their suppliers for, at a minimum, all critical components identified through criticality analysis. It may also include alternative suppliers for all critical components down to the Commercial off-the-shelf (COTS)-item level, with cost, schedule, and performance impact data for each alternative. Although the SCRM WBS or SBS is not a current requirement, it may be an effective way to record, track, and manage the potential suppliers of critical functions as the trade-off analysis between security, performance, and cost is examined.
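One lightweight way to realize a supplier-annotated breakdown structure is a simple data structure keyed by breakdown element, with each critical component carrying its supplier and any alternates with rough cost and schedule impacts. Every element name, vendor, and delta below is hypothetical.

```python
# Supplier-annotated breakdown structure (illustrative entries only).
wbs = {
    "1.2.3 RF Receiver": {
        "critical": True,
        "supplier": "Vendor A",
        "alternates": [{"supplier": "Vendor B",
                        "cost_delta_pct": 8, "schedule_delta_weeks": 6}],
    },
    "1.3.2 Crypto Module": {"critical": True, "supplier": "Vendor C",
                            "alternates": []},
    "1.4.1 Chassis": {"critical": False, "supplier": "Vendor D",
                      "alternates": []},
}

def critical_without_alternates(structure):
    """Flag critical components lacking an identified alternate supplier."""
    return [name for name, info in structure.items()
            if info["critical"] and not info["alternates"]]
```

A query such as this makes it easy to see which critical components have no fallback supplier and therefore concentrate supply chain risk.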
The SCRM SBS may provide insight into any teaming arrangements based on an understanding of the defense industrial base and subsequent supply chain. Prior to Milestone B, manufacturers typically develop their supplier lists and enter into teaming agreements. Because of this, programs may consider requiring oversight of, and input into, any supplier teaming arrangements. The program could put controls in place so that supplier lists provide alternative suppliers for critical components. Between Preliminary Design Review (PDR) and Critical Design Review (CDR), the SCRM WBS or SBS should be provided by suppliers to the government for review and vulnerability/risk assessment. It is essential that the DoD work with potential prime contractors to develop supplier lists and gain insight into potential teaming arrangements. This input is supported by contract clauses such as Consent to Subcontract.
13.7.4.2.3. Securing the Supply Chain Through Maintaining Control Over the
Information and Information Flow
OPSEC
Provenance
It is important to establish and maintain the origin, development history, delivery path, and mechanisms that protect the integrity of critical components, tools, and processes, as well as their associated changes, throughout the lifecycle. This enables accurate supply chain (SC) risk assessment and mitigation, which requires accurate information on the origin of components, how they are developed, and how they are delivered throughout the supply chain. It includes strong system and component configuration management to ensure traceability and protection against unauthorized changes. Selecting suppliers who maintain provenance is the first step to reducing supply chain (SC) risks.
Once critical functions and components have been identified, design and engineering
protections can be employed to reduce the attack surface and reduce what is
considered critical. These protections should further protect intrinsically critical functions
and reduce existing unmediated access to them.
on trusted suppliers of microelectronics, please refer to Section 13.7.5.
Procurement language has been developed and is available for use to help mitigate
supply chain risk through contractual requirements in the Statement of Work (SOW).
Refer to Section 13.13.1.2 below for suggested language.
Organizations can help mitigate supply chain risk down the contract stack by requiring and encouraging suppliers and sub-suppliers to use sound security practices and to allow transparency into their processes and security practices. Contract vehicles should require, encourage, or provide incentives for suppliers to deliver up-to-date information on changes that affect supply chain (SC) risk, such as changes in their suppliers, locations, processes, and technology.
Use of the acquisition and procurement process early in the system lifecycle is a key
way to protect the supply chain by defining and creating supplier requirements and
incentives; using procurement carve-outs and controlled delivery path processes; and
using all-source intelligence in procurement decisions. Source selection criteria and
procedures should be developed in order to encourage suppliers to provide detailed
visibility into the organization, elements, services, and processes. Other procurement
tools may be available to manage the criticality of components and address risk in
acquisition planning and strategy development.
website at http://www.dmea.osd.mil/trustedic.html .
13.7.6.1. Definitions
System Assurance (SA) : The justified confidence that the system functions as
intended and is free of exploitable vulnerabilities, either intentionally or unintentionally
designed or inserted as part of the system at any time during the life cycle. (National
Defense Industrial Association (NDIA) Guidebook, Engineering for System Assurance ,
Ver. 1.0)
DAG Chapter 4 provides comprehensive guidance for Systems Engineering (SE). In this
chapter, Section 13.7.6.3 provides an overview of Systems Security Engineering (SSE)
as a key countermeasure and Section 13.14 provides elaboration on how to include
Systems Security Engineering (SSE) within Systems Engineering (SE). As a contextual starting point, the evolution of specifications and associated baselines across the acquisition lifecycle is shown in Table 13.7.6.2.T1 .
                     Table 13.7.6.2.T1 Evolution of Specifications/Baselines
Table 13.7.6.2.T2 shows the expected maturity of the baselines across the system
lifecycle, according to the Systems Engineering Technical Reviews (SETR) events at
which they should be assessed. It is noteworthy that even as early as the Alternative
Systems Review (ASR), the preliminary system requirements should be identified. For
further details concerning the appropriate system security content of the maturing
baselines as they relate to the Systems Engineering (SE) review timeline, see Section
13.10.2 ( Systems Engineering Technical Reviews ).
[Only the final row of Table 13.7.6.2.T2 is recoverable here: at the Physical Configuration Audit (PCA), the earlier baselines are maintained and the final baseline is finalized and approved.]
Systems Security Engineering (SSE) is a countermeasure in its own right; its key activities are highlighted as follows:
Key Systems Security Engineering (SSE) criteria can be specified for each of the
phases leading up to a major program Milestone, and it is important to establish these
criteria across the full lifecycle in order to build security into the system. Further details
are provided in Section 13.14 .
13.7.7. Security
This section will be updated in the next Defense Acquisition Guidebook (DAG) update.
    •    Classification standards
    •    Export Control guidelines
    •    Foreign disclosure arrangements
    •    Anti-tamper protections
    •    Information Assurance standards
    •    Physical Security Standards
13.8.1. Acquisition Security Database (ASDB)
The Acquisition Security Database (ASDB) is designed to support PMs, Research and
Technology Protection (RTP), Anti-Tamper (AT), Counterintelligence, Operations
Security (OPSEC), and Security personnel supporting Acquisition Programs. It provides
automated tools and functionality to enable efficient and cost-effective identification and
protection of Critical Technologies (CT) and Critical Program Information (CPI). It offers
a federated search capability and facilitates a standardized identification,
implementation, and tracking of Critical Program Information (CPI) countermeasures.
The Acquisition Security Database (ASDB) supports program protection and specifically
the horizontal protection process by providing a repository for Critical Program
Information (CPI) and Countermeasures and offering a collaboration environment for
programs, counterintelligence, security Subject Matter Experts (SMEs), and enterprise
individuals supporting the program protection planning effort. The use of the Acquisition Security Database (ASDB) by DoD Components was directed by an Acquisition, Technology, and Logistics (AT&L) memo on July 22, 2010. DoDI 5200.39 establishes policy to conduct comparative analysis of defense systems' technologies and align Critical Program Information (CPI) protection activities horizontally throughout the DoD.
The Acquisition Security Database (ASDB) supports the horizontal protection process in
three ways. First, the Acquisition Security Database (ASDB) features a federated
search that will allow enterprise individuals to search for similar Critical Program
Information (CPI) based on name, Military Critical Technology List (MCTL), or
technology type. The search results include the names of Program Protection Plans
(PPPs) that match the search criteria. The Critical Program Information (CPI) description within the Program Protection Plan (PPP) can be reviewed and an assessment made as to whether the two CPI are similar enough to require similar risk mitigation levels. Second, after the same or similar
Critical Program Information (CPI) have been identified and determined to require
equivalent risk mitigation, the protections lists in the Program Protection Plans (PPPs)
can be analyzed for applicability. Third, the Acquisition Security Database (ASDB)
provides the collaboration environment for discussions about the comparison of Critical
Program Information (CPI), counterintelligence threats, and planned protections.
The program Acquisition Security Database (ASDB) record should be created as soon
as Critical Program Information (CPI) is identified and updated periodically, as changes
occur, and at each subsequent milestone. Critical Functions/Components are not
identified in the Acquisition Security Database (ASDB).
To request access to the Acquisition Security Database (ASDB), please do the following
from a SIPRNET Terminal:
    2. Select "Register" (on left menu).
    3. Fill out User Information.
    4. Click "Submit Information".
Your access request will be placed into the workflow process for approval. Once approved, you will receive a SIPR and NIPR email from the Acquisition Security Database (ASDB) Technical Team.
If you need assistance, please contact the Acquisition Security Database (ASDB)
Technical Team at (252) 464-5914 or DSN 451-5914.
The process for doing horizontal protection during Program Protection Plan (PPP)
creation, update, or review is outlined below:
         identify other programs with potentially similar Critical Program Information (CPI).
         Consider threat and vulnerability differences between programs.
    4.   Compare planned countermeasure protection against the similar Critical Program
         Information (CPI) and consider threat and vulnerability differences between
         programs.
    5.   If there are perceived discrepancies or concerns, adjudicate the differences at
         the lowest organizational level.
    6.   If the discrepancies are not resolved, escalate to an executive decision making
         organization as determined by Acquisition, Technology, and Logistics (AT&L).
    7.   Incorporate the results of adjudication into the Program Protection Plan (PPP).
If you have any questions or need assistance, please contact the Acquisition Security
Database (ASDB) Technical Team.
When it is likely that there will be foreign involvement in, or access to, the resulting system or related information, the Program Manager must plan for that involvement. Such planning helps identify vulnerabilities to foreign exposure and develop technology security and foreign disclosure guidance, ensuring that foreign access is limited to the intended purpose for the involvement while protecting critical program information, mission-critical functions and components, and other sensitive information.
    •    System interoperability will be needed for use in future coalition operations
The Program Manager should consult with their international programs organization and technology security and foreign disclosure offices (e.g., National Disclosure Policy, Low Observable/Counter Low Observable (LO/CLO), Defensive Security Systems (DSS), Anti-Tamper (AT), Communications Security (COMSEC), Intelligence, etc.) to obtain assistance in addressing this matter. An integrated product team might be established for this purpose. International considerations to be addressed include the following, many of which should be available from the analysis used in developing the International Cooperation section of the Technology Development Strategy:
The Defense Exportability Features (DEF) program was established in the fiscal year
2011 National Defense Authorization Act to develop and incorporate technology
protection features into a system or subsystem during its research and development
phase. By doing this, exportable versions of a system or subsystem can be sold earlier
in the Production and Deployment phase, thereby (1) enabling capability to be available
to allies and friendly countries more rapidly and (2) lowering the unit cost of DoD
procurements.
Prior to the Engineering and Manufacturing Development Phase, programs should
investigate the necessity and feasibility (from cost, engineering, and exportability
perspectives) of the design and development of differential capability and enhanced
protection of exportable versions of the system or subsystem. See Chapter 11.2.1.2.
International Considerations within the Acquisition Management Framework, for a
summary of Defense Exportability Features (DEF) nomination and feasibility
assessment.
The Program Protection Plan (PPP):
    •    Can serve as a focal point for capturing critical Systems Security Engineering
         (SSE) analysis and assessment results as they are gathered and matured.
    •    Will provide previously missing coverage of Systems Security Engineering (SSE)
         activities and associated analyses.
    •    Should contain either references to where the information can be found or the
         actual information.
The Program Protection Plan (PPP) is a plan: a forward-looking document according to
which protection activities will be executed. But it is also a report, which gathers the
analysis and assessment results for effective program protection and system security
as the plan is executed.
In this manner (plan + report), the Program Protection Plan (PPP) serves to provide the
Security Assurance Case (reference International Organization for Standardization
(ISO)/IEC 15026, Part 2, for a discussion of assurance cases), in which the preferred
system concept represents the assurance claims, the system design represents the
assurance arguments, and the test plan results and Requirements Traceability Matrix
(RTM) represent the assurance evidence that can be traced to the system
requirements.
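The claim/argument/evidence relationship described above can be sketched as a minimal data model. This is illustrative only; the class and field names, and the example claim and evidence identifiers, are hypothetical and not drawn from ISO/IEC 15026 or any real program.

```python
from dataclasses import dataclass, field

@dataclass
class Argument:
    """A design rationale supporting a claim, backed by traceable evidence."""
    rationale: str
    evidence: list = field(default_factory=list)   # e.g., test results, RTM entries

@dataclass
class AssuranceCase:
    """Minimal assurance case: a security claim supported by arguments."""
    claim: str
    arguments: list = field(default_factory=list)

# Hypothetical example mirroring the PPP mapping described above:
# preferred concept -> claim, system design -> argument,
# test results and RTM entries -> evidence.
case = AssuranceCase(
    claim="Critical function X is protected against tampering",
    arguments=[Argument(
        rationale="Anti-tamper countermeasure allocated in the system design",
        evidence=["RTM entry SR-042", "V&V test report TR-17"],
    )],
)
```

The point of the structure is that every top-level claim bottoms out in evidence that can be traced back through the arguments that support it.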
The Program Protection Plan (PPP) should tie all these things together. For example:
    •    Table 3.3-1 of the Program Protection Plan (PPP) Outline is used to provide
         traceability of critical components from mission-level documents (Joint
         Capabilities Integration and Development System (JCIDS) Key Performance
         Parameters, Key System Attributes, etc.) and Critical Technology Elements
         (CTE) to the system architecture.
    •    Section 5.3 of the Program Protection Plan (PPP) Outline discusses the
         requirement to indicate the Request for Proposal (RFP) Contract Line Item
         Number (CLIN) or Data Item Description (DID) that will be used to ensure that
         Critical Program Information (CPI) and critical functions/components are
         protected in the development environment and on the system.
    •    Section 9.3 of the Program Protection Plan (PPP) Outline directs the
         implementation of Verification and Validation (V&V) to provide evidence that
         system security has been achieved, including a link to relevant discussion in the
         Test and Evaluation (T&E) documents.
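The three bullets above describe one traceability chain: from mission-level requirement to critical component, to the contract vehicle that tasks its protection, to the verification evidence that protection was achieved. A minimal sketch of that chain follows; all identifiers below are placeholders, not real requirements, CLINs, or test cases.

```python
# Hypothetical end-to-end traceability chain, in the spirit of PPP Table 3.3-1
# and Sections 5.3 and 9.3. All identifiers are placeholders.
traceability = {
    "KPP-01": {                                  # mission-level requirement (Table 3.3-1)
        "critical_component": "guidance_processor",
        "contract_vehicle": "CLIN-0007",         # protection tasking on contract (Section 5.3)
        "verification": "V&V test case TC-113",  # evidence of achievement (Section 9.3)
    },
}

def verify_chain(trace: dict) -> bool:
    """Return True only if every critical component has both a contract
    vehicle and a verification artifact, i.e., no unprotected or
    unverified link in the chain."""
    return all(
        entry.get("contract_vehicle") and entry.get("verification")
        for entry in trace.values()
    )

assert verify_chain(traceability)
```

A broken link anywhere in the chain (a critical component with no contract tasking, or tasking with no verification) is exactly the gap the PPP is meant to expose.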
The last bullet above indicates one method of checking Program Protection Plan (PPP)
implementation (i.e., Verification and Validation (V&V)). Audits and inspections are also
used, namely to ensure compliance with applicable laws, regulations, and policies.
Engineering reviews are used to ensure that system security requirements are
identified, traceable, and met throughout the acquisition lifecycle. These methods of
checking Program Protection Plan (PPP) implementation are described in the following
subparagraphs.
13.10.1. Audits
Program Protection Surveys and other security audits and inspections check for
compliance with protection requirements. These processes check that statutory and
regulatory security activities are being performed. Program Managers (PMs) should
check with their Component research and development acquisition protection resources
for guidance on performing these audits.
Section 9.2 of the Program Protection Plan (PPP) Outline requires that the Program
Protection Plan (PPP) answer these questions:
The Systems Engineering Technical Reviews (SETR) process provides a key Systems
Engineering health and risk assessment tool, discussed in detail in Chapter 4.
The following subparagraphs provide key Systems Security Engineering (SSE) and
Supply Chain Risk Management (SCRM) criteria, recommended as Systems
Engineering Technical Reviews (SETR) entry/exit criteria, to assess and ensure that
program protection activities and analyses are conducted with appropriate rigor and
discipline across the full system context.
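Because the subparagraphs that follow are structured as entry/exit criteria checklists, their use at a review gate can be sketched as follows. This is illustrative only, not an official gating tool; the criteria text is paraphrased and the status values are made up.

```python
# Illustrative SETR gate checklist; criteria text is paraphrased from the
# recommended criteria in this section, and the status values are invented.
criteria = {
    "Criticality analysis performed and documented for the PPP": True,
    "Threat data current and relevant": True,
    "TD-phase SOW includes SSE and SCRM tasks": False,
}

def ready_for_review(checks: dict) -> bool:
    """A review gate is satisfied only when every criterion is met."""
    return all(checks.values())

assert not ready_for_review(criteria)  # one open item blocks the gate
```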
System security objectives and criteria are in the process of being defined and will be
included in the next update.
    •    System security and Supply Chain Risk Management (SCRM) are addressed as
         part of the alternative systems analysis and the development of the preferred
         system concept.
    •    The preferred system concept is based on an initial criticality analysis, using
         current threat data, informed by supply chain risk identification.
    •    Potential countermeasures for candidate Critical Program Information (CPI) and
         critical functions are identified.
    •    Plans are defined to protect critical functions/components, processes, tools, and
         data.
    •    The Statement of Work (SOW) for the Technology Development (TD) phase
         Request for Proposal (RFP) includes appropriate tasks for Systems Security
         Engineering (SSE) and Supply Chain Risk Management (SCRM).
Recommended Criteria:
    •    Were system security and Supply Chain Risk Management (SCRM) addressed
         as part of the alternative systems analysis and the development of the preferred
         system concept?
            o Was an initial criticality analysis performed and documented for the
                Program Protection Plan (PPP)?
            o Did it use relevant, current threat data and potential vulnerability
                information?
            o Were the preferred system concept's critical function/component alternatives
                evaluated for supply chain risks through the threat assessment center and
                used to add constraints to the system requirements?
            o Are the preferred concept engineering analysis and Request for Proposal
                (RFP) requirements being informed by supply chain risk considerations
                such as limited potential suppliers and defensive design?
            o Were criticality analysis results used to determine and evaluate candidate
                Critical Program Information (CPI) and critical functions, with rationale?
            o Have candidate countermeasures and possible sub-countermeasures
                been identified, with an emphasis on logic-bearing components and supply
                chain risks?
            o Did the analysis include the full system context, including the multiple
                systems that support end-to-end mission threads?
    •    Was Systems Security Engineering (SSE) an integral part of the Milestone A
         phase systems engineering analysis?
            o Did all of the Systems Security Engineering (SSE) and Supply Chain Risk
                Management (SCRM) considerations and analyses inform the
                identification of requirements in the preliminary system requirements
                document (SRD)?
            o Have potential subsystem and component alternatives for critical functions
                been evaluated for potential suppliers, software assurance, and system
                assurance risks?
            o Has the assessment of security risks resulted in system security
                requirements in the System Requirements Document (SRD)?
            o Have residual Systems Security Engineering (SSE) based program
                protection risks and supply chain risks been identified for mitigation?
    •    Are plans in place to protect critical components, processes, tools, and data?
            o Do they promote awareness and provide personnel training on supply
                chain risks?
            o Are plans to define and protect critical processes, including the identity of
                users and system uses, included?
            o What tools are being used, how are they being protected (physically and
                operationally), and how are tools and data managed (including hardware
                development tools, software development tools, developer collaboration
                tools, and configuration management tools)?
    •    Have appropriate tasks been included in the Statement of Work (SOW) for the
         Technology Development (TD) phase Request for Proposal (RFP)?
            o Are specific responsibilities for Supply Chain Risk Management (SCRM)
               and for updated criticality analyses to assess critical functions and refine
               the identification of critical components included?
            o Are competitive prototyping and design included, as appropriate, for
               candidate Critical Program Information (CPI) and critical functions?
            o Are tasks to develop associated protection countermeasures included,
               based on the previously identified potential protection countermeasures
               and the system security requirements in the System Requirements
               Document (SRD)?
            o Is the use of software assurance databases and techniques (e.g.,
               Common Vulnerabilities and Exposures (CVE), Common Attack Pattern
               Enumeration and Classification (CAPEC), Common Weakness
               Enumeration (CWE), static analysis, and penetration testing) included?
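As a toy illustration of the kind of weakness scanning these techniques involve, the sketch below flags calls to C library functions commonly associated with buffer-overflow weaknesses (e.g., CWE-120). Real static analyzers keyed to CWE entries perform far deeper data-flow and path analysis; the function list here is only a common example.

```python
import re

# Toy weakness scan in the spirit of CWE-keyed static analysis.
# Real tools perform far more sophisticated analysis than pattern matching.
UNSAFE_CALLS = ("strcpy", "gets", "sprintf")
PATTERN = re.compile(r"\b(" + "|".join(UNSAFE_CALLS) + r")\s*\(")

def scan(source: str) -> list:
    """Return the names of unsafe calls found in a C source snippet."""
    return [m.group(1) for m in PATTERN.finditer(source)]

snippet = "void f(char *s) { char buf[8]; strcpy(buf, s); }"
assert scan(snippet) == ["strcpy"]
```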
Recommended Criteria:
    •    Were system security and Supply Chain Risk Management (SCRM) concerns
         considered in the development of the system performance requirements and
         non-tailorable design requirements, across the full system context (e.g., System
         of Systems (SoS))?
             o Are the system security and Supply Chain Risk Management (SCRM)
                requirements mutually understood between the Government and
                contractor(s)?
             o Are they testable or otherwise verifiable?
             o Will they lead to a final system that is operationally secure and consistent
                with cost and schedule?
    •    Is Systems Security Engineering (SSE) an integral part of the Technology
         Development (TD) phase systems engineering analysis?
             o Have the contractor(s) performed and summarized their initial criticality
                 analysis (which updates the Government-provided initial criticality
                 analysis, if available)?
             o In the absence of contractors, has the Government performed an updated
                 criticality analysis?
             o Does it include rationale for the selection of Critical Program Information
                 (CPI) and critical functions and potential components?
             o Have lists of initial Critical Program Information (CPI), critical functions
                 (and potential components), selected countermeasures, and potential sub-
                 countermeasures been developed?
             o Have initial threat and vulnerability assessments been performed,
                 tied to the contractor's initial criticality analysis summary?
    •    Is/are the contractor(s) effectively fulfilling Technology Development (TD) phase
         Statement of Work (SOW) tasks for Systems Security Engineering (SSE) and
         Supply Chain Risk Management (SCRM)?
             o Are contractor-refined system security requirements derived from the
                 countermeasures, sub-countermeasures, and defensive design or runtime
                 features selected (e.g., design diversity and least privilege)?
             o Is there a draft allocation of sub-countermeasures and defensive
                 requirements to preliminary design (architecture)? Does that design
                 (architecture) extend to the full system context (e.g., System of Systems
                 (SoS))?
             o Does the Systems Engineering Management Plan (SEMP) describe
                 Systems Security Engineering (SSE) processes, with process updates
                 derived from the countermeasures, sub-countermeasures, and controls
                 selected?
             o Is there a draft allocation of process sub-countermeasures to the
                 acquisition time line and to management and Systems Engineering (SE)
                 sub-processes?
             o Does the contractor's review package include planning to address the
                 Government-provided residual security risk assessment (divided into
                 acquisition, operational, and sustainment sections)?
             o Are tasks, funding, and schedule allocated in order to implement the
                 Systems Security Engineering (SSE) and Supply Chain Risk Management
                 (SCRM) requirements for the system and for management and SE sub-
                 processes?
             o Are appropriate software assurance databases and techniques (e.g.,
                 Common Vulnerabilities and Exposures (CVE), Common Attack Pattern
                 Enumeration and Classification (CAPEC), Common Weakness
                 Enumeration (CWE), static analysis, and penetration testing) being
                 planned and used?
    •    Are relevant Supply Chain Risk Management (SCRM) key practices being
         applied?
             o What development and configuration management tools are being used,
                 how are they being protected (physically and operationally), and how are
                 tools and data managed (including hardware, software, and data
                   configuration management)?
              o    Is defensive design being used across the full system context to anticipate
                   potential ways that an element, system, or process could fail or be
                   misused, so that the architecture and requirements specification of the
                   element, system, or process can minimize failures and misuse?
              o    How are critical elements and processes being protected?
              o    How are trustworthy elements being selected?
              o    How are supply chain assurance concerns being incorporated into the
                   requirements?
              o    Does the contract language cover Supply Chain Risk Management
                   (SCRM) considerations (e.g., the right to subcontract, etc.)?
    •    System security and Supply Chain Risk Management (SCRM) concerns are
         considered in establishing the functional baseline.
    •    An updated criticality analysis is performed, together with updated threat and
         vulnerability assessments, as required.
    •    Lists are updated for candidate Critical Program Information (CPI), critical
         functions and components, selected countermeasures, and potential sub-
         countermeasures.
    •    Relevant Supply Chain Risk Management (SCRM) key practices are being
         applied.
Recommended Criteria:
              o Have Critical Program Information (CPI) and critical-function residual
                vulnerability risks been assessed?
    •    Are the cost and schedule of lower-level Systems Security Engineering (SSE)
         tasks identified and included in the lower level cost and schedule plans?
            o Are detailed Systems Security Engineering (SSE) activities in agreement
                with the Integrated Master Plan (IMP) and Integrated Master Schedule
                (IMS)?
            o Do planning packages include required resources to complete Systems
                Security Engineering (SSE) tasks?
    •    Are relevant Supply Chain Risk Management (SCRM) key practices being
         applied?
            o What development tools are being used by suppliers, how are they being
                protected (physically and operationally), and how are tools and data
                managed (including hardware, software, and data configuration
                management)?
            o Is defensive design being applied to include defensive functions and to
                maximize resilience? Does this extend across the full system context?
            o How are critical elements and processes being protected?
            o How are trustworthy elements being selected?
            o How thoroughly are suppliers and their supply chain(s) being evaluated?
            o Are the plans to promote awareness and provide personnel training on
                supply chain risks being executed?
    •    System security and Supply Chain Risk Management (SCRM) concerns are
         considered in establishing the system allocated baseline.
    •    Preliminary system design, including security, is ready to proceed into detailed
         design; and, stated security performance requirements can be met within cost,
         schedule, risk, and other system constraints.
    •    An updated criticality analysis is performed and an updated list of Critical
         Program Information (CPI), critical functions and components, selected
         countermeasures, and sub-countermeasures is produced, with rationale.
    •    Relevant Supply Chain Risk Management (SCRM) key practices are being
         applied.
Recommended Criteria:
    •    Have system security and Supply Chain Risk Management (SCRM) concerns
         been considered in establishing the system allocated baseline?
            o Has an updated criticality analysis summary been generated, with
               rationale for Critical Program Information (CPI) and critical component
               selection?
            o Was an updated threat and vulnerability assessment summary, with
                respect to the updated criticality analysis summary, included, and were
                supply chain risks included? Does this extend across the full system
                context (e.g., System of Systems (SoS))?
            o Does the updated Critical Program Information (CPI)/critical component
                list include countermeasures and sub-countermeasures?
            o Are inherited Critical Program Information (CPI) and horizontal protection
                adequately assessed, are they being addressed consistently at system
                and subsystem levels, and are they documented in the updated
                Acquisition Strategy (AS), Test and Evaluation Management Plan (TEMP)
                , and Program Protection Plan (PPP)?
            o Have Systems Security Engineering (SSE) and Supply Chain Risk
                Management (SCRM) processes been updated, based on the selected
                countermeasures, sub-countermeasures, and controls?
    •    Does the preliminary system design appropriately include and address security,
         and is it ready to proceed into detailed design?
            o Were System Requirements Document (SRD) security requirement
                trades based on the Program Manager's (PM) assessment of cost,
                schedule, performance, and supply chain risks?
            o Were the security requirements specifications, updated for subsystems
                and components, derived from the countermeasures, sub-
                countermeasures and defensive design or runtime features selected (e.g.,
                defense in depth)?
            o Was an updated residual security risk assessment performed for the
                summary-level critical functions and Critical Program Information (CPI),
                covering acquisition, operations, and sustainment activities (for both
                system and processes, after sub-countermeasures are applied), including
                supply chain considerations? Does this extend across the full system
                context (e.g., System of Systems (SoS))?
            o Has an Anti-Tamper (AT) plan been generated (if it is a contract
                deliverable)?
            o Are appropriate software assurance databases and techniques (e.g.,
                Common Vulnerabilities and Exposures (CVE), Common Attack Pattern
                Enumeration and Classification (CAPEC), Common Weakness
                Enumeration (CWE), static analysis, and penetration testing) used to
                assess vulnerabilities and exposures to attack, common destructive attack
                patterns, and weaknesses in the software architecture and design?
            o Have Critical Program Information (CPI) and critical components and sub-
                components that were categorized as Critical Technology Elements (CTE)
                been demonstrated at a Technology Readiness Level (TRL) 6 or better?
    •    Was an allocation of sub-countermeasures and defensive functions to the
         design/architecture below the configuration item (CI) level performed?
            o Was the critical functionality of each Hardware Configuration Item (HWCI)
                and Computer Software Configuration Item (CSCI) allocated to lower level
                components?
            o Were Systems Security Engineering (SSE) fault isolation tree and system
                response analysis techniques used to define appropriate sub-
                countermeasures?
            o Does the allocated design effectively implement appropriate sub-
                countermeasures?
            o Were system designs that could expose critical functionality to
                vulnerabilities assessed, and were architecture trade-offs evaluated, in
                order to formulate the allocated baseline?
            o Were external and subsystem interface requirements vulnerabilities
                assessed and used as input to the sub-countermeasure selection?
            o Do planned sub-countermeasures for design and implementation include
                software assurance (e.g., fail-safe defaults, defense in depth, purging of
                temporary data, avoidance of unsafe coding constructs, secure languages
                and libraries, and static and dynamic code analysis)?
    •    Are relevant Supply Chain Risk Management (SCRM) key practices being
         applied?
            o What development tools are being used by suppliers, how are they being
                protected (physically and operationally), and how are tools and data
                managed (including hardware, software, and data configuration
                management)?
            o How are critical elements and processes being protected?
            o How are supplier roles constrained and access limited?
            o How thoroughly are suppliers and their supply chain(s) being evaluated?
            o Are the plans to promote awareness and provide personnel training on
                supply chain risks being executed?
    •    System security and Supply Chain Risk Management (SCRM) concerns are
         considered in establishing the system product baseline.
    •    Detailed system design, including security, is ready to proceed into
         fabrication/development; and, stated security performance requirements can be
         met within cost, schedule, risk, and other system constraints.
    •    An updated criticality analysis is performed and updated lists of Critical Program
         Information (CPI), critical components and sub-components, selected
         countermeasures, and specific sub-countermeasures are produced, with
         rationale. This extends across the multiple systems that support the end-to-end
         mission threads.
    •    Relevant Supply Chain Risk Management (SCRM) key practices are being
         applied.
Recommended Criteria:
               rationale, to include a final list of updated Critical Program Information
               (CPI) and critical components and sub-components, together with
               associated countermeasures and explicit sub-countermeasures?
            o Were inherited Critical Program Information (CPI) and horizontal
               protection adequately assessed in the updated criticality analysis?
            o Were adequately robust Systems Security Engineering (SSE) tools used
               in establishing the product baseline; e.g., the use of an updated system
               response matrix and Systems Security Engineering (SSE) fault isolation
               tree techniques?
            o Were appropriate software assurance databases and techniques (e.g.,
               Common Vulnerabilities and Exposures (CVE), Common Attack Pattern
               Enumeration and Classification (CAPEC), Common Weakness
               Enumeration (CWE), static analysis, and penetration testing) used to
               reassess vulnerabilities and exposures and to reexamine weaknesses in
               the software architecture, design, and code?
            o Do sub-countermeasures for implementation and testing include software
               assurance (e.g., purging of temporary data, avoidance of unsafe coding
               constructs, secure languages and libraries, static and dynamic code
               analysis, fault injection, and patch management)?
            o Has a residual vulnerability risk assessment been performed to assess,
               mitigate and re-assess weaknesses in the detailed design, including an
               assessment of security in the operational environment?
    •    Does the detailed system design include and appropriately address security and
         Supply Chain Risk Management (SCRM) considerations, and is it ready to
         proceed into fabrication/development?
            o Have all Systems Security Engineering (SSE) requirements been flowed
               down and mapped to the detailed system design and the lifecycle
               processes?
            o Has an allocation of specific sub-countermeasures to sub-components
               and lower-level items in configuration item (CI) specifications and
               Statement of Work requirements been performed?
            o Have appropriate Systems Security Engineering (SSE) countermeasures
               and sub-countermeasures been allocated to the design with validation
               criteria established (e.g., engineering-in-depth for separation and layering
               of critical elements, addition of defensive function layers, and handling of
               authentication methods)?
            o Does the detailed design incorporate good Systems Security Engineering
               (SSE) practices, such as minimizing the attack surface, the number of
               critical components, and/or the number of potential weaknesses? Do
               these Systems Security Engineering (SSE) practices extend across the
               full system context (e.g., System of Systems (SoS))?
            o Are quantifiable measures being used to assess the detailed design for
               security and for application of countermeasures (corrective actions) to
               address identified deficiencies?
            o Has manufacturability been assessed, including the availability and
               identification of accredited suppliers for secure fabrication of Application-
This document is an accurate representation of the content posted on the DAG website for this Chapter, as of the date of
                                                                                                                   1160
production listed on the cover. Please refer to the DAG website for the most up to date guidance at https://dag.dau.mil
                specific integrated circuits (ASICs), field-programmable gate arrays
                (FPGAs), and other programmable devices?
            o Have validation and verification of system security and Supply Chain Risk
                Management (SCRM) requirements been finalized and reflected in the
                Test and Evaluation Management Plan (TEMP), preliminary test plans,
                Systems Engineering Plan (SEP), and other operational Concept of
                Operations (CONOPS) documents?
    •    Are relevant Supply Chain Risk Management (SCRM) key practices being
         applied?
            o What development tools are being used by suppliers, how are they being
                protected (physically and operationally), and how are tools and data
                managed (including hardware, software, and data configuration
                management)?
            o Was diversification of standard interfaces and defensive design used for
                architecting the system?
            o How will critical elements and processes be protected throughout the
                lifecycle, including disposal of items?
            o How will trustworthy elements continue to be selected throughout the
                lifecycle?
            o How are supplier roles constrained and access limited?
            o How thoroughly are suppliers, their supply chain(s), and their delivery
                processes being evaluated?
            o How are Government supply chain delivery mechanisms being protected?
            o Are the plans to promote awareness and provide personnel training on
                supply chain risks being executed?
    •    System security and Supply Chain Risk Management (SCRM) concerns are
         considered in establishing readiness of the system to begin formal acceptance
         testing, and this extends across the full system context (e.g., System of Systems
         (SoS)).
    •    The system test plans and objectives, including scope, procedures, and test
         facilities, are adequate for appropriately verifying all system security
         requirements.
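The quantifiable measures called for above need not be elaborate; even simple counts of exposed interfaces and unresolved weaknesses per component support an assessment of the detailed design. A minimal sketch (component names, fields, and the zero-weakness threshold are hypothetical illustrations, not DoD-prescribed metrics):

```python
# Minimal sketch: quantify detailed-design security by counting externally
# exposed interfaces (attack surface) and open weaknesses per component.
# All component names and numbers are hypothetical illustrations.

components = {
    "mission_computer": {"exposed_interfaces": 4, "open_weaknesses": 1},
    "datalink_radio":   {"exposed_interfaces": 7, "open_weaknesses": 3},
    "crypto_module":    {"exposed_interfaces": 2, "open_weaknesses": 0},
}

def attack_surface(design):
    """Total count of externally exposed interfaces across the design."""
    return sum(c["exposed_interfaces"] for c in design.values())

def deficient_components(design, max_weaknesses=0):
    """Components whose open-weakness count exceeds the allowed threshold."""
    return [name for name, c in design.items()
            if c["open_weaknesses"] > max_weaknesses]

print(attack_surface(components))        # total exposed interfaces
print(deficient_components(components))  # candidates for corrective action
```

Tracking such counts review-over-review gives the program a defensible basis for claiming the attack surface is being minimized.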
Recommended Criteria:
              o Are the countermeasures for all identified vulnerabilities included and
                   mapped in the detailed design?
             o Do system verification and validation plans (including flow-down from the
                  Test and Evaluation Management Plan (TEMP) to test plans and
                  procedures) adequately ensure that coding and fabrication of designed
                  components provide the required system security?
             o Is it possible to pull the thread from the initial criticality analysis identifying
                  comprehensive security requirements across the lifecycle to the
                  verification/validation efforts (comprising, for example, a set of organized
                  pointers into the Program Protection Plan (PPP), System Requirements
                  Document (SRD), flow-down requirements specifications, design
                  documents, requirements traceability matrix, and the Test and Evaluation
                  Management Plan (TEMP), test plans, and test procedures)?
    •    Are system test plans and objectives, including scope, procedures, and test
         facilities, adequate for appropriately verifying all system security and Supply
         Chain Risk Management (SCRM) requirements, across the full system context
         (e.g., System of Systems (SoS))?
             o Are security threat and attack scenarios included in testing?
             o Does system testing include penetration testing, testing for confidentiality
                  of users and uses, and configuration of elements to limit access and
                  exposure?
             o Are appropriate security test facilities and test equipment, schedule, and
                  personnel planned in the Test and Evaluation Management Plan (TEMP)
                  and lower level test plans, Integrated Master Plan (IMP), and Systems
                  Engineering Plan (SEP); and, are they adequate and available?
             o Do planned measures for verification testing and operational use include
                  software assurance sub-countermeasures/techniques (e.g., static and
                  dynamic code analysis, fault injection, patch management, white- and
                  black-box testing, penetration testing, sandboxing, and honey pot
                  systems)?
             o Have Critical Program Information (CPI) and critical components and sub-
                  components that were categorized as Critical Technology Elements
                  (CTEs) been demonstrated at a Technology Readiness Level (TRL) 7 or
                  better?
    •    The production system is compliant with all functional baseline system security
         requirements and provides full functionality, without exploitable vulnerabilities (or
         with security and supply chain risks mitigated to an acceptable level). This
         extends to the full system context (e.g., System of Systems (SoS)).
    •    An Updated Program Protection Plan (PPP) is being developed for delivery at
         Full Rate Production (FRP).
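Static code analysis, one of the software assurance techniques listed among the verification measures above, can be sketched in miniature. The scanner below only flags calls to a hypothetical risky-function list; an actual program would rely on dedicated assurance tools rather than a toy walker:

```python
# Naive static-analysis sketch: flag calls to risky functions in source code
# without executing it. The RISKY_CALLS list is a hypothetical illustration.
import ast

RISKY_CALLS = {"eval", "exec", "pickle.loads", "os.system"}

def find_risky_calls(source: str):
    """Return (line, name) for each call to a function on the risky list."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            name = None
            if isinstance(node.func, ast.Name):
                name = node.func.id
            elif (isinstance(node.func, ast.Attribute)
                  and isinstance(node.func.value, ast.Name)):
                name = f"{node.func.value.id}.{node.func.attr}"
            if name in RISKY_CALLS:
                findings.append((node.lineno, name))
    return findings

sample = "import os\nos.system('ls')\nresult = eval(user_input)\n"
print(find_risky_calls(sample))  # [(2, 'os.system'), (3, 'eval')]
```

Because analysis operates on the parse tree, it covers code paths that dynamic testing might never exercise, which is why the TEMP pairs it with fault injection and penetration testing.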
Recommended Criteria:
    •    Is the production system compliant with all functional baseline system security
         requirements?
             o Is full system functionality provided without exploitable vulnerabilities, or
                with security and supply chain risks mitigated to an acceptable level?
                Does this include the multiple systems that support the end-to-end mission
                threads, as defined in the Concept of Operations (ConOps)?
             o Are system protection requirements adequate against the system
                specifications flowed from the Initial Capabilities Documents (ICD)?
             o Is there a complete traceability of capabilities to system security
                requirements to detailed design to protection verification methods to
                results?
             o Has a review of all analysis, inspection, demonstration, and test reports of
                compliance to meet security and Supply Chain Risk Management (SCRM)
                requirements been conducted, and do all items meet verification needs?
             o Have all planned Systems Security Engineering (SSE) activities/products
                been implemented, including software assurance, system assurance, and
                Supply Chain Risk Management (SCRM); and, have the results been
                documented?
             o Has a residual vulnerability risk assessment been performed to assess,
                mitigate, and re-assess weaknesses in the system, including an
                assessment of security in the operational environment?
             o Are plans in place to update the residual risk assessment periodically
                during sustainment?
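The complete traceability called for above (capability to security requirement to detailed design to protection verification method to result) can be checked mechanically once the links are recorded. A minimal sketch, with hypothetical record IDs and field names:

```python
# Sketch: verify the end-to-end thread from capability to security requirement
# to design element to verification method to recorded result.
# IDs and records are hypothetical illustrations, not a DoD data standard.

trace = [
    {"capability": "CAP-01", "requirement": "SEC-R-101",
     "design": "DES-22", "verification": "TEST-301", "result": "pass"},
    {"capability": "CAP-02", "requirement": "SEC-R-102",
     "design": "DES-35", "verification": None, "result": None},
]

def traceability_gaps(matrix):
    """Return requirement IDs whose thread breaks before a recorded result."""
    return [row["requirement"] for row in matrix
            if not all(row[k] for k in ("design", "verification", "result"))]

print(traceability_gaps(trace))  # ['SEC-R-102']
```

Any requirement the check reports is a broken thread: a security requirement with no design allocation, no planned verification, or no result, and therefore an open item before production readiness can be claimed.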
13.10.4. Sustainment
While the primary emphasis of Program Protection is on the design and acquisition
phases of a system lifecycle, some sustainment considerations must be addressed if
the protection profile is to survive system delivery. Repair depots, for example, should
be aware of Critical Program Information (CPI) and mission-critical functions and
components on systems they are maintaining so as to appropriately protect these items
from compromise. Further guidance on sustainment considerations for Program
Protection will be provided in the next Defense Acquisition Guidebook (DAG) update.
13.11. Compromises
The Defense Security Service (DSS) presently has responsibility for protecting Critical
Program Information (CPI) that is classified. However, the contract may specifically
assign Defense Security Service (DSS) responsibility to protect Critical Program
Information (CPI) that is controlled unclassified information. Consequently, Defense
Security Service (DSS) would receive reporting on unclassified Critical Program
Information (CPI) incidents if it had specific protection responsibility, or if the incident
could involve foreign intelligence activity or violate the International Traffic in Arms
Regulations (ITAR) or Export Administration Regulations (EAR).
13.12. Costs
The costs of implementing the selected countermeasures that exceed the normal
National Industrial Security Program Operating Manual (NISPOM) costs are recorded in
this section of the Program Protection Plan (PPP).
Based upon this analysis, the program manager can select the countermeasure or
combination of countermeasures that best fits the needs of the program. It may be that
the optimum countermeasure(s) do not fit within the program's constraints but that
other countermeasures can reduce the risk to an acceptable level. In some cases the
program may choose to accept the risk and not implement any countermeasures. The
purpose of this analysis is to enable the program manager to perform an informed
countermeasure trade-off with an awareness of the vulnerabilities and risks to the
system. A summary of the trade-off analysis, along with the rationale for the decision,
should be documented in this section of the Program Protection Plan (PPP).
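The countermeasure trade-off described above can be framed as a small search: among affordable countermeasure combinations, find one that reduces residual risk to an acceptable level. The countermeasure names, costs, risk scores, and thresholds below are hypothetical illustrations:

```python
# Sketch of an informed countermeasure trade-off: choose the cheapest
# combination that brings residual risk within the acceptable level and the
# program's budget. All numbers are hypothetical illustrations.
from itertools import combinations

countermeasures = {          # name: (cost in $K, risk-reduction score)
    "anti_tamper_seal": (50, 2),
    "encryption":       (200, 5),
    "trusted_foundry":  (400, 6),
}
BASELINE_RISK = 10
ACCEPTABLE_RISK = 4
BUDGET = 300

def best_mix(options, budget, baseline, acceptable):
    """Cheapest subset whose residual risk meets the acceptable level."""
    candidates = []
    for r in range(len(options) + 1):
        for combo in combinations(options, r):
            cost = sum(options[c][0] for c in combo)
            residual = baseline - sum(options[c][1] for c in combo)
            if cost <= budget and residual <= acceptable:
                candidates.append((cost, residual, combo))
    # No affordable mix may exist; the program may then accept the risk.
    return min(candidates) if candidates else None

print(best_mix(countermeasures, BUDGET, BASELINE_RISK, ACCEPTABLE_RISK))
```

Here the optimum single countermeasure (trusted foundry) exceeds the budget, but a cheaper combination still reaches acceptable residual risk, which is exactly the situation the trade-off analysis is meant to surface.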
13.13. Contracting
and components (described in Section 13.3 ) followed by the identification of
vulnerabilities (described in Section 13.5 ), a risk assessment and the identification of
potential countermeasures (described in Section 13.7 ).
The system requirements document should include security requirements based upon
the initial countermeasures identified at Milestone A and good security practices. For
example, if a particular Critical Program Information (CPI) component requires anti-
tamper protection, it may carry seal, encryption, environmental, and logging
requirements (see Section 13.7.1). An Information Assurance countermeasure example
may be a requirement to include in the component one of the controls specified in the
DoD Information Assurance Certification and Accreditation Process (DIACAP) (see
Section 13.7.2).
During Request for Proposal (RFP) development, not all of the system security
requirements and design features have been determined. As a result, it is necessary to
transfer a major part of the program protection analysis, specification, and
countermeasure implementation to the contractor to protect the system from loss of
advanced technology, malicious insertion, tampering, and supply chain risks. The
following responsibilities should be considered for inclusion in the Statement of Work:
    •    The contractor shall plan for and implement countermeasures which mitigate
         foreign intelligence, technology exploitation, supply chain, and battlefield threats
         and system vulnerabilities that could result in catastrophic (Level I) and critical
         (Level II) protection failures, including:
            1. The application of supply chain risk management best practices, applied
                as appropriate to the development of the system. Supply chain risk
                management key practices may be found in the National Institute of
                Standards and Technology (NIST) Interagency Report 7622, Piloting
                Supply Chain Risk Management for Federal Information Systems , and the
                National Defense Industrial Association Guidebook, Engineering for
                System Assurance , both publicly available.
            2. The enumeration of potential suppliers of critical components, as they are
                identified, including cost, schedule, and performance information relevant
                to choosing among alternates and to planned selection, for the purpose of
                engaging with the Government to develop mutually agreeable risk
                management plans for the suppliers to be solicited.
            3. The processes to control access by foreign nationals to program
                information, including, but not limited to, system design information, DoD-
                unique technology, and software or hardware used to integrate
                commercial technology.
            4. The processes and practices employed to ensure that genuine hardware,
                software and logic elements will be employed in the solution and that
                processes and requirements for genuine components are levied upon
                subcontractors.
            5. The process used to protect unclassified DoD information in the
                development environment.
    •    The preceding clauses shall be included in the solicitations and subcontracts for
         all suppliers, suitably modified to identify the parties.
    •    The contractor shall develop a set of secure design and coding practices to be
         followed for implementation of Level I and II critical components, drawing upon
         the top 10 secure coding practices
         (https://www.securecoding.cert.org/confluence/display/seccode/Top+10+Secure+Coding+Practices)
         and the Common Weakness Enumeration (CWE)/SysAdmin, Audit, Network,
         Security (SANS) top 25 most dangerous software errors
         (http://cwe.mitre.org/top25/index.html).
    •    The contractor shall develop a Program Protection Implementation Plan (PPIP)
         that addresses the following sections of the Program Protection Plan (PPP)
         outline and example:
             o Section 2: Identifying what to protect
             o Section 4: Vulnerabilities
             o Section 5: Countermeasures
             o Section 7: Program Protection Risks
             o Section 8: Managing and Implementing Program Protection Plan (PPP)
             o Section 9: Process for Management and Implementation of Program
                 Protection Plan (PPP)
             o Section 10: Process for Monitoring and Reporting Compromises
             o Appendix C: Criticality Analysis
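One of the CERT top 10 secure coding practices cited in the Statement of Work items above is input validation. A minimal sketch, with a hypothetical part-number format and inventory; the point is that an identifier is checked against an explicit whitelist pattern before use rather than trusted:

```python
# Sketch of the "validate input" secure coding practice: reject any
# identifier that does not match the expected format before using it.
# The part-number format and inventory are hypothetical illustrations.
import re

PART_ID = re.compile(r"[A-Z]{2}\d{4}")  # e.g., two letters then four digits

def lookup_part(part_id: str, inventory: dict):
    """Whitelist-validate the identifier, then look it up."""
    if not PART_ID.fullmatch(part_id):
        raise ValueError(f"invalid part id: {part_id!r}")
    return inventory.get(part_id)

inventory = {"AB1234": "guidance gyro"}
print(lookup_part("AB1234", inventory))  # guidance gyro
# lookup_part("AB1234'; DROP TABLE parts;--", inventory) raises ValueError
```

Whitelist validation of this kind cuts off entire CWE classes (injection, path traversal) at the boundary instead of relying on every downstream consumer to handle hostile input.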
13.13.1.3. Instructions, Conditions, and Notices to Offerors (Section L)
For many Request for Proposal (RFP) packages, systems security engineering may not
have any explicit clauses in this section. If specific program protection content is
identified for the proposal, the following items should be considered:
    •    The offeror, as part of its technical proposal, shall describe the use of its systems
         security engineering process in specifying and designing a system that is
         protected from loss of advanced technology, malicious insertion, tampering and
         supply chain risks.
    •    The offeror shall describe its Critical Program Information (CPI) identification,
         mission criticality analysis, vulnerability assessment, risk evaluation, and
         countermeasure implementation in arriving at its system specification and
         design.
    •    The offeror shall describe its secure design and coding practices.
For most Request for Proposal (RFP) packages, systems security engineering may not
rise to the level of an evaluation factor. If it does, programs should consider the
following as evaluation criteria:
    •    The extent to which the offeror employs a disciplined, structured systems security
         engineering (SSE) process, including Critical Program Information (CPI)
         identification, criticality analysis, vulnerability assessment, risk analysis and
         countermeasure implementation in arriving at its system specification and design.
13.15. Program Protection Plan (PPP) Review/Approval
13.15.3.1. Coordination
for which the systems engineer is responsible.
The program manager should utilize a Working-level Integrated Product Team (WIPT)
to perform comprehensive program protection analysis. It is the responsibility of the
program manager to ensure that the Working-level Integrated Product Team (WIPT) is
comprised of appropriate representatives (including Systems Engineers (SEs), Systems
Security Engineering (SSE) Subject Matter Experts (SMEs), logistics, system user
representatives, and supporting counterintelligence, intelligence, foreign disclosure, and
security personnel) to ensure a comprehensive analysis of system technology,
hardware, software, firmware, and information. This Working-level Integrated Product
Team (WIPT) or a sub-group (such as a System Security Engineering Working Group
(SSEWG)) should focus on engineering aspects of security. They should define and
identify all Systems Security Engineering (SSE) aspects of the system, develop a
Systems Security Engineering (SSE) architecture, review the implementation of the
architecture, and participate in design validation. This sub-Working Integrated Product
Team (WIPT)/ System Security Engineering Working Group (SSEWG) should be
formed as early in the acquisition process as possible.
Systems Security Engineering (SSE) is the vehicle for integrating research and
technology protection into the Systems Engineering (SE) acquisition process, whereby
Systems Engineering (SE) activities prevent exploitation of critical functions and
components of U.S. defense systems. The benefit of Systems Security Engineering
(SSE) is derived after acquisition is complete by mitigation of threats against the system
during deployment, operations, and support. Systems Security Engineering (SSE) also
addresses threats during the acquisition process (typically through the supply chain) as
well as the possible capture of the system by enemies during combat or hostile actions.
Note that Systems Security Engineering (SSE) might be required in localized situations
where stakeholder security requirements are addressed in the absence of full
implementation of Systems Engineering (SE) activities. This can occur at any stage in
the system lifecycle. Key Systems Security Engineering (SSE) criteria can be specified
for each of the phases leading up to a major program Milestone, and it is important to
establish these criteria across the full lifecycle in order to build security into the system.
During the Milestone A phase, most of the Systems Security Engineering (SSE) related
activities, criteria, and results can be mapped to content of the Milestone A Program
Protection Plan (PPP), as described in the Program Protection Plan (PPP) Outline.
Associated Milestone A engineering analyses and Program Protection Plan (PPP)
content include the following:
The threat analyses and plans/schedule to counter them, as captured in the PPP,
should correlate with and point to the discussion provided in Section 2.3 of the
Technology Development Strategy (TDS) (see the Technology Development Strategy
(TDS)-Acquisition Strategy (AS) Outline).
Other key Systems Security Engineering (SSE) activities during the Milestone A phase,
not necessarily captured in specific documents, include:
Other documents generated during the Milestone A phase should also contain Systems
Security Engineering (SSE) relevant content. A thorough discussion of the Systems
Engineering Plan (SEP) is given in Chapter 4 . Expected Systems Security Engineering
(SSE) content in the Systems Engineering Plan (SEP) can be highlighted as follows:
The Test and Evaluation Strategy (TES) provides an overall system Verification and
Validation (V&V) strategy; and, pertinent details for ensuring system security are further
discussed in Section 13.10.3. Expected Systems Security Engineering (SSE) content in
the Test and Evaluation Strategy (TES) is highlighted as follows:
Security requirements are first baselined in the System Requirements Document (SRD);
related Systems Security Engineering (SSE) criteria and requirements are flowed down
to contractor(s) via a solid Statement of Work and Request for Proposal (RFP) as
follows:
    •    The Technology Development (TD) phase will involve prototyping efforts and
         system design trade-off considerations for risk reduction. Ensure that the
         Statement of Work requires the following Systems Security Engineering (SSE)
         activities from contractor(s) engaged in these efforts:
             o Use and maintain current Critical Program Information (CPI) and critical
                function and component threat assessments (current within 2 years).
             o Update the Criticality Analysis (CA) to identify critical functions and
                components, with rationale, and to allocate countermeasures and sub-
                countermeasures to the system design, the allocated baseline, and to
                follow-on development efforts (e.g., the product baseline).
             o Flow down the Systems Security Engineering (SSE) requirements from
                the System Requirements Document (SRD) to the System Specification,
                with verification criteria for risk reduction efforts.
             o Refine the allocation of countermeasures and sub-countermeasures to
                system critical components (features included in system design) as well as
                lifecycle phases (processes used to develop the system).
             o Include detailed Systems Security Engineering (SSE) criteria for Systems
                Engineering Technical Reviews (SETRs), which should be reflected in
                their System Engineering Management Plan (SEMP).
                     Include coverage of Criticality Analysis (CA) results and supply
                        chain risk information (see Section 13.10.2 for further details).
             o Include security requirements and critical function/component artifacts
                within their Systems Engineering (SE) and design Contract Data
                Requirements Lists (CDRLs).
             o Protect all critical function and Critical Program Information (CPI)-
                related prototyping and preliminary design work during technology
                maturation efforts.
             o Demonstrate visibility into the supply chain for hardware and software
                elements to be used during the technology maturation efforts. Allow
                Government oversight reviewers to inspect results of Systems Security
                Engineering (SSE) processes (including countermeasures and Systems
                Security Engineering (SSE) activities) upon request.
    •    If a System Requirements Document (SRD) is available, the following Systems
         Security Engineering (SSE) considerations apply:
             o System Requirements Document (SRD) contains security requirements
                derived from:
                     Operational performance requirements requiring protection.
                     Security focused mission threads in the Concept of Operations
                        (CONOPS).
                     Threats and vulnerabilities in the Initial Capabilities Documents
                        (ICD) and draft Capability Development Documents (CDD).
             o System security requirements in the verification matrix should be traceable
                to Joint Capabilities Integration Development System (JCIDS)
                requirements (Initial Capabilities Documents (ICD), draft Capability
                Development Documents (CDD)) and regulatory requirements.
             o Includes system-level countermeasure and sub-countermeasure
                   requirements.
During the Technology Development (TD) phase, most of the Systems Security
Engineering (SSE) related activities, criteria, and results can be mapped to content of
the Milestone-B Program Protection Plan (PPP), as described in the Program Protection
Plan (PPP) Outline. Associated Technology Development (TD) engineering analyses
and Program Protection Plan (PPP) content include the material covered in Section
13.7.6, as well as the following:
         each level of design
The threat analyses and plans/schedule to counter them, as captured in the Program
Protection Plan (PPP), should correlate with and point to the discussion provided in
paragraph 2.3 of the Acquisition Strategy (AS) (see the Technology Development
Strategy (TDS)-Acquisition Strategy (AS) Outline).
Other key Systems Security Engineering (SSE) activities during the Technology
Development (TD) phase, not necessarily captured in specific documents, include:
Other documents generated during the Technology Development (TD) phase should
also contain Systems Security Engineering (SSE) relevant content. For example,
Systems Security Engineering (SSE) tasks to implement requirements should be
included in the Integrated Master Plan (IMP) and Integrated Master Schedule (IMS),
including security verification tied to the Test and Evaluation Management Plan (TEMP).
A thorough discussion of the Systems Engineering Plan (SEP), updated for Milestone B,
is given in Chapter 4. Expected Systems Security Engineering (SSE) content in the
updated Systems Engineering Plan (SEP) can be highlighted as follows:
         Systems Engineering (SE) approach, including processes used by the
         Government and contractors, as well as Technology Development (TD) phase
         Systems Security Engineering (SSE) accomplishments and guidance for
         Engineering and Manufacturing Development (EMD)
    •    Refinement and allocation of system level security requirements as part of the
         System Requirements process
    •    Technical baseline management for system security requirements at the System
         Requirements Document (SRD) and System Specification level by the
         Government and lower level specifications by the contractor(s)
    •    Technical Risk Plan includes a summary of the mission-critical components with
         risks, countermeasures and candidate sub-countermeasures, and residual risk
         (or a reference to Program Protection Plan (PPP))
    •    Comprehensive end-to-end test approach for system security
    •    Each identified Systems Engineering Technical Reviews (SETR) event includes
         Systems Security Engineering (SSE) criteria (see Section 13.7.6 for
         amplification)
The Test and Evaluation Master Plan (TEMP) provides an integrated system plan for
Verification and Validation (V&V); pertinent details for ensuring system security are
further discussed in Section 13.10.3. It should be noted, however, that the Test and
Evaluation (T&E) associated with critical components and their testable sub-
countermeasures will likely not be a part of a program's TEMP. A large portion of
security T&E will be planned for and conducted by the contractor as part of the
contract's Statement of Work. That said, expected Systems Security Engineering (SSE)
content in the TEMP is highlighted as follows:
Security requirements and related system functions are baselined in the Government
System Requirements Document (SRD) and the contractor's System Specification.
Related Systems Security Engineering (SSE) criteria and requirements are flowed down
to contractors via a solid Statement of Work and Request for Proposal (RFP) as follows:
              o Request for Proposal (RFP) Section C:
                      ▪ Detailed Statement of Work requirements (see below)
                      ▪ Final System Requirements Document (SRD) is included (see
                        below for level of detail expected)
              o Request for Proposal (RFP) Section L: Request a lifecycle description of
                  the Systems Security Engineering (SSE) and Program Protection (PP)
                  processes with how they integrate into the overall Systems Engineering
                  (SE) process. Provide specific security scenario(s) for bidders to describe
                  their proposed system response
              o Request for Proposal (RFP) Section M: Evaluate proposed disciplined,
                  structured Systems Security Engineering (SSE) and Program Protection
                  (PP) processes, including Criticality Analysis (CA), in system specification,
                  detailed design, build/code, and test, with emphasis on countermeasure
                  and sub-countermeasure implementation
    •    The Engineering and Manufacturing Development (EMD) phase Statement of
         Work should require the following level of Systems Security Engineering (SSE)
         activities from contractor(s):
             o Update the Criticality Analysis (CA) to refine the identification of critical
                 functions and components, with rationale
             o Work with appropriate agencies (e.g., Defense Intelligence Agency (DIA)
                 and National Security Agency (NSA)) to maintain Critical Program
                 Information (CPI) and critical function and component threat assessments
                 (current within 2 years)
             o Allocate sub-countermeasures to critical components and subcomponents
                 (i.e., system detailed design and the product baseline) as well as to
                 lifecycle phases for the processes used to develop the system.
              o For Software Assurance evaluations, use:
                      ▪ Common Vulnerabilities and Exposures (CVE): to identify
                        vulnerabilities that enable various types of attacks
                      ▪ Common Attack Pattern Enumeration and Classification (CAPEC):
                        to analyze environments, code, and interfaces for common
                        destructive attack patterns
                      ▪ Common Weakness Enumeration (CWE): to examine software
                        architectures, designs, and source code for weaknesses
             o Flow down the Systems Security Engineering (SSE) requirements from
                 the System Requirements Document (SRD) to the System Specification
                 and to lower-level specifications, with verification criteria for risk reduction
                 efforts
                      ▪ Include detailed allocation of sub-countermeasures to lower-level
                        specifications
             o Include detailed Systems Security Engineering (SSE) criteria for Systems
                 Engineering Technical Reviews (SETRs), which should be reflected in the
                 contractor Systems Engineering Management Plan (SEMP)
                      ▪ Include coverage of Criticality Analysis (CA) results and supply
                        chain risk information (see Section 13.10.2 for further details)
             o Include security requirements and critical function/component artifacts
               within contractor Systems Engineering (SE) and design Contract Data
               Requirements Lists (CDRLs)
             o Demonstrate visibility into the supply chain for critical components and
                subcomponents. Allow Government oversight reviewers to inspect results
                of Systems Security Engineering (SSE) processes (countermeasures,
                sub-countermeasures, and activities) upon request
    •    Expected Systems Security Engineering (SSE) content in the System
         Requirements Document (SRD) and/or preliminary System Specification:
            o Specific security requirements to protect Critical Program Information
               (CPI) and critical functions and components, based on:
                      ▪ Operational performance requirements needing protection
                      ▪ Threats and vulnerabilities identified via system-specific
                        assessments as well as those contained in the Capability
                        Development Document
                      ▪ Use cases (including common-attack countermeasures) that are
                        comprehensive and traceable to the Concept of Operations
                        (CONOPS)
            o Each security requirement in the verification matrix should be traceable
               from Joint Capabilities Integration Development System (JCIDS)
               requirements (Initial Capabilities Document (ICD) and Capability
               Development Document (CDD)) to countermeasures and sub-
               countermeasures allocated to system requirements, and adjusted for
               associated cost, risk, and schedule
            o Identification of specific countermeasures and sub-countermeasures
               requirements. For example, for Software Assurance countermeasures to
               be applied to a specific component, the identification of sub-
               countermeasures, such as:
                      ▪ Static code analysis (for development process application)
                      ▪ Secure exception handling (for built-in component security)
                      ▪ Sandboxing (for operational threat mitigation)
             o All identified Critical Program Information (CPI) and critical functions and
                components have specified countermeasure and sub-countermeasure
                requirements documented in the contractor's Spec Tree, with justification
                of accepted risk
         and allocation of sub-countermeasures to design and verification
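The Software Assurance sub-countermeasures above reduce, in practice, to screening analysis results against the public CWE/CVE/CAPEC enumerations and the program's Criticality Analysis (CA). A minimal sketch of that triage step, in Python, using entirely hypothetical component names and finding records (no specific tool's output format is assumed):

```python
# Sketch: triage static-analysis findings against the Criticality Analysis (CA).
# Component names, CWE tags, and the record format are hypothetical illustrations.

# Critical functions/components identified by the Criticality Analysis
critical_components = {"guidance_sw", "crypto_module"}

# Findings as a static-analysis tool might report them, tagged with CWE IDs
findings = [
    {"component": "guidance_sw",   "cwe": "CWE-120", "desc": "buffer copy without size check"},
    {"component": "ui_shell",      "cwe": "CWE-489", "desc": "active debug code"},
    {"component": "crypto_module", "cwe": "CWE-327", "desc": "broken crypto algorithm"},
]

def triage(findings, critical_components):
    """Split findings into those on critical components and the remainder."""
    critical = [f for f in findings if f["component"] in critical_components]
    other = [f for f in findings if f["component"] not in critical_components]
    return critical, other

critical, other = triage(findings, critical_components)
for f in critical:
    print(f"PRIORITY {f['cwe']} in {f['component']}: {f['desc']}")
```

The point of the sketch is only that CA results and weakness enumerations are joinable data sets; real programs would feed tool output and the CA-derived component list into the same kind of cross-check.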
During the Engineering and Manufacturing Development (EMD) phase, most of the
Systems Security Engineering (SSE) related activities, criteria, and results can be
mapped to content of the Milestone C Program Protection Plan (PPP), as described in
the Program Protection Plan (PPP) Outline. Associated Engineering and Manufacturing
Development (EMD) engineering analyses and Program Protection Plan (PPP) content
include the material covered in Section 13.7.6, as well as the following:
         detailed design elements with verification criteria:
             o Allocate sub-countermeasures to Configuration Item (CI) specifications
                 (detailed design) and to verification criteria in the Statement of Work
            o Update the Systems Security Engineering (SSE) protection fault isolation
                tree and system response matrix
            o Ensure flow down of key Systems Security Engineering (SSE)
                requirements to appropriate Systems Engineering Technical Reviews
                (SETR) criteria (Critical Design Review (CDR), Test Readiness Review
                (TRR), and System Verification Review (SVR)) (see Section 13.10.2 for
                amplification)
Other key Systems Security Engineering (SSE) activities during the Engineering and
Manufacturing Development (EMD) phase include:
    •    Ensure that the system prototype (required to reach a Technology Readiness
         Level (TRL) of 7 by Milestone C) demonstrates all required security in an
         operational environment
            o Verify manufacturability of needed sub-countermeasures
    •    Ensure verification of system protection functional requirements:
            o Systems Security Engineering (SSE) functional requirements flow down to
                detailed design elements and are verified against valid criteria and
                verification methods
            o Countermeasures and sub-countermeasures are analyzed and tested
                throughout the detailed design and testing/verification phases
            o Physical and operational countermeasures and sub-countermeasures are
                included in system operational instructions and training documentation
            o Configuration management system is used to track vulnerability risks and
                mitigation via design changes
            o Performance of sub-countermeasures is verified against attacks
            o Updated test plans and procedures reflect additional verification
                requirements and stress attack scenarios
    •    Ensure that Systems Security Engineering (SSE) tasks to implement
         requirements are updated in the Integrated Master Plan (IMP) and Integrated
         Master Schedule (IMS)
    •    Test Plan activities are traced from the System Requirements Document (SRD)
         to system, subsystem, and lower-level component requirements, to verification
         testing needs
    •    Ensure that all Systems Security Engineering (SSE) testing requirements from
         system, subsystem, component, and subcomponent level documentation are
         included in the verification matrix according to agreed verification objectives
    •    Clear pass-fail criteria are identified for all tests as they apply to system security
         and protection
    •    Test processes and facilities, test equipment, Modeling and Simulation (M&S),
         and the software environment are adequately planned to validate protection
         countermeasure requirements
    •    Systems Security Engineering (SSE)-specific needs for personnel and schedule
         are considered and adequately addressed
    •    Test plans reflect the Test and Evaluation Master Plan (TEMP)
    •    Testing includes attack stress use cases
    •    Security and protection validation testing may require outside accreditation
         authorities, and appropriate schedule is allocated in the Test Plans
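Several of the bullets above amount to a completeness check on the verification matrix: every security requirement needs a verification method and clear pass/fail criteria. A hedged sketch of such a check, with hypothetical requirement IDs and a deliberately simple record format:

```python
# Sketch: flag verification-matrix entries that lack a verification method or
# explicit pass/fail criteria. Requirement IDs and fields are hypothetical.

matrix = [
    {"req": "SRD-SEC-001", "method": "test",     "pass_fail": "tamper event logged within 1 s"},
    {"req": "SRD-SEC-002", "method": "analysis", "pass_fail": None},   # missing criteria
    {"req": "SRD-SEC-003", "method": None,       "pass_fail": None},   # missing both
]

def incomplete_entries(matrix):
    """Return requirement IDs missing a verification method or pass/fail criteria."""
    return [row["req"] for row in matrix
            if not row["method"] or not row["pass_fail"]]

gaps = incomplete_entries(matrix)
print("Requirements needing attention:", gaps)
```

In practice the matrix would come from the program's requirements-management tool; the check itself is the same regardless of source.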
System Security Engineering should include an assessment of security criteria that sets
limits for international cooperative programs, direct commercial sales, and/or foreign
military sales cases. From this assessment, engineering and software alternatives (e.g.,
dial-down functionality, export variants, anti-tamper provisions, etc.) should be identified
that would permit such transactions.
Program Protection Plans (PPPs) must be approved by the Milestone Decision Authority
at Milestones A, B, and C, and at the Full-Rate Production decision (or business system
equivalent). A final draft Program Protection Plan (PPP) must be submitted for the Pre-
Engineering and Manufacturing Development (EMD) review prior to Milestone B. This
guidance summarizes the approval process that the Under Secretary of Defense for
Acquisition, Technology, and Logistics (USD(AT&L)) and the DoD Chief Information
Officer (CIO) jointly developed for programs under their cognizance. Programs under
Component, Program Executive Office (PEO), or other oversight should consult their
Milestone Decision Authority (MDA) for applicable approval guidance.
The Program Protection Plan (PPP) Review and Approval Process should be initiated
approximately 180 days prior to the program's Defense Acquisition Board (DAB) to allow
sufficient time for the comprehensive Office of the Secretary of Defense (OSD) review.
The review process iterates as the program responds to Office of the Secretary of
Defense (OSD) comments and resubmits the Program Protection Plan (PPP) for
approval. Once all comments are resolved, the Program Protection Plan (PPP) is
coordinated and routed to the Milestone Decision Authority (MDA) for approval.
When a Program Protection Plan (PPP) is ready for Office of the Secretary of Defense
(OSD) review, the program sends the document to the Deputy Assistant Secretary of
Defense (Systems Engineering) (DASD(SE)) Major Program Support (MPS) Program
Support Team Lead (PSTL) and Action Officer (AO). The Program Protection Plan
(PPP) is reviewed by subject matter experts across the Office of the Secretary of
Defense (OSD) to validate that it sufficiently addresses all aspects of program
protection planning and implementation. Consolidated comments from this
comprehensive review are returned to the program for adjudication and resubmission
for approval.
An important lesson learned is that the program should act early to engage the
Component Anti-Tamper community and address any concerns, as Anti-Tamper (AT)
Plan approval commonly holds up overall Program Protection Plan (PPP) approval.
Additionally, Program Managers (PMs) may wish to delay obtaining Program Executive
Office (PEO) or Service Acquisition Executive (SAE) signatures on Program Protection
Plans (PPPs) until after initial Office of the Secretary of Defense (OSD) reviews, as
many initial reviews generate comments requiring rework that would then need to be
re-approved at those levels.
13.15.1.1. Program-Level View of Program Protection Plan (PPP) Review Process
The program sends the draft Program Protection Plan (PPP) to its Program Support
Team Lead (PSTL) and Action Officer (AO). Approximately three weeks later, the
program will receive a comments matrix from the Program Support Team Lead (PSTL)/
Action Officer (AO) with comments the program needs to address. After addressing the
comments, the program will submit an updated Program Protection Plan (PPP) with the
adjudicated comments matrix to the Program Support Team Lead (PSTL) and Action
Officer (AO). Once all comments have been addressed, the Program Protection Plan
(PPP) Review Team will coordinate and staff the Program Protection Plan (PPP) for a
Milestone Decision Authority (MDA) signature. Once it is signed, the approved Program
Protection Plan (PPP) will be sent back to the program.
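The lead times in this process (initiate the review roughly 180 days before the DAB; expect the first comment matrix about three weeks after submission) lend themselves to simple back-planning. A sketch, using a hypothetical DAB date:

```python
# Sketch: back-plan PPP review milestones from a Defense Acquisition Board (DAB)
# date, using the 180-day review lead and ~3-week comment turnaround described
# above. The DAB date below is a hypothetical example.
from datetime import date, timedelta

dab_date = date(2014, 6, 2)                          # hypothetical DAB date
start_review = dab_date - timedelta(days=180)        # initiate OSD PPP review
first_comments = start_review + timedelta(weeks=3)   # comment matrix due back

print("Initiate OSD review by:", start_review)
print("Expect first comment matrix around:", first_comments)
```

The calculation leaves no margin for iterated comment cycles, so a real schedule would start earlier than the bare 180-day minimum.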
13.15.3.1. Coordination
Once all Office of the Secretary of Defense (OSD) comments are adjudicated, the
Program Protection Plan (PPP) is then sent out for Principal-level coordination across
Office of the Secretary of Defense (OSD). The following organizations submit Principal-
level coordination:
Once coordination is complete, the Program Protection Plan (PPP) is routed to the
Milestone Decision Authority (MDA) for signature.
The approval authority for Program Protection Plans (PPPs) is the Milestone Decision
Authority (MDA). The Milestone Decision Authority (MDA) for Acquisition Category
(ACAT) ID, special interest, and non-delegated Acquisition Category (ACAT) IAM
programs is the Under Secretary of Defense for Acquisition, Technology, and Logistics
(USD(AT&L)). The DoD Chief Information Officer (CIO) is the Milestone Decision
Authority (MDA) for all other ACAT IAM programs. For Acquisition Category (ACAT) IC
programs and below, the Program Protection Plan (PPP) does not need to be reviewed
at the Office of the Secretary of Defense (OSD) level.
DEFENSE ACQUISITION GUIDEBOOK
Chapter 14 - Acquisition of Services
14.0. Overview
14.0.1. Purpose
This chapter provides acquisition teams with a disciplined, three-phase, seven-step
process for the acquisition of services.
14.0.2. Contents
The acquisition of services plays a vital role in advancing and maintaining the mission
capability of the Department of Defense (DoD). Services acquisition covers a broad
spectrum of requirements, ranging from research and development, advisory services,
information technology support, and medical services to the maintenance of equipment
and facilities. For over ten years the DoD has spent more on service requirements than
it has on equipment acquisitions. While the acquisition of major systems follows a
well-defined process, the acquisition of services tends to be more ad hoc. Services
acquisition is not about awarding a contract; it's about acquiring performance results
that meet the performance requirements needed to successfully execute an
organization's mission.
This guidebook provides acquisition teams with a disciplined, seven-step process for
the acquisition of services. Applying this rigorous and systematic approach requires the
dedicated effort of an acquisition team composed of functional experts, contracting
specialists, contracting officer representatives, and others working together to achieve
performance results and value for their mission requirements. It's important to remember
that the Federal Acquisition Regulation (FAR) states that the acquisition process is a
shared team responsibility. Completing this process, like all acquisitions, takes dedicated
planning time. Getting your acquisition team organized and focused early in the process
is a fundamental key to successfully achieving the mission results your customers
require.
When does the process start? It starts with a valid mission requirement for a service
essential to the successful execution of the organization's mission. The process
continues through a planning phase, which develops the foundation for defining your
requirement and business strategy, and ultimately ends with the delivery and
assessment of the services provided.
The service could be provided by a new contract you develop; it could be provided by
an existing contract within or outside your agency; or it could be part of your agency's
strategic sourcing efforts. The services acquisition process requires that you keep an
open mind about where best to source the requirement until you have explored and
assessed all the alternatives and developed a clear picture of the requirement and
supporting acquisition strategy.
Planning Phase:
Development Phase:
Execution Phase:
Each phase builds on the knowledge gained in the previous phase. Some actions within
each phase can be completed in parallel; others should be completed sequentially to
make more informed decisions based on new knowledge gained. The project plan in
Appendix B will help you tailor a plan for your service acquisition. This guidebook will
cover each of the steps in detail and illustrate how to use the requirements roadmap
tool to assist you in developing performance-based requirements documents.
                        Figure 14.1.1.F1. The Services Acquisition Process
The Planning Phase, steps 1, 2, and 3, lays the foundation for action. During the
planning phase, you form the acquisition team and get leadership support for all the
actions that must happen to ensure the mission is supported. Baseline and analyze your
current service strategies; identify problem areas and projected mission changes; and
get your stakeholders to define their key performance outcomes for this requirement.
Also analyze the marketplace to assess current technology and business practices,
competition and small business opportunities, and existing and potential new sources
for providing the service, and determine whether commercial buying practices can be
adapted.
During the Development Phase, steps 4 and 5, use the requirements roadmap
process to define your high-level objectives and tasks, standards, allowable variations,
and methods of inspection. After completing the roadmap you will be in the best
position to develop a performance work statement (PWS) and quality assurance
surveillance plan (QASP). During this phase you will also identify your funding sources,
develop a government estimate of the contract price for the required service, and get
industry feedback on your working documents. Finally, synthesize an acquisition
strategy that leverages contract type and performance incentives to deliver best-value
mission performance to the customer. The basic performance principle is to tell the
contractor what the performance results are, not how to do the job. Let industry develop
the solution.
In the Execution Phase, steps 6 and 7, you put all your planning and development
efforts into action. You create a solicitation document that formally communicates your
requirements and strategy to industry. You receive contractor proposals describing how
they will meet your performance results and standards, and then evaluate them against
criteria selected to best determine the success of a potential contractor's approach.
After contract award, the business relationship you have with the service-providing
contractor should foster innovation and improvements to mission performance
outcomes. This part of the process involves two key areas: administering contract
requirements, such as invoicing and payments; and managing the relationships and
expectations of both the contractor and the customer in meeting the terms of the
contract and achieving the required mission performance results. You also start the
planning phase for a follow-on acquisition if there is a continuing need for the service
being provided.
For DoD, the various types of services are grouped into portfolio categories within the
taxonomy for the acquisition of services (reference DFARS Procedures, Guidance, and
Information (PGI) 237.102-74). The contracting officer is responsible for determining
whether the services needed are non-personal or personal, using the definitions found
in FAR 37.101 and 37.4 and the guidelines found in FAR 37.104. Agencies shall not
award personal services contracts unless specifically authorized by statute to do so.
14.1.3. Non-Personal Services Requirements
Non-personal services means that the personnel rendering the services are not subject,
either by the contract's terms or by the manner of its administration, to the supervision
and control usually prevailing in relationships between the government and its
employees. Non-personal services contracts are authorized by the government in
accordance with FAR 37.102, under general contracting authority, and do not require
specific statutory authorization.
In a personal services contract, the contractor is considered to be, and is treated as, an
employee of the government. In this type of relationship, a government officer or
employee directly supervises and controls the contractor's personnel on a continuing
basis. Personal services contracts require specific authorization.
The FAR, in implementing Public Law 106-398, states that performance-based
acquisition (PBA) methods should be used to the maximum extent practicable. PBA for
services involves performance requirements and acquisition strategies that describe
and communicate measurable outcomes rather than direct specific performance
processes. It is structured around defining a service requirement in terms of
performance results and providing contractors the latitude to determine how to meet
those objectives. Simply put, it is a method for specifying what results are required and
placing the responsibility for how they are accomplished on the contractor.
         will assess contractor performance against the performance standards contained
         in the PWS.
Encourage and promote the use of commercial services: The vast majority of
service requirements are commercial in nature. FAR Part 12 (Acquisition of Commercial
Items) applies to the acquisition of commercial services and provides procedures that
offer the benefit of reducing the use of government-unique contract clauses and similar
requirements, which can help attract a broader industry base. However, commercial
services are often acquired through contracts awarded under FAR Part 15 (Contracting
by Negotiation), given the limited contract types authorized under FAR Part 12.
Shift in risk: Much of the risk is shifted from the government to industry, since
contractors become responsible for achieving the performance results contained in the
Performance Work Statement through the use of their own best practices and
processes. Agencies should consider this shift in responsibility in determining the
appropriate acquisition incentives and contract type.
Achieve savings: Experience in both government and industry has demonstrated that
use of performance requirements results in cost savings.
PBA is not a new procurement strategy. Many procurement activities have never
stopped using PBA techniques. The Department of the Navy, as one example among
many, has used PBA techniques effectively for facilities maintenance services for
decades. The Department of the Air Force and the Army Corps of Engineers have
employed PBA techniques in many of their service acquisitions.
PBA techniques are applicable to a broad range of service requirements. Simply stated,
PBA methods structure a contract around the contractor achieving stated performance
results and standards. The contractor's performance against the required standards
must be measurable through an objective process. This means that the government
acquisition team must describe the required performance results in clearly defined
terms, with performance standards that can be effectively measured. This is often the
most difficult part of implementing PBA techniques. Writing a Performance Work
Statement in a way that describes performance results requires a focus on the
relationship between what needs to be done and how well it must be accomplished, not
how it must be accomplished or how many full-time equivalents (FTEs) are required.
When PBA techniques are not appropriate for use, the decision shall be documented
and included in the contract file.
With this simple set of performance outcomes, contractors were given wide latitude to
develop an ordering, inventory, and delivery methodology to support Navy flying
operations. Through the innovation introduced by industry, the Navy achieved the
following benefits:
    •    Over $49 million in net savings to the Navy over the life of the contract
With this new focus, the next question was how well or to what standard must the
channel be kept (not dredging)? The answer was 100 feet wide and 12 feet deep mean
low water. Now they had a performance standard, but how would they know if the
contractor was meeting that performance standard? Their answer was to provide a boat
with a global positioning system (GPS) and sonar system that could measure depth and
position to ensure the channel met the specified standard. With their new performance
objective, performance standard, and a means of inspection, they were well on their
way to developing a simpler, more performance-based requirement.
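The inspection approach above lends itself to an objective, automated check. As a hedged sketch (the data layout and function names are hypothetical illustrations, not taken from any actual Navy survey system), GPS/sonar readings could be tested against the 100-foot-wide, 12-foot-deep standard like this:

```python
# Hypothetical sketch of an objective channel inspection. Each cross-section
# is a list of (offset_ft, depth_ft) sonar readings taken across the channel.
# The 100 ft width and 12 ft depth (mean low water) come from the performance
# standard described above; everything else is an illustrative assumption.

REQUIRED_WIDTH_FT = 100.0
REQUIRED_DEPTH_FT = 12.0

def section_compliant(readings):
    """True if the span of readings at or deeper than 12 ft is at least
    100 ft wide (a simplification that ignores shallow gaps mid-span)."""
    deep_offsets = [off for off, depth in readings if depth >= REQUIRED_DEPTH_FT]
    if not deep_offsets:
        return False
    return max(deep_offsets) - min(deep_offsets) >= REQUIRED_WIDTH_FT

def channel_meets_standard(cross_sections):
    """The channel passes only if every surveyed cross-section passes."""
    return all(section_compliant(s) for s in cross_sections)

survey = [
    [(0, 8.0), (20, 13.1), (60, 14.0), (130, 12.5), (140, 9.0)],  # 110 ft deep span
    [(0, 9.0), (30, 13.0), (90, 12.2), (100, 8.0)],               # only 60 ft deep span
]
print(channel_meets_standard(survey))  # prints False: second section is too narrow
```

The point of the sketch is the PBA pattern itself: a measurable standard plus an objective means of inspection, rather than a prescription for how the contractor must dredge.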
No matter where you are in the services acquisition process, it’s very easy to get
trapped into a preconceived idea of how a particular function should or must be
performed. Like the examples cited above, you need to keep the focus on what mission
outcomes you are trying to achieve, not how the process must be accomplished. If you
can keep a higher view of what you’re asking a contractor to accomplish, you will have
far more success in implementing a performance-based approach for your service
requirements.
14.2.1.1. Ensure Senior Management Involvement and Support
Early in your acquisition efforts you should make sure you’ve got senior leadership
support. It is important to understand leadership's concerns and expectations for your
acquisition. What priority does this requirement have in their portfolio of service
requirements? Your leadership can help you get the right people on your team and
overcome roadblocks when necessary when they understand your team is committed to
the success of their mission.
The goal of every acquisition team should be to obtain quality, timely contract services
in both a legal and cost-effective manner, placing the responsibility for quality
performance on the contractor. Nonetheless, achieving this goal can be challenging.
The interdisciplinary nature of your acquisition efforts means that no single individual
or function is likely to have all the requisite knowledge and experience. Therefore,
personnel such as the program manager, contracting officer, contracting officer's
representative (COR), responsible fiscal officer, and legal counsel (among others)
should form the acquisition team as soon as possible in order to:
Although the composition of the acquisition team may vary depending on the nature of
the requirement, a few key members are essential to the success of any contract. They
are as follows:
any problems or delays. This individual is responsible for drafting the PWS, which
means ensuring that performance requirements are clearly and concisely defined and
articulated. PMs identify, plan, and control various areas, such as delivery requirements,
scheduling, market research, COR nomination, cost estimating, budgeting, and specific
project formulation. The PM normally participates in the source selection as well. This
individual serves as the principal technical expert, is most familiar with the requirement,
is best able to identify potential technical tradeoffs, and can determine whether the
requirement can be met by a commercial solution.
The Contracting Officer: The warranted contracting officer is responsible for performing
all relevant contract functions, to include assisting in requirements development and
market research. Within this context, the contracting officer does not determine the
government's need, but is responsible for advising the PM in preparing a PWS. This
individual serves as the principal business advisor and principal agent for the
government, responsible for developing the business strategy and solicitation,
conducting the source selection, and administering the resultant contract and business
arrangement.
Small Business Specialist (SBS): The SBS serves as the principal advisor and advocate
for small business engagement. This individual serves as the chief analyst on small
business laws, regulations and command policy. They can provide insight for market
research and an understanding of industry small business capability. He or she may
also serve as the liaison with the Small Business Administration (SBA).
Cost/Price Analyst: The cost/price analyst evaluates the financial price and cost-based
data for reasonableness, completeness, accuracy, and affordability. Alternatively, some
agencies utilize cost engineering personnel from within an engineering division to
conduct cost/price analysis from a technical standpoint.
Finance/Budget Officer: The finance/budget officer serves as an advisor for fiscal and
budgetary issues.
Legal Advisor: The legal advisor ensures that the commercial practices and terms and
conditions contemplated are consistent with the government's legal rights, duties, and
responsibilities; reviews the acquisition documents for legal sufficiency; and provides
advice on acquisition strategies and contract terms to the PBA team.
Developing a team charter is an important step in focusing the team on the objectives
to be accomplished and assigning key roles and responsibilities. Everyone involved
must understand how they will contribute to achieving the required mission results. The
charter starts with the acquisition team's vision statement. The vision statement should
capture the high-level objective of the team's effort and be an objective that unites the
team.
Use the project plan (Appendix B) and tailor it for your specific acquisition. This will help
you identify all the actions needed to complete each step of the seven step process. It
also enables you to assign responsibility for specific actions and develop a time line for
how long it will take you to get to performance management. Examples of a team
charter and project plan are available in the Service Acquisition Mall (SAM, Appendix C)
( http://sam.dau.mil/ ).
Every acquisition has stakeholders. Your acquisition team should identify the key
stakeholders who will be affected by your acquisition. Stakeholders often fall into three
major categories:
Internal - These are within your organization, either as customers for the service being
procured or leaders of activities your effort will be supporting.
Governance - These are individuals or organizations that must approve your
requirement and acquisition strategies. They are often at higher headquarters levels
outside your immediate organization. Their involvement is often dictated by agency
policy.
External - These are stakeholders not directly tied to your acquisition. They can be local
communities, industry, or anyone else who might be affected by or have an interest in
your actions.
14.2.1.4. Develop a Communication Plan
Once you have identified your key stakeholders, how will you communicate with them
and keep them advised of your progress? A communication plan is a good way to target
specific communications to specific stakeholders. Well-informed stakeholders can be
effective advocates for your actions. Your communication plan should determine the
method and frequency of communications. The Communication Plan is a living
document and should also be adjusted over time as new stakeholders are discovered
and you move through the different phases of the service acquisition process.
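One lightweight way to make such a plan concrete is to record each stakeholder's method and frequency as structured data. The sketch below is purely illustrative; the stakeholder names, methods, and frequencies are assumptions, not entries prescribed by the Guidebook:

```python
# Minimal sketch of a communication plan keyed to the three stakeholder
# categories described above (Internal, Governance, External). All entries
# here are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class CommEntry:
    stakeholder: str
    category: str   # "Internal", "Governance", or "External"
    method: str     # how you will communicate
    frequency: str  # how often

plan = [
    CommEntry("Using organization", "Internal", "status e-mail", "weekly"),
    CommEntry("Headquarters approval office", "Governance", "milestone briefing",
              "at each decision point"),
    CommEntry("Interested industry", "External", "sources-sought notice",
              "as the requirement matures"),
]

def entries_for(category):
    """Filter the plan for one stakeholder category. Because the plan is a
    living document, the list is simply extended as new stakeholders are
    discovered or the acquisition moves to a new phase."""
    return [e for e in plan if e.category == category]

print([e.stakeholder for e in entries_for("Governance")])
# prints ['Headquarters approval office']
```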
14.2.1.5. Develop and Maintain Knowledge Base over the Project Life
Depending on the size and complexity of your service requirement it can take up to two
years from this point in the process to step seven where you are finally receiving the
service. During this period team members will leave and new ones arrive. It’s important
for the new team members to understand the decisions that have been made and the
rationale that supported them. That is why a project library that can be easily shared
among the acquisition team will help new team members get on board quickly and
provide everyone with a common understanding of the project and the decisions made.
14.2.1.6. Plan and Schedule Topical Team Training
As part of your project plan, identify which individuals will need specialized training,
such as the COR or individuals involved with your source selection. Consult current
DoD directives for COR training requirements. Also consider requesting DAU's Service
Acquisition Workshop (SAW) as a total team training event. There are many training
resources available at the Defense Acquisition University (DAU), but if classroom
training is needed, plan early.
The most effective foundation for an acquisition is the intended effect it will have in
supporting and improving an agency's mission and performance goals and objectives.
Describing an acquisition in terms of how it supports these mission-based performance
goals allows an agency to establish a clear relationship between the acquisition and the
agency’s mission. It sets the stage for crafting an acquisition in which the performance
goals of the contractor and the government are in sync. It's important to remember that
a service acquisition is a skillful linking of the performance requirement and results with
a contract vehicle that motivates contractor performance aligned with the activity's
mission objectives. This requires the best efforts of the acquisition team.
14.2.2.1. Identify Current Initiatives/Contracts
Identify current contracts that support this requirement or are closely related to it. Are
these a part of your agency's strategic sourcing initiatives? Does your activity have new
initiatives in the planning stages that might affect this requirement? All this helps
develop a baseline for planning and minimize surprises.
To develop your baseline, identify any current performance issues; does the current
requirement still meet the mission? Interview key stakeholders and understand how
they define mission success, what their concerns are, and what mission changes they
see in the future that will affect this requirement. Effective planning requires that we
understand the objectives and focus on the desired outcomes. The first consideration is
answering these three questions:
Risk assessment is a process that continues through the whole service acquisition
cycle. As part of your discussion with stakeholders, begin collecting concerns and risks
that might have a mission impact. Risk assessment is a team responsibility, but the
program manager must take the lead in identifying and organizing risk areas. This
knowledge will help you as you develop the requirement and your acquisition strategy.
Risk analysis is discussed in more depth in Step Four.
This involves understanding how things are actually being done today. How do you
capture performance, what metrics are you tracking and reporting, what are the
challenges with current performance, and what are the issues associated with resolving
problems? What is the current small business strategy for the prime contracts and
subcontracts? What you are seeking to develop is a good understanding of the "as-is"
state. Based on this, you can more effectively develop plans and actions that will
improve performance on your new requirement and implementation strategy.
In service contracts the government may furnish property or facilities for the contractor's
use. Determine if this is still in the best interest of the government. Also determine the
condition of the material or facilities and whether they are still suitable for use.
14.2.2.6. Stakeholder Submits Current and Projected Requirements Forecast
Interview stakeholders to identify their current requirements, and what mission changes
they see coming that may affect the requirements you’re planning. What areas hold the
most concern for your stakeholders? How will contingency operations affect this
requirement? This knowledge will help you develop the scope for your requirement and
plan for the flexibility you may need in your contract vehicle to adjust for future
requirements. These stakeholder engagements will help ensure alignment of your
efforts with your stakeholders' expectations.
As part of the baseline planning process, review current regulations and legislation that
could impact your requirement and acquisition strategy. Service contracts normally
cover several years, so be sure the plan you develop complies with current regulations
and not the ones that were in place at the beginning of the last contract.
14.2.2.8. Define (at a High Level) Desired Results
Based on your stakeholder interviews, knowledge issues, and pending changes, start
refining the requirement's desired results (outcomes). Is it providing a certain level of
help desk support to an organization? Is it a reduction of computer downtime? Is it
providing a level of information assurance to its customers? Is it providing a level of
systems and software engineering and support? What is the ultimate intended result of
the contract, and how does it relate to the agency's strategic plan? What are the critical
results your stakeholders have identified?
14.2.2.9. Review Current Performance and Desired Results with Stakeholders and
Users
Review your high-level results with your stakeholders and customers to validate that
your team has defined the right results. Describe the gaps between current performance
and your understanding of what stakeholders are asking. Discuss the funding impact if
desired results are significantly beyond current budget levels. This feedback is vital to
ensure the actions you take in subsequent steps are aligned with your stakeholder
outcomes and results. Failure to do this now can result in a lot of rework later.
Take the feedback you generated in Section 14.2.2.9 and refine the desired results your
team has developed. Validate these refined results one more time with your
stakeholders to ensure you are moving in the right direction. Time invested here will pay
large dividends later in the process.
14.2.3. Step Three Market Research
Market research is required by FAR Part 10 and is a vital means of arming the
acquisition team with the knowledge needed to conduct an effective performance-based
service acquisition. This type of information helps determine the suitability of the
marketplace for satisfying a need or requirement. Market research is the continuous
process of collecting information to maximize reliance on the commercial marketplace
and to benefit from its capabilities, technologies, and competitive forces in meeting an
agency need. Market research is an essential process enabling the government to buy
best-value products and services that solve mission-critical problems. Appendix D also
provides a list of helpful sites as you are conducting your market research.
The ultimate goal of market research is to help the acquisition team become informed
consumers: to understand the cost drivers in providing the service and to discover what
leverage the marketplace may offer that could affect both the requirement and the
business strategy. In short, it helps the acquisition team optimize a strategy for meeting
the requirement. Since market research should address both business and technical
considerations of a requirement, it requires the active
participation of all acquisition team members as appropriate.
It is not unusual for the technical staff to conduct market research about marketplace
offerings while the contracting staff conducts market research that focuses on industry
practices and pricing. However, a better approach to conducting market research is for
the entire acquisition team to be a part of the effort. This enables the members of the
team to share in the understanding and knowledge of the marketplace and develop a
common understanding of what features, schedules, and terms and conditions are key
to their project's success. The team should consider such factors as urgency, estimated
dollar value, complexity, and past experience as a guideline for determining the amount
of time and resources to invest in the effort. Don’t invest more resources (e.g., lead
time, available personnel, and money) than are warranted by the potential benefits. In
addition, when acquiring services under the simplified acquisition threshold (SAT),
conduct market research when adequate information is not available and the
circumstances justify the cost of such
research. One of the purposes of market research is to effectively identify the
capabilities of small businesses. Small businesses offer attributes of agility and
innovation in the services sector with generally lower overhead costs. Keep in mind that
each acquisition of supplies and services that are under the SAT should automatically
be reserved exclusively for small business concerns unless the contracting officer
determines there is not a reasonable expectation of obtaining two or more responsible
offers from small business concerns that are competitive in terms of market prices,
quality and delivery ( FAR 19.502-2 ).
Acquisition histories may not give the whole picture needed for planning a specific
acquisition, particularly if commercial practices or technologies to deliver the service are
changing rapidly. There may be times when this information is not adequate, such as
first time purchases, rapidly changing technology, change in market capability, and no
known sources. In determining and identifying the scope and extent of additional
research needed, you should follow these steps:
    •    Plan the collection of additional market information (i.e. when and how) during
         the acquisition planning, pre-solicitation, solicitation, and evaluation phases.
It is critical to conduct market research as the entire acquisition team, and the effort
goes more smoothly when each member of the team knows what his or her
responsibility is during this step. Determine who will do what and by when. As the team
begins making calls or visiting providers, a standard interview guide can help ensure
accuracy and consistency. Try not to ask questions that will elicit a yes or no response;
for example, the interview guide should ask what experience the provider has in
delivering this service.
While many are familiar with examining private-sector sources and solutions as part of
market research, looking to the public sector is not as common a practice. Yet it makes
a great deal of sense on several levels. First, there is an increased interest in cross-
agency cooperation and collaboration. Second, agencies with similar needs may be
able to provide lessons learned and best practices. So it is important for the acquisition
team to talk to their counterparts in other agencies. Taking the time to do so may help
avert problems that could otherwise arise in the acquisition. Other resources include
state and local governments that are experienced in procuring certain services that
have not been procured by the Federal Government.
14.2.3.4.1. Customers
One-on-one meetings with industry leaders are not only permissible (ref. FAR
15.201(c)(4)), they are highly encouraged. Note that when market research is
conducted before a solicitation or PWS is drafted, the rules are different. FAR 15.201(f)
states that general information about agency mission needs and future requirements
may be disclosed at any time. As long as the requirements have not (or should not
have) been defined, disclosure of procurement-sensitive information is not an issue.
Focus your market research on commercial and industry best practices, performance
metrics and measurements, innovative delivery methods for the required services, and
incentive programs that providers have found particularly effective. This type of
research can expand the range of potential solutions, change the very nature of the
acquisition, establish the performance-based approach, and represent the agency's first
step on the way to implementing an effective and meaningful "incentivized" business
relationship with a contractor.
Traditional ways to identify who can deliver the required services are to issue "sources
sought" notices at FedBizOpps.gov, conduct "Industry Days," issue requests for
information, and hold pre-solicitation conferences. Also consider reviewing current
FedBizOpps solicitations. It's also acceptable to pick up the phone and call private-sector
company representatives. Contact with vendors and suppliers for purposes of market
research is encouraged; FAR 15.201(a) specifically promotes the exchange of
information "among all interested parties, from the earliest identification of a requirement
through receipt of proposals." Once the solicitation has been issued and the
procurement is underway, the treatment of potential offerors must be fair and impartial
and the standards of procurement integrity ( FAR 3.104 ) must be maintained. So the
real key is to begin market research early, before the procurement action is underway.
Once the market research is completed, it's time to analyze the information and data
accumulated. This also is a task for the entire acquisition team. Some of the things to
consider as market research is analyzed: What are the opportunities for
competition and/or small business considerations? Did your market research reveal any
new emerging technologies? Sometimes market research can reveal things such as
market trends (supply/demand) which can provide leverage during negotiations. Once
the market research is analyzed, it’s time to document your findings.
The market research report is the document prepared after all information has been
compiled. It provides a summary of the market research team's activities and should
provide a logical basis for supporting your business strategy, such as a commercial
service acquisition, full and open competition, or a small business set-aside. FAR
10.002(e) encourages agencies to document the results of market research in a manner
appropriate to the size and complexity of the acquisition. The amount of research
should be commensurate with the size, complexity and criticality of the acquisition. You
should always check with your local agency for any additional requirements that may
not be listed. Your market research report can help build the business case for change
in how you approach your requirement and support your decisions on an acquisition
approach. Remember, it is easier to compile all the information gathered during your
market research into one document that will be included in the contract file.
14.3.1.5. Develop a Performance Work Statement (PWS) and Statement of
Objectives (SOO)
14.3.1.5.1. Format
At this point of the process, the Planning Phase of the seven step service acquisition
process has been completed. The acquisition team is now ready to use the collected
data from the previous three steps (Form the Team, Current Strategy and Market
Research) to begin developing the Requirements Document (Step 4) and the
Acquisition Strategy (Step 5).
                                    Figure 14.3.1.F1. Model of Step Four
As part of the requirements development process you must identify and analyze risk
areas that can impact the performance results you are trying to achieve. Identify
possible events that can reasonably be predicted and that may threaten your acquisition.
Risk is a measure of future uncertainties in achieving successful program performance
goals. Risk can be associated with all aspects of your requirement. Risk addresses the
potential variation from the planned approach and its expected outcome. Risk
assessment consists of two components: (1) probability (or likelihood) of that risk
occurring in the future and (2) the consequence (or impact) of that future occurrence.
Risk analysis includes all risk events and their relationships to each other. Therefore,
risk management requires a top-level assessment of the impact on your requirement
when all risk events are considered, including those at the lower levels. Risk
assessment should be the roll-up of all low-level events; however, most likely, it is a
subjective evaluation of the known risks, based on the judgment and experience of the
team. Therefore, any roll-up of requirements risks must be carefully done to prevent key
risk issues from slipping through the cracks.
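The two-component assessment described above (probability times consequence) is often turned into a simple numeric roll-up. A hedged sketch follows, assuming notional 1-to-5 rating scales and illustrative risk events; these values are not DoD-mandated, and the ranking supplements rather than replaces the team's subjective judgment:

```python
# Illustrative probability x consequence scoring for identified risk events.
# The 1-5 scales and the example risks are assumptions for demonstration only.

def risk_exposure(probability, consequence):
    """Score one risk event: likelihood (1-5) times impact (1-5)."""
    if not (1 <= probability <= 5 and 1 <= consequence <= 5):
        raise ValueError("ratings must be on a 1-5 scale")
    return probability * consequence

# Hypothetical risk register: event -> (probability, consequence)
risks = {
    "Funding reliant on pending approval": (4, 5),
    "New process implementation": (3, 3),
    "Compressed proposal schedule": (2, 4),
}

# Rank so mitigation effort focuses on the highest exposures first.
ranked = sorted(risks, key=lambda name: risk_exposure(*risks[name]), reverse=True)
print(ranked[0])  # prints: Funding reliant on pending approval (exposure 20)
```

A roll-up like this makes it harder for a key risk to slip through the cracks, but low-scoring events should still be revisited as circumstances change.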
It is difficult, and probably impossible, to assess every potential area and process. The
program or project office should focus on the critical areas that could impact your
program and thus impact your performance results. Risk events may be determined by
examining each required performance element and process in terms of sources or
areas of risk. Broadly speaking, these areas generally can be grouped as cost,
schedule, and performance, with the latter including technical risk. When the word
"system" is used, it refers to the requirement for services as a system with many different
activities and events. The more complex the service requirement is, the more likely it
will have the components and characteristics of a system. The following are some
typical risk areas:
    •    Business/Programmatic Risk
         -   Scheduling issues that may impact success?
    •    Technical Risk
         -   Maturity of technology and processes reliant on technology
    •    Funding Risk
         -   Are funds identified for which availability is reliant on pending events or
             approvals? Have adequate funds been identified?
    •    Process Risk
         -   Are new processes required to be implemented?
         -   Will the best contractors have time to propose?
    •    Organizational Risk
         -   Implementing change within an organization
    •    Risk Summary
         -   Overview of the risk associated with implementing the initiative (e.g., is
             there adequate service life remaining to justify this change?)
Identifying risk areas requires the acquisition team to consider the relationships among
all of these risks, which may reveal potential areas of concern that would otherwise
have been overlooked. This is a continuous process that examines each identified risk
(which may change as circumstances change), isolates the cause, determines the
effects, and then determines the appropriate risk mitigation plan. If your acquisition
team is asking the contractor to provide a solution as part of its proposal that contains a
performance-based statement of work and performance metrics and measures, then it
is also appropriate to have the contractor provide a risk mitigation plan that is aligned
with that solution.
To learn more about risk management and using a risk mitigation plan, we suggest you
take the DAU online course Continuous Learning Module (CLM) 017, Risk
Management. Figure 14.3.1.1.F1 shows a typical risk analysis model.
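The two-component assessment described above (likelihood and consequence) can be sketched as a simple scoring model. This is an illustrative sketch only: the 5x5 scale, the low/moderate/high thresholds, and the function names are assumptions, not values mandated by the guidebook.

```python
# Illustrative 5x5 risk matrix: each risk event is scored by likelihood (1-5)
# and consequence (1-5), then binned into low/moderate/high.
# The thresholds below are assumptions for demonstration only.

def risk_level(likelihood: int, consequence: int) -> str:
    """Classify a risk event from its likelihood and consequence scores."""
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("scores must be 1-5")
    score = likelihood * consequence
    if score >= 15:
        return "high"
    if score >= 6:
        return "moderate"
    return "low"

def roll_up(risks):
    """Top-level assessment: report the worst level among all risk events,
    so low-level high risks do not slip through the cracks."""
    order = {"low": 0, "moderate": 1, "high": 2}
    return max((risk_level(l, c) for l, c in risks), key=order.get)

print(roll_up([(2, 2), (4, 4), (1, 3)]))  # high
```

Note that the roll-up reports the worst individual level rather than an average; averaging is exactly the kind of subjective roll-up that can let a key risk slip through.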
Like risk analysis, requirements analysis means conducting a systematic review of your
requirement, given the guidance you captured from your stakeholders during the
planning phase Steps One, Two, and Three. The objective of this step is to describe the
work in terms of required results rather than either how the work is to be accomplished
or the number of hours to be provided ( FAR 37.602 ). This analysis is the basis for
establishing the high-level objectives, developing performance tasks and standards,
writing the Performance Work Statement, and producing the Quality Assurance
Surveillance Plan.
The acquisition team needs to identify the essential processes and the outputs or
results required. One approach is to apply the "so what?" test during this analysis. For
example, once the analysis identifies the outputs, the acquisition team should verify the continued
need for the output. The team should ask questions like the following:
As you build your roadmap of high-level objectives and task statements, prioritize them
in descending order of importance based on risk, criticality, or priority. This will help
you later when determining what you want to evaluate in a contractor's proposal.
                     Figure 14.3.1.3.F1. Requirements Roadmap Worksheet
Initially, the High Level Objectives (HLOs) need to be defined. What must be
accomplished to satisfy the requirement? This should have been determined during
Steps 1 and 2, when you were talking with your stakeholders and customers. To define
HLOs, list what needs to be accomplished to satisfy the overall requirement, from a top-
level perspective. HLOs are similar to level two in a work breakdown structure.
Tasks are the results or actions required to achieve an HLO, and it may take several
tasks to satisfy one. A task consists of a result, the context of what or whom the result
pertains to, and the actions required to complete it. Defining tasks goes into greater
detail and expands the stakeholder analysis beyond the top-level perspective. The goal
of a task statement is to adequately describe what action or result is required (not how
to accomplish it).
Task statements generally combine verbs and nouns and tend to be declarative, such
as the following:
    •    Conduct a study on
    •    Provide financial analysis of
    •    Maintain vehicles
    •    Review and assess
    •    Develop a strategic plan
    •    Identify potential
    •    Perform and document
When developing tasks, ask why the action is needed. The "because" answer usually
focuses attention on the performance results needed. Why do you want the river to be
dredged? Because we want the boats to be able to go in and out. Bottom line: we need
to keep the river navigable. That is the objective.
Next, identify appropriate and reasonable performance standards (i.e., how well the
work must be performed to successfully support mission requirements). The purpose is
to establish measurable standards (adjectives and adverbs) for each of the tasks that
have been identified. The focus is on adjectives and adverbs that describe the desired
quality of the service's outcome: how fast, how thorough, how complete, how error-free.
Examples of performance standards include the following:
    •    Response times, delivery times, timeliness (meeting deadlines or due dates), and
         adherence to schedules.
    •    Error rates or the number of mistakes or errors allowed in meeting the
         performance standard.
    •    Accuracy rates.
    •    Milestone completion rates (the percent of a milestone completed at a given
         date).
    •    Cost control (performing within the estimated cost or target cost), as applied to
         flexibly priced contracts.
Standards should reflect the minimum needed to meet the task results. The standards
you set are cost drivers because they describe the performance levels that the
contractor must reach to satisfy the task. Therefore, they should accurately reflect what
is needed and should not be overstated. In the case of keeping a river navigable, the
standard might be: 100 feet wide, 12 feet deep at mean low water. Thus we have a
measure for what we define as navigable.
We should ask whether each standard fits the outcome's need. In the case of the
navigable river, 12 feet deep at mean low water might be sufficient for pleasure-craft
usage. However, setting a depth of 34 feet, which might be needed for larger
commercial watercraft that may never use that river, would be overkill and a waste of
money. The standard must fit, or be appropriate to, the outcome's need.
Another way of describing a performance standard is as a measurement threshold: the
limit that establishes the point at which successful performance has been
accomplished. Performance standards should:
Performance standards should describe the outcome or output measures but should
not give specific procedures or instructions on how to produce them. When the
government specifies the how-to, the government also assumes responsibility for
ensuring that the design or procedure will successfully deliver the desired result. On the
other hand, if the government specifies only the performance outcome and
accompanying quality standards, the contractor must then use its best judgment in
determining how to achieve that level of performance. Remember that a key PBA tenet
is that the contractor will be entrusted to meet the government's requirements and will
be handed both the batons of responsibility and authority to decide how best to meet
the government's needs. The government's job is then to evaluate the contractor's
performance against the standards set forth in the PWS. The assessment methods
identified in the QASP, together with the contractor's quality control plan, will also help
in evaluating the success with which the contractor delivers the contracted level of
performance.
The Automated Requirements Roadmap Tool (ARRT) is a job assistance tool that
enables users to develop and organize performance requirements into draft versions of
the performance work statement (PWS), the quality assurance surveillance plan
(QASP), and the performance requirements summary (PRS). ARRT provides a
standard template for these documents and some default text that can be modified to
suit the needs of a particular contract. This tool should be used to prepare contract
documents for all performance-based acquisitions for services. The ARRT is available
for download at: http://sam.dau.mil/ARRTRegistration.aspx
The acquisition team may also establish an acceptable quality level (AQL) for a task, if
appropriate. The AQL is a recognition that unacceptable work can happen and that, in
most cases, zero tolerance is prohibitively expensive. In general, the AQL is the
minimum number (or percentage) of acceptable outcomes that the government will
permit; it is a permitted deviation from a standard. For example, in a requirement for taxi
services, the performance standard might be "pick up the passenger within five minutes
of an agreed upon time." The AQL then might be 95 percent; i.e., the taxi must pick up the passenger within five minutes
95 percent of the time. Failure to perform at or above the 95 percent level could result in
a contract price reduction or other action. AQLs might not be applicable for all
standards, especially for some services such as Advisory and Assistance Services
(A&AS) or research and development (R&D) services.
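The taxi-service example above can be checked with a short sketch. The pickup data, the function name, and the exact record format are illustrative assumptions; only the five-minute standard and 95 percent AQL come from the text.

```python
# Minimal sketch of checking observed performance against an AQL.
# Standard: pick up within five minutes of the agreed time. AQL: 95 percent.
# The pickup delays below are made-up sample data.

def meets_aql(delays_min, standard_min=5.0, aql=0.95):
    """Return (observed on-time rate, True if the rate meets the AQL)."""
    on_time = sum(1 for d in delays_min if d <= standard_min)
    rate = on_time / len(delays_min)
    return rate, rate >= aql

delays = [2.0, 4.5, 6.1, 3.0] + [1.0] * 96   # 100 pickups, one of them late
rate, ok = meets_aql(delays)
print(f"{rate:.0%} on time -> {'meets' if ok else 'misses'} the 95% AQL")
```

Falling below the AQL is what would trigger the price reduction or other action described above.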
Once the team has established the AQLs, they should review them:
Traditional acquisition methods have used the term quality assurance to refer to the
functions performed by the government to determine whether a contractor has met the
contract performance standards. The QASP is the government's surveillance document
used to validate that the contractor has met the standards for each performance task.
The QASP describes how government personnel will evaluate and assess contractor
performance. It is intended to be a living document that should be revised or modified
as circumstances warrant. It is based on the premise that the contractor, not the
government, is responsible for delivering quality performance that meets the contract
performance standards. The level of performance assessment should be based on the
criticality of the service or the associated risk, and on the resources available to
accomplish the assessment.
Performance assessment answers the basic question: How are you going to know if it is
good when you get it? Your methods and types of assessment should focus on how you
will oversee the contractor's actual performance and whether it meets the standards set
in the PWS. Completing the assessment elements of the requirements roadmap
ensures that you determine who will assess each performance task, and how, before
you write the PWS. If you develop a task or standard that cannot be assessed, go back
and reconsider or redefine it into one that can be. This section of the roadmap provides
the foundation for your QASP. The QASP is not incorporated into the contract, since
this enables the government to adjust the method and frequency of inspections without
disturbing the contract. An informational copy of the QASP should be furnished to the
contractor.
The Contracting Officer's Representative (COR) plays an essential role in the service
acquisition process and should be a key member of your acquisition team. During the
requirements development process, the COR's input is vital, because the COR will be
living with this requirement during performance. In
accordance with DFARS 201.602-2, a COR must be qualified by training and
experience commensurate with the responsibilities to be delegated, in accordance with
department/agency guidelines.
The method for assessing the contractor's performance must be addressed before the
contract is awarded. It is the responsibility of the COR, as part of the acquisition team,
to assist in developing performance requirements and quality levels/standards, because
the COR will be responsible for conducting that oversight. The number of assessment
criteria and requirements will vary widely depending on the task and standard, the
performance risk involved, and the type of contract selected. Using the requirements
roadmap worksheet (Appendix A) will help ensure that the requirement and assessment
strategies are aligned.
Trend analysis: Trend analysis should be used regularly and continually to assess the
contractor's ongoing performance over time. It is a good idea to build a database from
the data gathered through performance assessment. Additionally, contractor-managed
metrics may provide any added information needed for the analysis. This database
should be created and maintained by government personnel.
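As a sketch of the trend analysis described above, monthly assessment results from such a database might be compared against the period average. The metric name, values, and the simple latest-versus-average rule are all illustrative assumptions.

```python
# Hedged sketch of trend analysis over assessment data gathered each period.
# The metric (on-time rate by month) and its values are made-up examples;
# in practice these would come from the government-maintained database.
monthly_on_time = {
    "Jan": 0.91, "Feb": 0.93, "Mar": 0.95, "Apr": 0.96,
}

def trend(series):
    """Simple trend check: compare the latest value to the period average."""
    values = list(series.values())
    avg = sum(values) / len(values)
    return "improving" if values[-1] > avg else "flat or declining"

print(trend(monthly_on_time))  # improving
```

A real analysis would use more data points and a less crude rule, but the point stands: trends only emerge when assessment results are recorded consistently over time.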
Customer feedback: Customer feedback is firsthand information from the actual users of
the service. It should be used to supplement other forms of evaluation and assessment,
and it is especially useful for those areas that do not lend themselves to the typical
forms of assessment. However, customer feedback information should be used
prudently. Sometimes customer feedback is complaint-oriented, likely to be subjective in
nature, and may not always relate to actual requirements of the contract. Such
information requires thorough validation.
Third-party audits: The term third-party audit refers to contractor evaluation by a third-
party organization that is independent of both the government and the contractor. All
documentation supplied to, and produced by, the third party should be made available
to both the government and the contractor. Remember, the QASP should also describe
how performance information is to be captured and documented; this information will
later serve as past performance information. Effective use of the QASP, in conjunction
with the contractor's quality control plan, will allow the government to evaluate the
contractor's success in meeting the specified contract requirements and delivering the
level of performance agreed to in the contract.
In our case of the navigable river, the method of inspection might be using a boat with
sonar and GPS, thus measuring the channel depth and width from bridge A to Z. The
results would document actual depths and identify where any depths are not compliant
with the standards.
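Assuming the sonar/GPS survey produces simple (station, width, depth) readings, the inspection described above might be sketched as follows; the station labels and measurements are made-up examples.

```python
# Sketch of the river inspection: survey readings of the channel are compared
# to the standard (100 ft wide, 12 ft deep at mean low water), and
# non-compliant locations are reported. All readings are illustrative.
STD_WIDTH_FT, STD_DEPTH_FT = 100.0, 12.0

readings = [  # (station, width_ft, depth_ft) from bridge A toward bridge Z
    ("A+0.5mi", 120.0, 13.2),
    ("A+1.0mi", 105.0, 11.4),   # too shallow
    ("A+1.5mi",  98.0, 12.6),   # too narrow
]

noncompliant = [(s, w, d) for s, w, d in readings
                if w < STD_WIDTH_FT or d < STD_DEPTH_FT]
for station, width, depth in noncompliant:
    print(f"{station}: width {width} ft, depth {depth} ft -- below standard")
```

The output is exactly the documentation the text calls for: actual depths, with any locations not compliant with the standard identified.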
A quality control plan is developed by the contractor for its internal use to ensure that it
delivers the service levels contained in the contract. The quality control plan should be
part of the contractor's original proposal and, in many cases, is incorporated into the
resultant contract. The inspection of services clause requires that the quality control
plan be acceptable to the government.
Invest some time to determine how, and to whom, you will present the contractor's
performance results; most often this is to your leadership and stakeholders. This should
take the form of periodic performance reviews that quickly capture summary
performance results yet also provide drill-down capability when necessary to identify
and resolve performance problems. One way to structure your performance reporting is
to use the key stakeholder outcomes as key performance indicators (KPIs). These
measures are few in number but are supported by the process and sub-process
measures in your PWS. Figure 14.3.1.3.7.F1 illustrates this approach.
                              Figure 14.3.1.3.7.F1. Performance Indicators
Key Performance Indicators (KPIs): Top-level summary metrics that quickly capture
current performance status and link to your stakeholders' desired outcomes.
Process Measures: Capture the overall status of each process area contained in your
PWS.
More on performance reporting is discussed in Step Seven, but remember, in
developing your approach, to make sure that the effort required for collection and
measurement does not exceed the value of the information.
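The KPI structure described above, where a few top-level indicators are supported by PWS process measures, can be sketched as a weighted roll-up. The measure names, scores, and weights below are all assumptions for illustration.

```python
# Illustrative sketch: a few PWS process measures rolled up into one KPI that
# links to a stakeholder outcome. Measure names, scores, and weights are
# made-up examples, not guidebook values.
process_measures = {         # each scored 0-100 for the reporting period
    "help-desk response":  92,
    "ticket resolution":   88,
    "customer feedback":   95,
}
weights = {"help-desk response": 0.4, "ticket resolution": 0.4,
           "customer feedback": 0.2}

kpi = sum(process_measures[m] * weights[m] for m in process_measures)
print(f"Service responsiveness KPI: {kpi:.1f}")  # top-level summary metric
```

The KPI gives leadership the quick summary, while the individual process measures provide the drill-down when a number moves in the wrong direction.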
Market research may reveal that commercially acceptable performance standards will
satisfy the customer, potentially at a lower price. The acquisition team may also
discover that industry standards and tolerances are measured in different terms than
those the customer has used in the past. Rather than inventing metrics or quality or
performance standards, the acquisition team should use existing commercial quality
standards (identified during market research) when applicable. It is generally a best
practice to use commercial standards where they exist, unless the commercial standard
proves inappropriate for the particular requirement. Industry involvement,
accomplished through public meetings, requests for information (RFIs), or draft
requests for proposals (RFPs), will help in finding inefficiencies in the PWS and will also
lead to cost efficiencies that can be achieved through the use of commercial practices.
The PWS is the heart of any service acquisition, and the success or failure of a contract
depends greatly on the quality of the PWS. Ensure you have completed all elements of
the requirements roadmap worksheet, including inspection, before starting to write the
PWS. There is no mandatory template or outline for a PWS. The FAR only requires that
agencies, to the maximum extent practicable:
    •    Describe work in terms of required results rather than how the work is to be
         accomplished or the number of hours to be provided.
    •    Enable assessment of work performance against measurable performance
         standards.
    •    Rely on measurable performance standards and financial incentives in a
         competitive environment to encourage innovation and cost effective methods of
         performing the work.
The roadmap worksheet contains the basic outline for the requirements section of your
PWS. The HLOs and their supporting performance tasks and standards should be the
main component of your PWS. After the introduction and general sections, the nuts and
bolts of your PWS might list the HLOs as 3.1, 3.2, 3.3, and so on. Under HLO 3.1, you
would list the tasks and standards associated with that HLO; for example, 3.1.1 would
be task 1 under HLO 3.1. A task can have multiple performance standards and AQLs
associated with it from your roadmap, such as timeliness and quality. Make sure they
are accurately captured in your PWS.
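As a sketch, the numbering scheme just described could be generated mechanically from roadmap entries; the HLO and task text below are placeholders, not content from the guidebook.

```python
# Sketch of generating the PWS requirements-section numbering (3.1, 3.1.1, ...)
# from roadmap entries. HLO and task text are illustrative placeholders.
hlos = [
    ("Keep the river navigable",
     ["Dredge the channel", "Survey and document channel depths"]),
    ("Maintain vehicles",
     ["Perform scheduled maintenance"]),
]

lines = []
for i, (hlo, tasks) in enumerate(hlos, start=1):
    lines.append(f"3.{i} {hlo}")                 # HLO paragraph: 3.1, 3.2, ...
    for j, task in enumerate(tasks, start=1):
        lines.append(f"3.{i}.{j} {task}")        # task paragraph: 3.1.1, ...
print("\n".join(lines))
```

Generating the outline from the roadmap, rather than writing it freehand, helps ensure every task and standard from the worksheet actually lands in the PWS.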
14.3.1.5.1. Format
There is no mandatory format for a PWS; however, one normally includes the following:
1. Introduction: The introduction should capture the importance of your mission and
how this requirement contributes to the overall mission of your organization. It describes
your overall acquisition vision and desired mission results, and it sets your expectations
of contractor performance in terms of teamwork and improving mission results through
efficiencies and process improvements. Keep this section focused and relatively brief,
but capture the importance of achieving mission results and your performance
expectations.
2. Background: This section briefly describes the scope of the performance requirement
and the desired outcome. Provide a brief historical description of the
program/requirement that provides the context for this effort (include who is being
supported and where). Describe the general desired outcomes of your new
requirement. Consider that a contractor will have a greater chance at success with
adequate information that clearly defines the magnitude, quality, and scope of the
desired outcomes.
7. Task Orders: If task orders will be used, you need to address their use and ensure
each task order has a well-written PWS that includes HLOs, tasks, standards, data
deliverables and incentives as appropriate. Task descriptions should clearly define each
deliverable outcome. Subtasks should be listed in their appropriate order and should
conform to the numbering within the basic PWS from which the task order derives. All
task orders must capture performance assessments gathered using the task order
QASP. Each task order should have a trained COR assigned.
However, the team can adapt this outline as appropriate. Before completing the PWS,
conduct final reviews: be sure your team examines every performance requirement
carefully and deletes any that are not essential. Many agencies have posted examples
of a PWS that can provide some guidance or helpful ideas. Because performance-
based acquisition is tied to mission-unique or program-unique needs, keep in mind that
another agency's solution may not be an applicable model for your requirement.
    •    The purpose of defining your requirement as high-level objectives and tasks is to
         encourage innovative solutions. Don't specify the requirement so tightly that you
         get the same solution from each offeror. If all offerors provide the same solution,
         there is no creativity and innovation in the proposals.
    •    The acquisition team must move beyond less efficient approaches to buying
         services (time-and-materials or labor-hour) and challenge offerors to propose
         their own innovative solutions. Specifically, specifying labor categories,
         educational requirements, or the number of hours of support required should be
         avoided, because these are "how to" approaches. Instead, let contractors propose
         the best people with the best skill sets to meet the need and fit the solution. The
         government can then evaluate the proposals based both on the quality of the
         solution and on the experience of the proposed personnel.
    •    Prescribing manpower requirements limits the ability of offerors to propose their
         best solutions, and it could preclude the use of qualified contractor personnel
         who may be well suited to performing the requirement but may lack, for
         example, a complete college degree or the exact years of specified experience.
    •    Remember that how the PWS is written will either empower the private sector to
         craft innovative solutions or stifle that ability.
The most important writing-style guidelines are summarized below:
Style: Write in a clear, concise, and logical sequence. If the PWS is ambiguous, the
contractor may not interpret your requirements correctly, and courts are likely to side
with the contractor's interpretation of the PWS.
Sentences: Replace long, complicated sentences with two or three shorter, simpler
sentences. Each sentence should be limited to a single thought or idea.
Vocabulary: Avoid using seldom-used vocabulary, legal phrases, technical jargon, and
other elaborate phrases.
Paragraphs: State the main idea in the first sentence at the beginning of the paragraph
so that readers can grasp it immediately. Avoid long paragraphs by breaking them up
into several, shorter paragraphs.
Abbreviations: Define abbreviations the first time they are used, and include an
appendix of abbreviations for large documents.
Use shall, not will: The term shall specifies that a provision is binding and usually
references the work required to be done by the contractor. The word will expresses only
a declaration of purpose or intent.
Be careful using any or either: These words imply a choice in what needs to be done
contractually. For instance, the word any means in whatever quantity or number, great
or small, which leaves the choice at the discretion of the contractor.
Don't use and/or: The two words together are ambiguous; they can mean that both
conditions are true or that only one is.
Avoid the use of etc.: The reader would not necessarily have any idea of the items that
could be missing.
Ambiguity: Avoid the use of vague, indefinite, uncertain terms and words with double
meanings.
    •    Does the PWS avoid specifying the number of contractor employees required to
         perform the work (except when absolutely necessary)?
    •    Does the PWS describe the outcomes (or results) rather than how to do the
         work?
    •    What constraints have you placed in the PWS that restrict the contractor's ability
         to perform efficiently? Are they essential? Do they support your vision?
    •    Does the PWS avoid specifying the educational or skill level of the contract
         workers (except when absolutely necessary)?
    •    Can the contractor implement new technology to improve performance or to
         lower cost?
    •    Are commercial performance standards utilized?
    •    Do the performance standards address quantity, quality and timeliness?
    •    Are the performance standards objective, easy to measure, and timely?
    •    Is the assessment of quality a quantitative or qualitative assessment?
    •    Will two different CORs come to the same conclusion about the contractor's
         performance based on the performance standards in the PWS?
    •    Are AQLs clearly defined?
    •    Are the AQL levels realistic and achievable?
    •    Will the customer be satisfied if the AQL levels are exactly met? (Or will they only
         be satisfied at a higher quality level?)
    •    Are the persons who will perform the evaluations identified?
The heart of your QASP comes directly from your roadmap. It addresses each HLO and
its tasks, with their associated standards, and includes the methods and types of
inspection (who will do the inspection, how the inspections are to be conducted, and
how often). Numerous organizations use the term performance requirements summary
(PRS), while others incorporate the standards within the PWS. Either way, what matters
is that the HLOs and tasks are tied to the standards in the resultant contract.
Recognize that the methods and degree of performance assessment may change over
time in proportion to the evaluator's level of confidence in the contractor's performance.
Like the PWS, the QASP has no required format; a suggested format is shown below:
    •    Purpose
    •    Roles and Responsibilities
    •    Performance Requirements and Assessments
    •    Objective, Standard, AQL, Assessment Methodology
    •    Assessment Rating Structure Outline (1 to 5)
    •    Performance Reporting - establish reporting frequency to leadership
    •    Metrics
    •    Remedies used and impacts
    •    CPARS Report
    •    Attachments
    •    Sample Contract Deficiency Report
    •    Sample Performance Report Structure
    •    Is the value of evaluating the contractor's performance on a certain task worth the
         cost of surveillance?
    •    Has customer feedback been incorporated into the QASP?
    •    Has random or periodic sampling been utilized in the QASP?
    •    Are there incentives to motivate the contractor to improve performance or to
         reduce costs?
    •    Are there disincentives to handle poor performance?
    •    Will the contractor focus on continuous improvement?
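The random or periodic sampling raised in the checklist above can be sketched in a few lines. This is an illustrative sketch only; the transaction structure, sample size, and AQL value are assumptions, not guidance from this chapter.

```python
import random

def sample_inspection(transactions, sample_size, aql, seed=0):
    """Randomly sample completed service transactions and compare the
    observed defect rate against an Acceptable Quality Level (AQL)."""
    rng = random.Random(seed)  # fixed seed so the surveillance plan is repeatable
    sample = rng.sample(transactions, sample_size)
    defects = sum(1 for t in sample if not t["met_standard"])
    defect_rate = defects / sample_size
    return defect_rate, defect_rate <= aql

# Hypothetical month of 500 transactions, 3% of which missed the standard.
month = [{"met_standard": i % 100 >= 3} for i in range(500)]
rate, acceptable = sample_inspection(month, sample_size=50, aql=0.05)
```

In practice the sample size and AQL would come from the standards recorded in the QASP, not be chosen ad hoc as here.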
Determining an accurate IGE can be a challenging task for the acquisition team and will
draw on various skill sets to project demand forecasts for the service. What constraints
do you have in computing your IGE? Cost constraints can limit what you require in the
PWS or Statement of Objectives (SOO). Program scope may also be an issue if it is
difficult to determine exactly what the contractor is being asked to propose. Remember:
if you cannot develop an IGE, how can you expect the contractor to propose based on
the PWS?
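As a minimal illustration of assembling an IGE from a demand forecast, the sketch below totals assumed labor hours, rates, materials, and an indirect burden. Every figure, category name, and factor is hypothetical, chosen only to show the arithmetic.

```python
# Hypothetical labor mix for a notional service requirement; every rate and
# hour figure below is an assumption for illustration, not real pricing data.
labor_mix = [
    {"category": "Site manager", "hours": 1880, "rate": 65.00},
    {"category": "Technician",   "hours": 7520, "rate": 42.50},
]
materials = 25_000.00      # forecasted supplies (assumed)
overhead_factor = 0.12     # assumed indirect burden applied to labor

labor = sum(item["hours"] * item["rate"] for item in labor_mix)
ige_total = labor * (1 + overhead_factor) + materials

print(f"Labor: ${labor:,.2f}  IGE total: ${ige_total:,.2f}")
```

A real IGE would of course rest on documented market research and wage determinations rather than assumed rates.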
This is the point where it would be beneficial to revisit the customer and stakeholders to
ensure everyone is satisfied with the PWS and the way forward. It is typical to have
varying levels of resistance to the team's strategy. The key is to develop an acquisition
team approach to sell the strategy to the customer and stakeholders and then schedule
review cycles.
14.3.2.2.2. Incentives - Recognize the Power of Profit as a Motivator
At this point in the process you should have a well-defined PWS and QASP. Now it's
time to start developing your business strategy to determine the type of contract vehicle,
incentive arrangement, if any, and how you will acquire a contract service provider.
Review your market research results: How competitive is the market? What are the
small business opportunities? Can this service be acquired using FAR Part 12,
Acquisition of Commercial Items? How are other organizations acquiring this type of
service? Is this requirement part of your agency's strategic acquisition initiative? During
market research, did you find another agency's existing contract suitable for use in
supporting this requirement? When reviewing external acquisition options, examine
your agency's external acquisition policies to make sure there are no potential conflicts.
Another important consideration when using another agency’s contract is to clearly
determine who will provide the performance oversight of the resulting contract to ensure
it delivers the required performance results.
If no other viable option is available, you will need to develop a business case that
supports the most effective way to achieve your mission requirements. The
business strategy involves selecting the right contract type, incentive structure and
contractor selection process that will best deliver mission results.
Your acquisition strategy involves several key components: (1) what contract type is
best suited for your requirement; (2) what incentive strategy, if any, to use; and
(3) what method you will use to select a contractor. Developing your strategy must be a
thoughtful, integrated team effort defined by the specifics of your mission requirement.
The FAR does not make any recommendation on the type of contract to be used when
contracting for services. However, the selection of contract type must reflect the nature
of the service requirement and the risks associated with performance. Selection of a
contract type should motivate the contractor to deliver optimum performance. Your
observations during market research provide a good basis for analyzing commercial
practices, the level of competition, and the maturity of the service to guide the selection
of contract type. There are two basic contract types: fixed-price and cost-reimbursable.
Although the FAR provides for the use of time and materials (T&M) contracts under
Part 12 commercial contracts, DoD policy discourages its use; therefore T&M should
only be used in those rare circumstances where it is justified.
As a general rule, contracts for routine services, or efforts involving stable requirements
and manageable performance risk, are normally a fixed-price type. Work must meet
minimum stated performance standards. Service must be delivered within a specified
time and meet the performance standards in the contract. Price should be supported by
robust competition or recent competitive pricing history.
The contract price represents full payment for the work. Exceeding this amount is at the
contractor's own risk and expense. This type of contract is used when technical
requirements and cost can be accurately estimated (i.e., low or predictable risk). It is
also the most appropriate type of contract to use when work can be clearly defined (or
when the requirement is constant with no need for flexibility). The contractor bears full
responsibility for the performance costs and resulting profit (or loss).
Cost type contracts are used when requirements cannot be accurately defined and
performance risk is not easily quantified or managed. These contracts require the
contractor to deliver its best effort to provide the specified service. Reasonable,
allowable, and allocable costs will be reimbursed, up to the total estimated amount
specified in the contract. This amount, an estimate of total costs including fee, serves
as a ceiling that cannot be exceeded without contracting officer approval. When using
a cost type contract, ensure that the contractor has an adequate accounting system
and that government monitoring during performance provides assurance of efficient
methods and effective cost controls. Cost contracts place more risk on the government
because the contractor bears less responsibility for completing the performance
requirement within the established cost ceiling.
Two common types of Cost Plus Fixed Fee (CPFF) contracts for services are either
completion or term.
    •    CPFF Completion: If the contractor fails to complete the contract within time or
         budget, then the government pays only additional costs, but no additional fee to
         complete the effort. This is an incentive since contractors are in business to earn
         fees. This type of CPFF contract is applicable when there is a clearly defined
         result at the outset, but there are considerable unknowns with risks that need to
         be shared.
    •    CPFF Term: This form of CPFF contract allows you to describe the scope of
         work in general terms and the contractor will be required to perform a specified
         level of effort in a given period of time.
Incentives will drive behavior, so one of the keys to effective incentives is recognizing
that the actions of the private sector are motivated by profit. The
government relies on industry to provide customers with products and services. We
have regulations, policies, and procedures that allow industry to be compensated for
these efforts. One contractor was heard to say, "You give us the incentive, we will earn
every available dollar." It is important to understand the cause-and-effect relationship
between contractor performance and the type of incentive used. In other words,
whatever your team decides to incentivize is where the contractor will focus, so your
team needs to ensure you are creating behavior that will deliver the right mission
results.
For example, link the incentive program to high priority or high risk performance
requirements with measurable metrics. Then, incorporate share-in-savings strategies
that reward the contractor for suggesting innovations that improve performance and
reduce total overall cost. Develop an acquisition approach that aligns the interests of
both parties. In other words develop a strategy in which both the contractor and the
government benefit from economies, efficiencies, and innovations delivered during
contract performance. If the incentives are right, and if the contractor and the agency
share the same goals, risk is largely controlled and effective performance is almost the
inevitable outcome. The key to incentives is to make them work for both parties.
Performance Incentives: These incentives relate profit or fee to the results the
contractor achieves against identified cost-based, performance-based, or schedule-
based targets. For example, a large Cost Plus Incentive Fee base operating support
contract contained a provision for sharing, on a 50/50 basis, cost savings generated by
the contractor when actual costs came in under target cost. In each year of a five-year
contract the contractor delivered cost savings, earning additional fee for the contractor
and savings for the installation. This incentive structure also put the contractor's base
fee at risk if performance suffered as a result of cost cutting. Schedule incentives focus
on getting a contractor to exceed delivery expectations in either quality or timeliness.
These can be important on construction or maintenance requirements. They can be
defined in terms of calendar days or months, attaining or exceeding milestones, or
meeting urgent requirements.
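The 50/50 share arrangement described above can be expressed as a simple share-line calculation. The sketch below is illustrative only; the dollar figures and the fee floor/ceiling parameters are assumptions, not terms from the contract cited in the text.

```python
def cpif_fee(target_cost, target_fee, actual_cost, gov_share=0.5,
             min_fee=0.0, max_fee=None):
    """Fee under a Cost-Plus-Incentive-Fee share line: the contractor keeps
    its share of any underrun (and absorbs its share of any overrun),
    bounded by the minimum and maximum fee."""
    contractor_share = 1.0 - gov_share
    fee = target_fee + contractor_share * (target_cost - actual_cost)
    if max_fee is not None:
        fee = min(fee, max_fee)
    return max(fee, min_fee)

# Hypothetical numbers: $10M target cost, $600K target fee, 50/50 share,
# and a $1M underrun earns the contractor an extra $500K of fee.
fee = cpif_fee(10_000_000, 600_000, 9_000_000)   # → 1_100_000.0
```

The same function shows the downside: an overrun erodes fee toward the minimum, which is how the base fee is "at risk" when cost cutting hurts performance.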
Award Fee Contract Arrangements: This type of incentive uses an award fee plan that
contains the criteria for earning the incentive. Generally, award fee contracts should
only be used when objective incentive targets are not feasible for critical aspects of
performance, judgmental standards can be fairly applied, and potential award fees
would provide a meaningful incentive to motivate the service provider to perform.
Small Business Participation Incentives: There will be times when the nature or value of
an acquisition exceeds the ability for small business to be the prime contractor. Large
prime contractors develop subcontracting plans in accordance with FAR 19.702 where
the use of small business provides value to the government. DoD can incentivize prime
contractors to achieve their small business subcontracting goals thus supporting a
healthy industrial base for future competition. One means to this end is to use actual
small business participation as a factor or sub-factor in best value source selections.
Finally, the government needs to follow up and ensure that small businesses featured
in prime contractor proposals as prospective subcontractors actually succeed in
attaining subcontract awards if the prime contractor is awarded the
contract.
The government can incentivize the contractor's performance on just about any
contractual aspect, so long as the incentive provides ultimate benefit to the
government. Ultimately, whatever incentives you prescribe must be based on
predetermined, objective performance standards that you can quantify, measure, and
surveil as needed. The list below provides examples of positive and negative ways to
use incentives:
Positive:
Negative:
    •    When performance is below standard for a given time period, require the
         contractor to re-perform the service at no additional cost to the government.
    •    When performance is below standard for a given time period, x% of the period's
         payment will be withheld or deducted.
    •    When performance is below standard for x consecutive months, increase
         surveillance or contractor reporting.
    •    Document past-performance report card, paying particular attention to
         performance that failed to meet the standard.
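The payment-withholding remedy in the list above amounts to simple arithmetic. A minimal sketch, with the withhold percentage, standard, and invoice amount chosen purely for illustration:

```python
def period_payment(invoice_amount, observed_rate, standard, withhold_pct=0.10):
    """Deduct a fixed percentage of the period's payment when measured
    performance falls below the contract standard (a negative incentive).
    The 10% default is a hypothetical value, not a prescribed figure."""
    if observed_rate < standard:
        return invoice_amount * (1 - withhold_pct)
    return invoice_amount

# 92% observed against a 95% standard triggers the deduction.
payment = period_payment(100_000, observed_rate=0.92, standard=0.95)
```

In a real contract the deduction schedule would be spelled out in the QASP and the payment clause, not hard-coded as here.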
Make sure incentives are realistic and attainable. They must focus on achieving the
service acquisition objectives, taking into account the mission, the key characteristics,
and other unique features of the service. The acquisition team may jointly develop and
negotiate these incentive criteria with contractor(s) and all potential stakeholders so that
all parties buy in to the merits of this approach. Additionally, soliciting stakeholders'
input and feedback will help identify what the customer feels is most important. Understand
that a contractor will not spend a dime to earn a nickel. Here are some best practice
questions the team should address when developing an incentive strategy:
    •    Is the incentive consistent with the mission, goals, and operational requirements?
    •    Will it deliver additional value to the mission?
    •    Which areas of the requirement would benefit most from enhanced performance?
    •    Which areas do not need added incentives (or which areas can do without)?
    •    Is your agency willing to pay more to achieve a level of performance beyond the
         performance standard? Is the incentive affordable?
    •    Is what we want to incentivize measurable?
    •    How accurately can we capture and record performance data?
    •    Is there potential for using cost-sharing?
    •    Will it affect timelines or schedules in a positive way?
    •    Does the strategy work to benefit both parties?
    •    Does the contractor have complete control of performance?
There are two primary best value methods of selecting a contractor. One of the goals of
PBA is to achieve the highest degree of quality and efficiency at a reasonable price.
Best-value source selections allow the government to establish factors used to evaluate
contractor proposal submissions. These two types of selection methods are:
Trade-off Method: This process allows for consideration of technical, past performance,
and cost factors. The contract is awarded to the offeror whose proposal represents the
best value in accordance with the evaluation criteria contained in the RFP. This process
provides for tradeoffs between technical factors and price, allowing the source selection
authority (SSA) to select a contractor that represents the best value rather than simply
the lowest cost.
Lowest Price Technically Acceptable (LPTA) Method: Under this process, award is
made to the lowest-priced offeror whose proposal is judged technically acceptable
against the stated evaluation criteria; tradeoffs between price and non-price factors are
not permitted.
Both methods enable the acquisition team to define evaluation factors to be used in
selecting the successful offeror. The key to successful use of any evaluation factor is to
establish a clear relationship between the PWS, Section L of the solicitation (either
FAR Part 12, Acquisition of Commercial Items, or FAR Part 15, Contracting by
Negotiation), and Section M of the solicitation (Evaluation Factors for Award). The
evaluation factors selected should link clearly with the PWS and represent those areas
that are important to stakeholders or have been identified as high risk during risk
analysis. A good rule of thumb is to look at the roadmap you have completed and
assess which HLOs and tasks are the highest risk, highest priority, or most critical and
should carry the most weight. The Department's standard source selection procedures
are found at DFARS 215.300.
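One common way to operationalize the relative weights announced in Section M is a weighted sum of factor ratings. The factor names, weights, and ratings below are assumptions for illustration only, not values prescribed by the DFARS procedures cited above:

```python
# Illustrative only: factor names, weights, and ratings are assumptions.
weights = {"technical": 0.5, "past_performance": 0.3, "cost": 0.2}

def weighted_score(ratings):
    """Combine factor ratings (0-100) using the relative weights that
    Section M would announce to offerors."""
    return sum(weights[f] * ratings[f] for f in weights)

offer_a = weighted_score({"technical": 90, "past_performance": 80, "cost": 60})
offer_b = weighted_score({"technical": 70, "past_performance": 85, "cost": 95})
```

Note that a real tradeoff decision is a documented judgment by the SSA, not a mechanical score comparison; the arithmetic only supports the analysis.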
14.3.2.3. Allocate Workload Within the Acquisition Team
The key documents are the Acquisition Plan, the Acquisition Strategy, and the Source
Selection Plan. The acquisition plan is prescribed by the FAR and it spells out the
business case for the selected acquisition approach. It utilizes all the information
generated from the planning phase such as the nature of the requirement, risk areas,
customer concerns, and market analysis to support the plan. Acquisition plans for
services must also describe strategies for implementing PBA methods or provide a
rationale for not using them and provide a rationale if contract type is other than firm-
fixed price (FFP). The acquisition plan also communicates the requiring activities
approach to higher approval levels. These authorities will likely ask the following
questions:
    •    Is the plan consistent with current DoD priority and/or policies? (For example,
         providing for full and open competition, small business set-aside competition and
         the appropriate use of fixed-price type contracts.)
    •    Is the plan executable?
    •    Are the top-level objectives appropriate and in the best interest of the
         Government?
The acquisition plan and the acquisition strategy serve as a permanent record of
decisions made regarding the acquisition strategy for future reference. Acquisition
strategies for services are prescribed by DoD Instruction 5000.02 (Enclosure 9).
The source selection plan outlines the membership and evaluation factors, and provides
a description of the evaluation process, including specific procedures and techniques to
be used in evaluating contractor proposals. Both documents require approvals in
accordance with agency procedures.
Issuing a draft RFP is an effective way to get industry feedback. The draft RFP contains
both the requirement and the proposed business strategy that you are contemplating.
You can request feedback on both. Drafts provide any interested party with an
opportunity to provide comments before the actual acquisition process starts. The
government can benefit from this process by considering the industry feedback and how
it could improve the acquisition. It also gives potential contractors an opportunity to get
an early start on planning and proposal development since we often give contractors the
minimum 30 days to prepare a proposal once we issue the formal RFP.
The primary disadvantage is the time required to issue the draft and evaluate industry
comments, so plan accordingly if you anticipate using this very effective technique.
14.4. The Execution Phase
The formal acquisition process starts with the issuance of the final RFP or, if the
acquisition team has determined to use another activity's acquisition vehicle to complete
its action, the issuance of a MIPR. FAR 15.201 states: "After release of the solicitation,
the contracting officer must be the focal point of any exchange with potential offerors."
When specific information about a proposed acquisition that would be necessary for the
preparation of proposals is disclosed to one or more potential offerors, that information
must be made available to the public as soon as practicable, but no later than the next
general release of information, in order to avoid creating an unfair competitive
advantage.
The objective of source selection is to select the offeror whose proposal represents the
best value in accordance with the criteria stated in the RFP. The FAR gives the
government wide latitude in setting the ground rules for how a contractor's proposal will
be evaluated and for setting the basis of award.
The RFP's Section L, Instructions to Offerors, and Section M, Evaluation Factors for
Award, provide industry information regarding how to submit an offer and how it will be
evaluated. Both are critical to a successful acquisition.
Section L of the solicitation is where information and guidance are provided to instruct
offerors how to prepare proposals in response to the solicitation. As previously stated,
the PWS, Section L and Section M all tie together. The PWS describes the requirement.
Section L requests information relating to how the offeror will execute that requirement,
for evaluation purposes. Section M describes how their proposal will be evaluated for
source selection purposes.
14.4.1.2.2. Section L
You must explain in Section L of the RFP the structure in which offerors will submit their
proposals (proposal instructions), and the requirement to specifically address those
areas that will be evaluated and scored or rated during the source selection.
Number of volumes: You should determine how many proposal volumes you want the
contractor to submit. Proposal volumes can consist of technical, quality control plan,
past performance and cost.
Page limits: Technical and business proposals can be very difficult to evaluate because
of their size and bulk, much of which may be caused by repetition. Placing a limit
on the number of pages each proposal may contain reduces this problem. The typical
limit is 50 to 100 pages, but be sure that the technical personnel concur that the
technical and business approaches can be adequately explained within the limits that
have been established.
Font, spacing, and other layout instructions: Instructions for these areas enforce a
certain uniformity of appearance for proposals so evaluators will not be unduly
influenced by a flashy layout, but will find it easier to concentrate on the essentials.
However, do not impose unnecessary restrictions on the contractor's ability to
communicate the necessary information in their proposals (e.g., complicated charts and
graphics).
Section M is uniquely tailored for each procurement and is intended to give offerors
guidance concerning the basis of award. You must explain all the evaluation factors and
significant sub-factors that will be considered in making the source selection along with
their relative order of importance (see FAR 15.304). Section M must clearly state
whether all evaluation factors, other than cost or price, when combined, are significantly
more important than, approximately equal to, or significantly less important than cost or
price.
One of the main challenges in determining best value is assessing performance risk.
This is challenging because the offerors may be proposing different approaches that
can be difficult to compare (an apples to oranges comparison). While Section M of a
solicitation provides the basis for evaluation, there is no precise science to assessing
dissimilar approaches toward fulfilling a PBA requirement.
The PWS, Section L, and Section M must all tie together. The PWS describes the
requirement. Section L requests information relating to how the offeror will execute that
requirement for evaluation purposes and Section M describes how the proposal will be
evaluated for source selection purposes. The following example, Table 14.4.1.2.4.T1.,
describes one piece of a requirement to illustrate the relationship between the three
areas.
                 Table 14.4.1.2.4.T1. Sections L and M Relationship Example
Performance Work Statement: Provide taxi service so that pickup time is within 5
minutes of request time, 95% of the time.
Section L: The offeror shall describe how taxi service will be provided in accordance
with the stated requirement.
Section M: The agency will evaluate the offeror's approach for meeting the standards
for taxi service. The offer will be evaluated for best value, in terms of technical merit
and cost, with additional consideration for the offeror's relevant and recent past
performance (track record).
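Checking the taxi-service standard in Table 14.4.1.2.4.T1 against observed data is straightforward arithmetic. The wait times below are invented for illustration; only the 5-minute/95% standard comes from the example itself:

```python
# Hypothetical pickup wait times (minutes) observed during one surveillance period.
waits = [3, 4, 2, 6, 4, 5, 3, 7, 4, 2, 3, 4, 5, 3, 4, 2, 3, 4, 5, 3]

on_time = sum(1 for w in waits if w <= 5)   # pickups within 5 minutes
rate = on_time / len(waits)
meets_standard = rate >= 0.95               # PWS standard: within 5 minutes, 95% of the time
```

Here 18 of 20 pickups are on time (90%), so the contractor misses the 95% standard for this period, which is exactly the kind of result the QASP assessment would record.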
The FAR mandates that the government assess contractors' past performance and use
this information as a significant evaluation factor in the source selection process. Past
performance data is an influential factor in motivating contractors toward excellence.
The essential premise is that a record of good performance is an important predictor of
future performance. When evaluating a large business's past performance, an
additional factor to consider is how effective it has been in meeting its small business
subcontracting goals.
For those situations where an offeror has no past contract performance or the
performance information is either unavailable or irrelevant, the FAR states that the
offeror may not be evaluated either favorably or unfavorably on the past performance
factor.
These actions vary depending on the dollar value of the acquisition and organization
policy. They may include pre-award surveys, pre-negotiation business clearances and
Congressional notification, depending on the amount of the award.
FAR Subpart 15.5 requires that all unsuccessful offerors be given an opportunity for a
debriefing concerning their proposal and how they can improve their chances in future
procurements. Debriefings are conducted following contract award notification by the
contracting officer and lead technical evaluator from the technical evaluation team.
Once a contractor has been selected, the QASP needs to be updated and finalized. If
any significant changes were made to the performance requirements during the
competition, the QASP needs to be updated to reflect the final requirement. Also
include the contractor's information and the names and roles of its key personnel. If a
quality control (QC) plan was required and evaluated as part of the contractor's
proposal package, the team may consider including the contractor's compliance with
its QC plan as an assessment area in the QASP. Make sure the COR has been
appointed and has completed all required training prior to contract award.
Following contract award, it’s advisable to conduct a "kick-off meeting" or more formally,
a "post-award conference," attended by those who will be involved in contract
performance. This meeting will help both agency and contractor personnel achieve a
clear and mutual understanding of contract requirements and further establish the
foundation for good communications and a win-win relationship. It is very important that
the contractor become part of the team, and that agency and contractor personnel work
closely together to achieve the mission results embodied in the contract.
14.4.2. Step Seven - Performance Management
Steps One through Six have prepared you for this step. Step Seven delivers the
performance results your stakeholders need to successfully support their mission. It’s
not time to declare victory and move on. Your engagement with your contractor and
stakeholders will often cover several years.
There are two key elements to this step. First are the basic functions of administering
the contract such as validating contractor invoices, tracking cost data when required,
managing change as it occurs and making sure the contractor is getting paid on a timely
basis. The second key function is managing the relationship and expectations among
three key groups: customers, stakeholders, and the contractor. Developing an
environment of trust and fair play is vital to keeping all parties focused on achieving the
intended mission results. This includes assessing performance using the QASP,
documenting performance for any incentive arrangement you may have created, and
finally making sure performance is documented annually in the government past
performance database with a fair and objective Contractor Performance Assessment
Reporting System (CPARS) report.
14.4.2.1. Transition to Performance Management
As new contract performance starts, the team must shift from acquisition to performance
management. This means focusing on ensuring that the performance results contained
in the contract are delivered. To accomplish this effectively, everyone involved must
clearly understand their roles and responsibilities in completing the assessment
strategies contained in the QASP. The two key responsible parties are the contracting
officer and the COR.
The duties and responsibilities of the COR are contained in a designation letter signed
by the contracting officer. Make sure the COR and anyone else involved with monitoring
contract performance has read and understands the contract and has the training,
knowledge, experience, skills, and ability to perform his/her roles. The COR must know
the performance requirements and standards in depth and understand the assessment
strategies contained in the QASP. The COR should also be an effective communicator
with good interpersonal skills.
If your requirement involves a contract vehicle that uses task orders for individual
requirements, make sure you develop a plan to capture performance at the task order
level. This task order performance information should flow up to the contract level to be
captured and reported.
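The roll-up described above can be sketched as follows. This is an illustrative example only: the assessment areas, field names, and the 1-5 rating scale are assumptions for the sketch, not fields prescribed by the guidebook or CPARS.

```python
from statistics import mean

# Hypothetical task-order assessment records; the areas and the 1-5
# rating scale are illustrative assumptions, not prescribed fields.
task_orders = [
    {"task_order": "TO-001", "quality": 4, "schedule": 5, "cost_control": 4},
    {"task_order": "TO-002", "quality": 3, "schedule": 4, "cost_control": 5},
    {"task_order": "TO-003", "quality": 5, "schedule": 4, "cost_control": 4},
]

def roll_up(orders):
    """Average each assessment area across task orders to produce a
    single contract-level summary for capture and reporting."""
    areas = ["quality", "schedule", "cost_control"]
    return {area: round(mean(o[area] for o in orders), 2) for area in areas}

contract_summary = roll_up(task_orders)
print(contract_summary)
# → {'quality': 4.0, 'schedule': 4.33, 'cost_control': 4.33}
```

The point of the sketch is simply that task-order results are captured individually and then aggregated, so the contract-level report always traces back to task-order data.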
14.4.2.3. Manage Performance Results
Following contract award you should review your communication plan and determine
how and to whom you will report contractor performance information. It’s vital to keep
the communication links open with both your contractor and stakeholders throughout
the performance period of the contract. Establish regularly scheduled meetings with the
contractor to keep everyone informed of pending actions that could impact performance
such as scheduled exercises and IG visits. Discuss any issues the contractor may have
such as invoicing or payment problems. Identifying potential problems early is a key
way to keep them from having performance impacts. Implement the performance
reporting structure you developed in Step Four.
    •    KPIs: The few essential performance results that leadership will use to assess
         performance.
    •    Process Measures: What performance metrics will process stakeholders use to
         determine if their areas of responsibility are meeting standards? Make sure they
         link to specific PWS performance tasks.
    •    Sub-Process Measures: What results metrics will sub-process stakeholders or
         office chiefs use to assess performance in their areas of responsibility, and how
         well is current performance supporting the process stakeholder?
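The three-tier structure above can be pictured as a simple linked hierarchy. In this sketch the metric names, PWS task references, and targets are hypothetical examples invented for illustration, not values from the guidebook; the point is that each lower-tier measure names the PWS task it assesses and the higher-tier result it supports.

```python
# Illustrative three-tier reporting structure; every metric name, PWS
# task reference, and target below is a hypothetical example.
reporting_structure = {
    "kpis": [
        {"name": "Mission capable rate", "target": 0.95},
    ],
    "process_measures": [
        {"name": "Work orders closed on time", "pws_task": "PWS 3.2.1",
         "target": 0.90, "rolls_up_to": "Mission capable rate"},
    ],
    "sub_process_measures": [
        {"name": "Parts requisitions filled within 48 hours",
         "pws_task": "PWS 3.2.1.4", "target": 0.85,
         "supports": "Work orders closed on time"},
    ],
}

def meets_standard(measure, observed_rate):
    """Return True when the observed result meets the measure's target."""
    return observed_rate >= measure["target"]

# A process stakeholder checking this month's result against the standard:
print(meets_standard(reporting_structure["process_measures"][0], 0.92))
# → True
```

Because each measure carries its PWS task reference, a result that misses its standard points directly back to the contract language it assesses.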
How you capture and report performance information is critical for two reasons. First, it
keeps your stakeholders well informed based on actual performance results as
measured by your CORs. Second, it provides the documented performance trends and
results to have an open and honest discussion with your contractor concerning the
results being achieved. Performance reviews should be held on a regular basis with
both your stakeholders and your contractor. The frequency of stakeholder reviews is
often dictated by the importance or complexity of the service under contract. Quarterly
performance reviews with stakeholders should be the minimum; more complex
acquisitions may require monthly reviews.
There should be time in each meeting where the agency asks, "Is there anything we are
requiring that is affecting your performance in terms of quality, cost, or schedule?"
Actions discussed should be recorded for the convenience of all parties, with
responsibilities and due dates assigned. At each review point the QASP should be
reviewed to see if the approach to the inspection should be changed or revamped. If an
objective or standard needs to be changed, both parties should agree to the
modification; note, however, that the change may have a cost impact.
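Recording review actions with owners and due dates, as described above, can be as simple as the following sketch. The fields, role abbreviations, and sample entries are hypothetical illustrations, not a prescribed format.

```python
from datetime import date

# Minimal action-item log for performance review meetings; the fields
# and sample entries are illustrative, not a prescribed format.
action_items = []

def record_action(description, owner, due):
    """Record a review action with an assigned owner and due date."""
    action_items.append({"description": description, "owner": owner,
                         "due": due, "closed": False})

def items_due(as_of):
    """List open items due on or before the given date, oldest first."""
    return sorted((a for a in action_items
                   if not a["closed"] and a["due"] <= as_of),
                  key=lambda a: a["due"])

record_action("Revise QASP sampling rate for help-desk tickets",
              owner="COR", due=date(2013, 10, 15))
record_action("Update incentive metric baseline",
              owner="KO", due=date(2013, 11, 1))

for item in items_due(date(2013, 10, 20)):
    print(item["owner"], "-", item["description"])
# → COR - Revise QASP sampling rate for help-desk tickets
```

Keeping the log shared between the government team and the contractor gives both parties the same picture of what was agreed and when it is due.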
performance and should not be the sole method for reporting it to the contractor.
If you’ve been conducting regular performance reviews with your contractor there
should be no surprises at the end of the performance period about what rating the
contractor will receive. These ratings are very important to a contractor; they can affect
future business opportunities. That’s why you need to have the facts and data to
support ratings above or below satisfactory.
A best practice concerning service contracts is to schedule regular reviews with your
key stakeholders and contractor. Your communication plan should now reflect the
schedule for these reviews. As has been mentioned several times already, good
communication is absolutely essential. That’s why a regularly scheduled review is
important. It provides an opportunity to discuss current performance with the
stakeholder. It also offers a chance to gain insight on projected changes that might
require a change to the current contract. Being proactive is better than the best reactive
strategy.
As performance periods advance, the acquisition team should assess the effectiveness
of the strategy that was originally developed to see if it is still achieving the required
mission results. What should be changed or modified during the next acquisition cycle
to improve mission results? Keep a record of what improvements could be made the
next time because before you know it, it will be time to start the acquisition process all
over again.
Service contracts tend to have performance periods lasting several years. Continuous
improvement should be one of the acquisition team's goals. For example, plan on
regular meetings with the contractor to identify actions both parties can take to improve
efficiency. This might include identifying significant cost drivers and what
improvement actions could be taken. Sometimes agencies require management
reporting based on policy without considering what the cost of the requirement is. For
example, in one contract, an agency required that certain reports be delivered regularly
on Friday. When asked to recommend changes, the contractor suggested that report
due date be shifted to Monday because weekend processing time costs less. This type
of collaborative action will set the stage for the contractor and government to work
together to identify more effective and efficient ways to measure and manage the
performance results over the life of the contract.
Appendix A -- REQUIREMENTS ROADMAP WORKSHEET
SAM is intended to help you get your job done by providing usable tools and templates
to create your performance-based service acquisition requirements. Each of the Wings
on the SAM mall map contains information related to a category of services.
Appendix D -- MARKET RESEARCH RESOURCES
You may find this list of sources helpful when conducting your market research:
Appendix E -- GLOSSARY
HQ Headquarters
MIPR                       Military Interdepartmental Purchase Request
MOU                        Memorandum Of Understanding