This is the accessible text file for GAO report number GAO-11-526
entitled 'Military Readiness: Army and Marine Corps Reporting Provides
Additional Data, but Actions Needed to Improve Consistency' which was
released on June 6, 2011.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as
part of a longer term project to improve GAO products' accessibility.
Every attempt has been made to maintain the structural and data
integrity of the original printed product. Accessibility features,
such as text descriptions of tables, consecutively numbered footnotes
placed at the end of the file, and the text of agency comment letters,
are provided but may not exactly duplicate the presentation or format
of the printed version. The portable document format (PDF) file is an
exact electronic replica of the printed version. We welcome your
feedback. Please E-mail your comments regarding the contents or
accessibility features of this document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
United States Government Accountability Office:
GAO:
Report to Congressional Committees:
June 2011:
Military Readiness:
Army and Marine Corps Reporting Provides Additional Data, but Actions
Needed to Improve Consistency:
GAO-11-526:
GAO Highlights:
Highlights of GAO-11-526, a report to congressional committees.
Why GAO Did This Study:
To obtain visibility of the capabilities of its military forces, the
Department of Defense (DOD) has developed an enterprise of
interconnected readiness reporting systems. In 2010, to better meet
the information needs of their leaders, the Army and Marine Corps
implemented new reporting requirements. House and Senate Reports,
which accompanied proposed bills for the National Defense
Authorization Act for Fiscal Year 2011, directed GAO to review recent
readiness reporting changes. GAO assessed the extent that 1) current
readiness reporting policies have affected the content of readiness
information provided to decision makers, 2) the services have
consistently implemented their new policies, and 3) changes to the
Army, Marine Corps, and Office of the Secretary of Defense (OSD)
systems have affected the Defense Readiness Reporting System (DRRS)
enterprise. GAO analyzed DOD, Army, and Marine Corps policies,
readiness data, service readiness reporting systems, and spoke to
headquarters officials and reporting units.
What GAO Found:
Current Army and Marine Corps guidance has generally improved the
quantity and objectivity of readiness information available to
decision makers. As in the past, Army Regulation 220-1 and Marine
Corps Order 3000.13 direct units to report on two types of missions--
the core missions for which units were designed as well as any other
missions they may be assigned, but recent changes to the guidance also
added new requirements. Units must now provide objective personnel
and equipment data to supplement commanders' assessments of their units'
assigned mission capabilities. The updated service guidance also
provides additional criteria, which are intended to help unit
commanders consistently assess their units' mission capabilities. The
new data and additional mission assessment criteria improve the
objectivity and consistency of readiness information provided to
decision makers. However, to clearly identify units that recently
returned from deployment, the Army regulation now requires units to
uniformly report a specific service-directed readiness level rather
than assess and report the unit's actual readiness level. As a result,
decision makers lack a complete picture of the readiness of some units
that could be called upon to respond to contingencies.
While the Army and Marine Corps have taken steps to implement the
revised readiness reporting guidance, units are inconsistently
reporting readiness in some areas. GAO site visits to 33 Army and 20
Marine Corps units revealed that units were using inconsistent
reporting time frames, and GAO data analysis showed that 49 percent of
Marine Corps reports submitted between May 2010 and January 2011 were
late. Furthermore, units are reporting equipment and personnel numbers
differently, and some units are not linking their two types of mission
assessments as required by current guidance. The federal
standards for internal control state that management must continually
assess and evaluate its internal controls to assure that the control
activities being used are effective and updated when necessary.
However, Marine Corps and Army quality assurance reviews have not
identified all the inconsistencies, and system mechanisms are not
preventing the submission of inconsistent data. Until internal
controls improve, decision makers will continue to rely on readiness
information that is based on inconsistent reporting.
While the DRRS Concept of Operations calls for a family of systems to
exchange information seamlessly under an enterprise framework, DOD and
the services have focused their efforts on the needs of different
users and have not reached agreement on key steps to achieve
interoperability. Consequently, progress has been incremental. In 2009,
GAO issued a report highlighting the challenges facing DRRS and
recommended that DOD use GAO's report and an independent program risk
assessment to redirect the program's approach, structure, and
oversight. As of April 2011, the risk assessment had not been done,
and it is now scheduled to begin in the fall of 2011. Until this
assessment is complete, OSD will continue to lack the information it
needs to reach consensus with the services and make any adjustments
needed to achieve interoperability.
What GAO Recommends:
GAO recommends that the Army develop an alternative means to show
which units recently returned from deployment and that both services
improve internal controls to enhance readiness reporting. DOD did not
concur, citing the availability of other readiness data and actions
taken on internal controls. GAO disagrees that the data DOD cites
provides sufficient visibility; therefore, additional actions are
needed.
View [hyperlink, http://www.gao.gov/products/GAO-11-526] or key
components. For more information, contact Sharon Pickup at (202) 512-
9619 or pickups@gao.gov.
[End of section]
Contents:
Letter:
Background:
Army and Marine Corps Requirements for Readiness Reporting Have
Generally Increased the Quantity and Objectivity of Information
Available to Decision Makers:
Army and Marine Corps Units Have Implemented Revised Guidance for
Readiness Reporting, but Some Reporting Is Inconsistent:
OSD and the Services Continue to Develop Their Respective Systems in
the Enterprise:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Scope and Methodology:
Appendix II: Comments from the Department of Defense:
Appendix III: GAO Contact and Staff Acknowledgments:
Tables:
Table 1: Marine Corps Assessment Correlations:
Table 2: Readiness Reporting Time Frames for Core and Assigned
Missions of Army and Marine Corps Units and Installations:
Table 3: Installations and Units Visited:
Figures:
Figure 1: Phases of ARFORGEN:
Figure 2: Marine Corps Readiness Reports: Total and 31 Days or More
Since Last Report:
Figure 3: Marine Corps Units Submitting Late Reports: June 2010
through January 2011:
Figure 4: Army C-4 Units Reporting Yes for Capability Assessments:
Figure 5: Percentage of Inconsistent Assessments among Marine Corps
Units:
Abbreviations:
ARFORGEN: Army Force Generation:
DOD: Department of Defense:
DRRS: Defense Readiness Reporting System:
OSD: Office of the Secretary of Defense:
[End of section]
United States Government Accountability Office:
Washington, DC 20548:
June 3, 2011:
Congressional Committees:
In an era of persistent conflict and global uncertainty, the
President, Congress, and military and civilian leaders within the
Department of Defense (DOD) need visibility over the readiness of
DOD's forces. Over the years, the services, Joint Staff, and Office of
the Secretary of Defense (OSD) have relied on readiness information
from a variety of systems to help them guide, prepare, and deploy
forces for regular as well as nontraditional assigned missions. In the
1990s, readiness reporting systems captured unit commanders'
assessments of their unit's capabilities to execute the unit's regular
missions. These assessments were supported by underlying data that
compared on-hand personnel and equipment levels to required levels,
data concerning the material condition of on-hand equipment, and
assessments of unit training.
In 1999, Congress directed DOD to create a comprehensive readiness
reporting system to measure in an objective, accurate, and timely
manner the capability of the armed forces to carry out the National
Security Strategy prescribed by the President, the defense planning
guidance provided by the Secretary of Defense, and the National
Military Strategy prescribed by the Chairman of the Joint Chiefs of
Staff. In response, DOD is developing a family of interconnected
information systems that build upon existing processes and readiness
assessment tools to establish a capabilities-based, readiness
reporting system--referred to as the Defense Readiness Reporting
System (DRRS) Enterprise. DRRS-Army and DRRS-Marine Corps are two of
the interconnected information systems within the enterprise. These
two systems provide service leaders with the detailed information
necessary to execute their Title 10 responsibilities to man, train,
and equip their forces, and are also used to provide information to
department and congressional leaders through the OSD DRRS-Strategic
system and the Chairman of the Joint Chiefs of Staff's Global Status
of Resource and Training System. To better meet the information needs
of their service leaders, in 2010 the Army and Marine Corps
implemented new readiness reporting guidance. Among other things, the
changes in guidance affected the ways units reported their
capabilities to perform assigned missions.
House Report 111-491[Footnote 1] and Senate Report 111-201,[Footnote
2] which accompanied proposed bills for the National Defense
Authorization Act for Fiscal Year 2011, directed GAO to review the
readiness reporting changes of the Army and Marine Corps. Accordingly,
we assessed (1) the extent to which current readiness reporting
requirements have affected the content of readiness information
provided to various decision makers within and outside of DOD, (2)
the extent to which the Army and Marine Corps have consistently
implemented their current readiness reporting guidance, and (3) how
system developments for the DRRS-Strategic, DRRS-Army, and DRRS-Marine
Corps have affected the enterprise.
To determine the extent to which the current readiness reporting
guidance has affected the content of information provided to various
decision makers, we analyzed Army, Marine Corps, Joint Staff, and OSD
guidance as well as laws requiring readiness reports from DOD. We also
discussed the guidance with officials from the OSD, the Office of the
Assistant Secretary of the Army, Headquarters Marine Corps, the office
of the Joint Chiefs of Staff, as well as other Army and Marine Corps
service officials. Finally, we compared the information that was
provided to key decision makers prior to the changes in reporting
requirements to the information that is currently provided. To
determine the extent to which the Army and Marine Corps have
consistently implemented their current readiness reporting guidance, we
analyzed readiness reporting data from both Army and Marine Corps
units. Specifically, we compared the reporting requirements of the
respective services to the information units were reporting. We
assessed the reliability of the DRRS data and determined that the data
were sufficiently reliable for the purposes of assessing the
consistency of the implementation of the current readiness reporting
policies; we discuss our findings in the report. To gain a better
understanding of the readiness reporting process and how the readiness
reporting guidance is being implemented, we judgmentally selected Army
and Marine
Corps locations where we could meet with a variety of different types
of reporting units. Specifically, we visited 5 Army locations, where
we met with key officials from 33 Army units, and 2 Marine Corps
installations, where we met with key officials from 20 Marine Corps
units. We visited a wide variety of units to see if factors such as
unit type, size, location, component, or placement within the
deployment cycle were affecting the units' reports and implementation
of the new guidance.
We conducted this performance audit from August 2010 to June 2011, in
accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe
that the evidence obtained provides a reasonable basis for our
findings and conclusions based on our audit objectives. Appendix I
provides a more detailed description of our scope and methodology.
Background:
DOD's Family of Readiness Reporting Systems:
DOD established the Defense Readiness Reporting System in response to
the Strom Thurmond National Defense Authorization Act for Fiscal Year
1999.[Footnote 3] Later, DRRS evolved into the DRRS Information
Technology Enterprise Environment (enterprise). The enterprise
represents a family of service and OSD computer information systems
and selected databases. It is intended to capture DOD readiness data
from multiple sources and provide relevant elements of these data to
decision makers. Specifically, the enterprise will report assessments
of both capabilities and of training and resources. The DRRS-Army,
DRRS-Marine Corps, and DRRS-Navy systems currently provide information
and data within and outside of the enterprise. This report focuses on
the DRRS-Army and DRRS-Marine Corps systems and the relationship of
those systems to the DRRS-Strategic system.
To provide governance of the family of reporting systems that make up
the enterprise, the Under Secretary of Defense - Personnel & Readiness
established a three-tier structure. This three-tier structure is to
enhance communication between the development community, represented
by the DRRS Implementation Office and system contractors, and the user
community (which includes the Joint Staff, military services, and
combatant commands).[Footnote 4]
Representatives from the office of the Under Secretary of Defense -
Personnel & Readiness and the Joint Staff currently serve as co-chairs
of all three governance tiers.
* Tier One: This involves the DRRS Battle Staff, which is composed of
colonels, Navy captains, and similar-graded civilians. It tracks DRRS
development and identifies issues with the system.
* Tier Two: This level involves the DRRS General and Flag Officer
Steering Committee, which discusses issues raised by the Battle Staff
(Tier One). Members are one-star generals or admirals, or civilian
equivalent.
* Tier Three: The DRRS Executive Committee is chartered to review and
approve proposals and plans to establish policy, processes, and system
requirements for DRRS, including approving software development
milestones required to reach objectives. This committee is composed of
three-star military officers and their civilian counterparts. It is
chaired by the Director of the Joint Staff and the Under Secretary of
Defense - Personnel & Readiness.
Two Types of Mission Assessments:
DOD units assess their readiness for their core[Footnote 5] and
assigned[Footnote 6] missions using two different types of mission
assessments. OSD requires that units assess their readiness using
capability measures. The Chairman of the Joint Chiefs of Staff
requires that units assess their readiness using resource and training
metrics. In September 2006, the Army designated its readiness
reporting system as DRRS-Army; Marine Corps units began
reporting in DRRS-Marine Corps in May 2010. Both are part of the
family of information systems that make up the DRRS enterprise. Both
systems collect unit assessments that address the statutory reporting
requirements of OSD and the Chairman of the Joint Chiefs of Staff. The
systems are also designed to meet the services' reporting
requirements, as well as to collect other service-specific
information from reporting units.
The first type of unit assessment is commonly referred to as a
capability assessment. For these assessments commanders use Yes,
Qualified Yes, or No categories to rate their units' capabilities to
perform a core or assigned mission. The three categories are discussed
below:
* Yes or Y--the unit or organization can accomplish its tasks or
missions under specified conditions.
* Qualified Yes or Q--the organization is expected to accomplish the
task to standard under most conditions, but this performance has not
been observed or demonstrated in training or operations; although data
may not support a "yes," the commander believes the organization can
accomplish the rated task or mission to prescribed standards under
most conditions.
* No or N--the organization is unable to accomplish the task to
standard at the time of the assessment.
The second type of unit assessment, which is based on resources and
training metrics, is commonly referred to as a C-rating. Specifically,
C-ratings are based on personnel, equipment and supplies on hand,
equipment readiness/serviceability, and training measures, and range
from C-1 through C-5, as described below.
* C-1--unit possesses the required resources and is trained to
undertake its full core mission.
* C-2--unit possesses the required resources and is trained to
undertake most of its core mission.
* C-3--unit possesses required resources and is trained to undertake
many, but not all, portions of its core mission.
* C-4--unit requires additional resources or training to undertake its
core mission.
* C-5--unit is undergoing directed resource action and is not prepared
to undertake its core mission.[Footnote 7]
Sources and Users of Readiness Information:
DOD units report their readiness on a monthly basis if they execute
missions in support of the combatant commanders and service-assigned
missions. For the Army, approximately 6,000 units--including active-
duty and reserve component units not on active duty--report into the
DRRS-Army database. Reporting units range in size from small
detachments of 4 soldiers to larger combat units such as 5,000-soldier
brigade combat teams. For the Marine Corps, approximately 350 units--
including combat, combat support, and combat service support units--
report their readiness into the DRRS-Marine Corps database. Marine
Corps reporting units range in size from approximately 50 to 1,600
Marines. The number of reporting units can vary each month due to the
creation and dissolution of units and the requirement for units to
submit multiple reports in a month if significant changes occur.
Once reported, the readiness information and data inform a wide range
of decision makers as identified in laws, directives, and guidance,
including a DOD directive, Chairman of the Joint Chiefs of Staff
Instruction, Secretary of Defense Memorandums, and service regulations
and messages. Users of readiness data include Congress, the Secretary
of Defense, the Chairman of the Joint Chiefs of Staff, the combatant
commanders, the Secretaries of the military departments, and the Chief
of the National Guard Bureau. On a quarterly basis, readiness
reporting data from all services and combatant commands are combined
to create the Joint Forces Readiness Review and the Quarterly
Readiness Report to Congress. In addition, decision makers can use the
readiness data to support operation and campaign plans, determine the
readiness of units to respond to unexpected contingencies, or analyze
the top resource shortfalls affecting the units.
Army and Marine Corps Requirements for Readiness Reporting Have
Generally Increased the Quantity and Objectivity of Information
Available to Decision Makers:
Revised Army and Marine Corps Guidance Requires New Objective Measure
of Readiness:
The current Army and Marine Corps readiness reporting requirements
generally have increased the quantity and objectivity of readiness
information provided to decision makers. The updated version of Army
Regulation 220-1,[Footnote 8] implemented in April 2010, directs units
to provide information concerning their assigned missions, in addition
to what was previously required. Specifically, it requires that units
report objective personnel and equipment data to support the
commander's overall subjective assessment of the unit's capability to
perform its assigned mission, referred to as an A-level. Previously,
Army guidance required only the commanders' subjective assessment of
their unit's capabilities without any supporting personnel or
equipment data, which at that time was called a percent-effective
rating.
In July 2010, the Marine Corps also revised its guidance for reporting
unit readiness for assigned missions.[Footnote 9] The service's
percent-effective rating was preserved in name, but the updated
guidance, Marine Corps Order 3000.13, directed commanders to report
their readiness for assigned missions using the same types of
personnel, equipment, and training measurements they use to assess
their core missions. The previous rating method, which was similar to
the Army's, relied on a commander's subjective assessment and required
no specific data to be reported. The Marine Corps' new Order also
directs commanders to correlate their unit core and assigned resources
with their capability assessments, as shown in table 1 below. For
example, the Order requires that commanders who rate their units as C-
3 also assess their unit capabilities as "No," indicating that the unit
is not able to accomplish its regular mission at this time. Marine
Corps officials told us that the Marine Corps only deploys units rated
C-1 or C-2. Marine Corps officials stated that the
core resource and training mission assessments (C-ratings) do not align
perfectly with the three-tier capability assessments (Yes, Qualified
Yes, and No ratings), but the relationship provides additional,
helpful information for decision makers.
Table 1: Marine Corps Assessment Correlations:
Core and assigned mission assessment training and resources: C-1 (unit
can undertake full missions);
Core and assigned capability assessment: Yes or qualified yes.
Core and assigned mission assessment training and resources: C-2 (unit
can undertake most of the missions);
Core and assigned capability assessment: Yes or qualified yes.
Core and assigned mission assessment training and resources: C-3 (unit
can undertake many, but not all, portions of the missions);
Core and assigned capability assessment: No.
Core and assigned mission assessment training and resources: C-4 (unit
requires additional resources or training to undertake missions);
Core and assigned capability assessment: No.
Core and assigned mission assessment training and resources: C-5 (unit
is not prepared to undertake missions);
Core and assigned capability assessment: No.
Source: Marine Corps Order 3000.13.
[End of table]
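The correlation requirement in table 1 amounts to a simple validation
rule that a reporting system could check automatically. The following
sketch, in Python, shows one way such a check might be encoded; the
rating abbreviations (Y, Q, N) follow the report's descriptions, but
the data structure and function names are illustrative assumptions,
not part of the actual DRRS-Marine Corps system:

# Hedged sketch: the table 1 correlations between C-levels and
# capability assessments (Y = Yes, Q = Qualified Yes, N = No).
ALLOWED_CAPABILITY = {
    "C-1": {"Y", "Q"},  # unit can undertake full missions
    "C-2": {"Y", "Q"},  # unit can undertake most of the missions
    "C-3": {"N"},       # unit can undertake many, but not all, portions
    "C-4": {"N"},       # unit requires additional resources or training
    "C-5": {"N"},       # unit is not prepared to undertake missions
}

def is_correlated(c_level, capability):
    # True when the capability assessment matches the reported C-level.
    return capability in ALLOWED_CAPABILITY.get(c_level, set())

# A C-2 unit may report Yes or Qualified Yes, but a C-3 unit reporting
# Qualified Yes would be flagged as inconsistent.
assert is_correlated("C-2", "Q")
assert not is_correlated("C-3", "Q")

[End of example]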
While the Army and Marine Corps updated their readiness reporting
guidance in 2010 to include more objective assigned mission ratings,
the services' guidance retained many of the previous reporting
requirements for core missions, such as the requirements to report
personnel and equipment information, training assessments, commander
comments that provide additional information about their unit's
reported resources, and installation readiness reports. Reporting time
frames also remained unchanged and are shown in table 2.
Table 2: Readiness Reporting Time Frames for Core and Assigned
Missions of Army and Marine Corps Units and Installations:
Army active and reserve component units:
Reports are to be submitted the 15th of every month, or within 24
hours of a change that affects the unit's overall readiness or
capability level.
Marine Corps active and reserve units:
Reports are to be submitted every 30 days, or within 24 hours of a
change that affects the unit's overall readiness or capability level.
Army and Marine Corps installations:
Reports are to be submitted quarterly for the Army and every 90 days
for the Marine Corps.
Source: GAO analysis of Army Regulation 220-1 and Marine Corps Order
3000.13.
[End of table]
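The time frames in table 2 can likewise be expressed as a small
scheduling rule. The sketch below computes the next regular due date
for a unit report, assuming reports are tracked by service and date of
last submission; the function and its inputs are illustrative
assumptions, and the sketch does not model the separate requirement to
report within 24 hours of a change in status:

from datetime import date, timedelta

def next_report_due(service, last_report):
    # Regular due date for the next unit report, absent a status change.
    if service == "army":
        # Army units report on the 15th of every month.
        year, month = last_report.year, last_report.month
        month = 1 if month == 12 else month + 1
        year = year + 1 if month == 1 else year
        return date(year, month, 15)
    if service == "marine_corps":
        # Marine Corps units report at least every 30 days.
        return last_report + timedelta(days=30)
    raise ValueError(service)

print(next_report_due("army", date(2011, 1, 15)))          # 2011-02-15
print(next_report_due("marine_corps", date(2011, 1, 10)))  # 2011-02-09

[End of example]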
Army and Marine Corps leaders are regularly briefed on the additional
assigned mission metrics the services are now requiring their units to
report--personnel and equipment metrics in the case of the Army and
personnel, equipment, and training metrics in the case of the Marine
Corps. These additional assigned mission readiness metrics are
available to decision makers outside of the services and have been
included in service briefings to congressional committees. In
addition, the information that is presented in two of the key
readiness reports currently required by law--the Quarterly Readiness
Report to Congress[Footnote 10] and the Joint Forces Readiness
Review[Footnote 11]--has been affected by the services' updated
guidance, which provided additional criteria to help unit commanders
better align their two types of readiness assessments.
Reporting Requirements Vary during the Three Phases of the Army Force
Generation Cycle:
The Army Force Generation (ARFORGEN) model is a rotational readiness
model that is used to synchronize planning and resourcing to generate
trained and ready forces. In ARFORGEN, active and reserve component
units complete a monthly Unit Status Report indicating their current
readiness levels for their core mission and, if directed, their
assigned mission. Active and reserve component units entering the
"Available" phase may deploy to conduct operational missions or may
continue training while remaining available for contingency missions.
Once units have completed their time in the Available phase, they
enter the "RESET" phase. Active units will spend 6 months in the
RESET phase or approximately 16 percent of their overall ARFORGEN
cycle time.[Footnote 12] Units then move into the "Train/Ready" phase,
where there are no prescribed time lengths because the ARFORGEN cycle
is driven by the unit's total length of time deployed.
Figure 1: Phases of ARFORGEN:
[Refer to PDF for image: illustration]
Train/Ready:
Units build increased readiness. Deploying forces train for their
operational mission; contingency forces, those without an operational
mission, train for full-spectrum operations. Active units may be
deployed; reserve-component units may be mobilized.
Available:
Units may or may not deploy. Units that do not deploy may conduct
training or exercises; some units may remain in the phase as
contingency forces. Deployed units will return to RESET upon
redeployment; units that do not deploy will return to RESET after 12
months.
Reset:
Active and reserve-component units remain in this phase for a minimum
6 and 12 months, respectively. Activities include: family
reintegration, block leave, equipment repairs, and individual and
institutional training. This phase often, but not always, follows a
unit deployment to Iraq and Afghanistan.
Source: GAO analysis of Army data.
[End of figure]
As a result of changes to reporting guidance, Army units are reporting
C-5 and T-5 as directed rather than their actual training or actual
overall mission assessment while in the RESET phase. Under the
regulation, unit commanders report their personnel, equipment status,
and equipment readiness during all three phases of the ARFORGEN cycle.
Commanders also report their actual unit training metric and actual
overall core mission assessments (C-rating) during the Train/Ready and
Available phases of the cycle. However, during the RESET phase Army
Regulation 220-1 directs unit commanders to report their overall
mission status as C-5 (when a unit is undergoing directed resource
action and is not prepared to undertake its core mission) regardless
of the units' actual core mission capabilities. According to Army
officials, this change in guidance is intended to provide a means to
show which units are currently in RESET, i.e., by having them all
report C-5. According to both Army and Joint Staff officials, current
business rules within the Chairman of the Joint Chiefs of Staff's
readiness reporting system do not allow units to report a C-5 unless
one of the four measured areas--personnel, equipment status, equipment
readiness, and training--is rated as a 5; the Army has directed its
RESET units to report training as a 5 so they will be able to report
overall status as C-5. In contrast, under the 2006 version of Army
Regulation 220-1 commanders reported their unit training metric and
overall core mission assessments during all phases of the deployment
cycle.[Footnote 13] As units comply with the Army's direction to
report C-5 and T-5 during RESET, decision makers lose visibility over
the unit's actual training and overall readiness status.
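The business rule described above can be illustrated with a short
sketch. Assuming the four measured areas are reported as integer
levels 1 through 5 (an illustrative encoding, not the actual system
schema), the rule that an overall C-5 requires at least one measured
area rated 5 shows why RESET units are directed to report training as
a 5:

# Hedged sketch: an overall C-5 may be reported only if at least one
# of the four measured areas is itself rated 5.
def c5_allowed(personnel, equipment_status, equipment_readiness, training):
    return 5 in (personnel, equipment_status, equipment_readiness, training)

# A RESET unit whose actual measured areas were all, say, 3 could not
# report C-5 (False); directing training to 5 makes C-5 reportable (True).
print(c5_allowed(3, 3, 3, 3))  # False
print(c5_allowed(3, 3, 3, 5))  # True

[End of example]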
Under Title 10 of the U.S. Code, the Secretary of Defense is required
to submit a quarterly report to Congress detailing overall military
readiness.[Footnote 14] For units that received a mission assessment
rating of C-3 or below for any month during the quarter covered by the
report, the report is to include, among other things, information
about the resource area or areas (personnel, equipment and supplies on
hand, equipment condition, or training) that adversely affected the
unit's readiness rating during that quarter. In addition, according to
the Army Force Generation Regulation,[Footnote 15] to manage the total
force in ARFORGEN, the Army must achieve situational awareness of its
forces' readiness status in all force-generation phases, including the
RESET phase. According to the regulation, units in the RESET phase
provide the Army with strategic
flexibility because those units retain their capability to perform
civil support operations or respond to combatant commander
requirements. As a result of the Army's 2010 regulation, decision
makers in DOD and Congress do not have a complete picture of units'
actual training and overall readiness status in RESET to determine
which units have retained their capability to conduct non-wartime-
related missions or respond to combatant commander requirements.
Army and Marine Corps Units Have Implemented Revised Guidance for
Readiness Reporting, but Some Reporting Is Inconsistent:
While the Army and Marine Corps have taken steps to implement the
revised readiness reporting guidance, we identified several areas
where units were inconsistently reporting readiness. The services are
not consistent in selecting and meeting time frames to report
readiness, Army units vary in identifying their status in ARFORGEN,
some units are not linking their resource and training mission
assessments with their capability assessments, and units vary in how
they report resources and capabilities.
Reporting Time Frames Are Inconsistent:
We found that Army and Marine Corps units are using different time
frames when reporting their readiness data. The Marine Corps'
readiness reporting order requires that units submit reports at least
every 30 days, but it does not require a specific reporting date for
its units. From the implementation of DRRS-Marine Corps in May 2010
through January 2011, Marine Corps units submitted a total of 2,838
unit readiness reports. However, 1,395 of these reports (approximately
49 percent) were submitted late (more than 31 days since the last
report). Figure 2 shows the breakdown of the 2,838 reports and 1,395
late reports by month.
Figure 2: Marine Corps Readiness Reports: Total and 31 Days or More
Since Last Report:
[Refer to PDF for image: combined vertical bar and line graph]
Month: June 2010;
Total reports: 336;
Late reports (more than 30 days since last report): 98.
Month: July 2010;
Total reports: 354;
Late reports (more than 30 days since last report): 153.
Month: August 2010;
Total reports: 349;
Late reports (more than 30 days since last report): 190.
Month: September 2010;
Total reports: 361;
Late reports (more than 30 days since last report): 170.
Month: October 2010;
Total reports: 339;
Late reports (more than 30 days since last report): 174.
Month: November 2010;
Total reports: 356;
Late reports (more than 30 days since last report): 200.
Month: December 2010;
Total reports: 372;
Late reports (more than 30 days since last report): 130.
Month: January 2011;
Total reports: 371;
Late reports (more than 30 days since last report): 280.
Source: GAO analysis of DRRS-Marine Corps data.
Note: The figure may understate late reporting because we counted only
the reports that exceeded the 30-day threshold for reporting. We did
not assess the extent to which units submitted reports within 24 hours
of a change of status.
[End of figure]
Figure 3 shows additional details on the range of the 1,395 late
reports. It shows that 784 (approximately 56 percent) of the reports
were 1 to 4 days late, while 80 (approximately 6 percent) of the
reports were more than 30 days late.
Figure 3: Marine Corps Units Submitting Late Reports: June 2010
through January 2011:
[Refer to PDF for image: vertical bar graph]
Days late: 1-4;
Number of reports: 784.
Days late: 5-9;
Number of reports: 255.
Days late: 10-14;
Number of reports: 98.
Days late: 15-19;
Number of reports: 73.
Days late: 20-24;
Number of reports: 65.
Days late: 25-29;
Number of reports: 40.
Days late: 30 or more;
Number of reports: 80.
Source: GAO analysis of DRRS-Marine Corps data.
Note: The figure may understate late reporting because we counted only
the reports that exceeded the 30-day threshold for reporting. We did
not assess the extent to which units submitted reports within 24 hours
of a change of status.
[End of figure]
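Figures 2 and 3 rest on the interval between a unit's successive
reports. A minimal sketch of that calculation appears below, assuming
each record is simply a (unit, submission date) pair; the record
format is an illustrative assumption, not the DRRS-Marine Corps
schema, and, like our analysis, the sketch does not address the
separate 24-hour change-of-status requirement:

from datetime import date
from itertools import groupby

def late_reports(records, threshold_days=30):
    # Return the days late for each report submitted more than
    # threshold_days after the same unit's previous report.
    late = []
    records = sorted(records)  # order by unit, then by date
    for _, reports in groupby(records, key=lambda r: r[0]):
        dates = [r[1] for r in reports]
        for prev, curr in zip(dates, dates[1:]):
            gap = (curr - prev).days
            if gap > threshold_days:
                late.append(gap - threshold_days)
    return late

sample = [("unit-a", date(2010, 6, 1)), ("unit-a", date(2010, 7, 15)),
          ("unit-b", date(2010, 6, 10)), ("unit-b", date(2010, 7, 9))]
print(late_reports(sample))  # [14]: unit-a's second report was 14 days late

[End of example]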
The Army's readiness reporting regulation directs units to report
their readiness on the 15th of each month. While units are permitted
to pull their personnel, equipment, and training data anytime between
the 1st and the 15th of the month, Army Regulation 220-1 requires the
units to project these data elements to the 15th of the month.
According to Army officials, approximately 97 percent of Army units
adhere to this policy requirement of reporting on the 15th. However,
during our visits with Army units, we found that not all units were
projecting their data to the 15th of the month; rather, units began
collecting data on different dates and also submitted their reports
for review by their higher headquarters on different dates:
* Some units stated they prepare their unit status reports the month
prior to the official reporting date. For example, one readiness
reporting official told us he prepared and briefed to the commander
his January unit status report--officially due on January 15, 2011--on
December 14, 2010. The official added that his unit needed to report
early so his unit's data could be incorporated into his higher
command's readiness report.
* Other units stated that they are required to report sometime within
the first week of the month.
* We also found units that stated they create a cut-off date for
extracting personnel and equipment data for the unit status reports.
For example, if the 12th is the last day to pull data and a soldier
becomes nondeployable on the 13th, that information will not be
updated until the following month's unit status report.
* Conversely, officers at other units we visited stated that they
project data for their unit status reports to the 15th of each month,
which is the required procedure according to Army Readiness Division
officials and Army regulation.
Army officials stated that the majority of units are accessing the
authoritative data sources within 15 days of the reporting date, as
required in Army Regulation 220-1. As the length of time between
reports grows, the potential increases that unit readiness will change
and that decision makers will not have access to timely, updated
information.
Army Units Are Not Consistently Reporting Force Generation Status:
During our Army unit visits, we found that units were inconsistently
reporting their status in the Army's three-phase ARFORGEN cycle. Army
Regulation 220-1 directs units to report their Army force generation
phase (i.e., RESET, Train/Ready, or Available) and expeditionary force-
type designation (i.e., contingency, deployment, or ready
expeditionary forces). Officials at some of the units we visited said
they filled in these fields in their monthly readiness report.
However, many units told us they did not know how these force
generation fields in DRRS-Army were determined. Some units stated the
force generation fields rolled over from previous reports. Other units
stated their higher command populated these fields. Additionally, we
spoke with one unit that stated it was not part of the ARFORGEN cycle
and did not report its force generation phase and expeditionary force-
type designation, but a review of DRRS-Army indicated that the unit
actually reported Train/Ready and deployment expeditionary force. Army
Readiness Division officials told us that lower-level units may not
understand or may not receive information on their force-type
designations, and they said there has been informal discussion on
whether there could be a tool to autopopulate this information into
DRRS-Army. As a result of unclear guidance on the force generation
data fields, these specific data may not be consistent among units or
dependable for decision makers.
Units Are Not Consistent in Linking Resources and Training Assessments
with Capability Assessments:
Army and Marine Corps units are not consistently linking their two
types of mission assessments, i.e., their resource and training
mission assessments (the C-levels) with their capability assessments
(Yes, Qualified Yes, and No), as directed by service guidance.
These assessments are linked because units should take into account
their resource levels in assessing their capabilities. Army Regulation
220-1 states that it would be inconsistent and illogical for a unit to
report C-4 (meaning it needs additional resources or training) while
concurrently reporting "Yes" or "Qualified Yes" for its capability
assessments. Furthermore, if commanders report both C-4 and Yes, the
Army Regulation directs them to provide an explanatory comment. We
reviewed a sample of reports from units reporting both C-4 and Yes and
found that some included commander comments while others did not.
Figure 4, below, identifies the
percentage of Army units reporting C-4, by month, that reported Yes.
Figure 4: Army C-4 Units Reporting Yes for Capability Assessments:
[Refer to PDF for image: vertical bar graph]
Month: January 2009;
Percent of reports: 14.2%.
Month: February 2009;
Percent of reports: 14.5%.
Month: March 2009;
Percent of reports: 14%.
Month: April 2009;
Percent of reports: 12.8%.
Month: May 2009;
Percent of reports: 12.9%.
Month: June 2009;
Percent of reports: 13%.
Month: July 2009;
Percent of reports: 13.4%.
Month: August 2009;
Percent of reports: 13.7%.
Month: September 2009;
Percent of reports: 14.4%.
Month: October 2009;
Percent of reports: 13.4%.
Month: November 2009;
Percent of reports: 13.3%.
Month: December 2009;
Percent of reports: 7.6%.
Month: January 2010;
Percent of reports: 15.3%.
Month: February 2010;
Percent of reports: 15.1%.
Month: March 2010;
Percent of reports: 15.4%.
Month: April 2010;
Percent of reports: 14.6%.
Month: May 2010;
Percent of reports: 14.3%.
Month: June 2010;
Percent of reports: 15%.
Month: July 2010;
Percent of reports: 15.5%.
Month: August 2010;
Percent of reports: 15%.
Month: September 2010;
Percent of reports: 17.2%.
Month: October 2010;
Percent of reports: 20.1%.
Month: November 2010;
Percent of reports: 20.4%.
Month: December 2010;
Percent of reports: 20.8%.
Month: January 2011;
Percent of reports: 21.5%.
Source: GAO analysis of DRRS-Army data.
[End of figure]
Marine Corps Order 3000.13 also directs units to correlate their C-
levels with Yes, Qualified Yes, and No assessments. Between May 2010,
when Marine Corps units began reporting in DRRS-Marine Corps, and
January 2011, the percentage of units that did not comply with the
requirement to correlate C-level assessments with capability
assessments ranged from 24 percent to 33 percent. Officials within the
Marine Corps Readiness Branch stated that a partial explanation for
the noncompliance may be misunderstandings among unit commanders.
Figure 5 shows the percentage of units that did not correctly
correlate their two types of readiness assessments.
Figure 5: Percentage of Inconsistent Assessments among Marine Corps
Units:
[Refer to PDF for image: Vertical bar graph]
Month: May 2010;
Percent of reports: 28.8%.
Month: June 2010;
Percent of reports: 26.9%.
Month: July 2010;
Percent of reports: 24.3%.
Month: August 2010;
Percent of reports: 26.1%.
Month: September 2010;
Percent of reports: 25.5%.
Month: October 2010;
Percent of reports: 27.3%.
Month: November 2010;
Percent of reports: 30.3%.
Month: December 2010;
Percent of reports: 33%.
Month: January 2011;
Percent of reports: 30.6%.
Source: GAO analysis of DRRS-Marine Corps data.
[End of figure]
Units Are Inconsistent in Reporting Resources and Capability
Assessments:
We also found inconsistencies in how units report data about their
resources, including the availability of personnel and equipment, and
assessments about their capability. For example, Army units we visited
interpreted the readiness regulation differently and therefore
inconsistently reported personnel data. Some units reported the actual
personnel on hand, whereas other units reported what was included in
their official manning document[Footnote 16] even if that document
differed from the actual personnel on hand. Additionally, Marine Corps
Readiness Branch officials told us that while equipment numbers are
automatically populated into DRRS-Marine Corps, some units have
adjusted the equipment data so that they do not match the
authoritative data source. They stated that only combat logistics
battalions (Marine Expeditionary Unit) and combat logistics companies
may adjust the equipment data, but they added that other units have
improperly adjusted the equipment data as well. These inconsistencies
in reporting resources could affect a unit's personnel and equipment
ratings and, ultimately, its C-level rating; as a result, decision
makers may not receive an accurate reflection of a unit's readiness.
Moreover, we found inconsistencies in capability assessments,
specifically in the "qualified" assessments. For example, officials
from the Marine Corps Readiness Branch told us that a qualified
assessment means that the unit has trained for the mission but has not
been certified, whereas officials at some Marine Corps units told us
that qualified meant that the unit has not yet trained in the mission
but should be able to do the mission. Officials from the DRRS
Implementation Office agreed that "qualified" assessments are not
consistent within or among the services, but said the inconsistency is
acceptable because commanders are expected to include comments
explaining the qualified rating. We found that most commanders are
meeting that expectation. After reviewing a random sample of
commanders' comments from all Marine Corps units that reported a
Qualified Yes in January 2011, we estimate that 94 percent[Footnote
17] included an explanation of the qualified rating.
Internal Controls Are Not Preventing Readiness Reporting
Inconsistencies:
According to federal standards for internal control, management must
continually assess and evaluate its internal controls to assure that
the control activities being used are effective and updated when
necessary. Also, managers need to compare actual performance to
planned or expected results and analyze significant differences. While
the Army and Marine Corps conduct quality assurance reviews, the
reviews are not identifying all the inconsistencies in their units'
reporting methods and in the reported data. For example, some Army
installations have offices that review readiness reports on a monthly
basis, and higher-level commands from both services review the reports
from their subordinates. However, officials from the Army Readiness
Division and Marine Corps Readiness Branch acknowledged that the
quality assurance reviews are incomplete and have not prevented
inconsistent data from being reported. Also, the Army has a management
control evaluation checklist that commanders can use to review their
readiness reports, but Army Readiness Division officials said the use
of this checklist is not required, and some officials we spoke with
said the checklist is not used.
Furthermore, neither the Army nor the Marine Corps has comprehensive
mechanisms in place to prevent the submission of data that fail to
comply with the reporting requirements. Army Readiness Division
officials said these mechanisms could be added to the DRRS-Army system
in the future, but this is currently not a top priority. The Army
officials said there are some warnings in the system, such as one that
asks commanders to explain if they choose both C-4 and Yes. Marine
Corps Readiness Branch officials also said they have warning flags in
their system, but these warnings do not prevent submission of
inconsistent unit status reports. Without further clarifying guidance,
effective quality assurance reviews, or system mechanisms to prevent
the submission of inconsistent information, the services cannot be
assured that they are providing decision makers within and outside of
DOD with timely and consistent readiness reporting data.
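One form such a system technical check could take is a submission-time
validation that flags or rejects reports violating the services'
correlation rules, such as the Army rule that a C-4 report paired with
a Yes or Qualified Yes capability assessment must carry an explanatory
commander comment. The sketch below is illustrative only; the
function, inputs, and message are assumptions, not features of
DRRS-Army or DRRS-Marine Corps:

# Hedged sketch of a submission gate: returns a list of validation
# errors, and an empty list means the report may be submitted.
def validate_submission(c_level, capability, comment):
    errors = []
    if c_level == "C-4" and capability in ("Y", "Q") and not comment.strip():
        # Army Regulation 220-1 directs commanders reporting both C-4
        # and Yes to provide an explanatory comment.
        errors.append("C-4 with Yes/Qualified Yes requires a comment")
    return errors

# This report would be held until the commander adds an explanation.
print(validate_submission("C-4", "Y", ""))                       # one error
print(validate_submission("C-4", "Y", "Trained on loaner set"))  # []

[End of example]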
OSD and the Services Continue to Develop Their Respective Systems in
the Enterprise:
The DRRS Concept of Operations[Footnote 18] calls for a family of
systems to be developed and operated under a single framework to share
information requirements and data elements seamlessly across the
enterprise. Specifically, since we last reported on DRRS in September
2009, DOD and the services have continued to take steps to develop
their respective systems--DRRS-Army, DRRS-Marine Corps, and DRRS-
Strategic. Because the developers have focused on the needs of
different system users, and have yet to reach agreement on key
elements, progress in achieving interoperability among the three
individual systems and across the enterprise has been incremental. In
our 2009 report we noted that a number of issues including unclear
requirements were affecting system development and the system's
ability to display service readiness data.[Footnote 19] On August 2,
2010, DOD issued a memorandum signed by the Deputy Assistant Secretary
of Defense (C3, Space and Spectrum), Deputy Under Secretary of Defense
for Readiness, and the Director, Joint Staff, that addresses DRRS
standards and technical interface specifications for interoperability.
It directed the services to submit plans for implementing the
interoperability and technical standards within 60 days of the date of
the memo and stated that the services should convert to those
standards, in most cases, within 6 months of the signature date of the
memorandum.[Footnote 20] As of April 1, 2011, the Army and Marine
Corps had submitted their plans to OSD but had not fully implemented
them. The Army and Marine Corps project they will achieve
implementation in October 2012 and July 2012, respectively. However,
OSD and the services have not reached consensus on various issues,
including the type of information, specific steps, or time frames for
successfully implementing the standards and plans to increase the
interoperability of the enterprise system.
Our September 2009 report stated that a lack of oversight was also
hindering system development and integration. In commenting on our
report, OSD stated that the DRRS Executive Committee governance
process would continue to provide sustained functional oversight of
the DRRS program. Since our report was issued, the DRRS Executive
Committee has met twice. As of April 2011, the services' plans for
implementing the system interoperability and technical standards
memorandum had not been briefed to the DRRS Executive Committee.
However, the General and Flag Officer Steering Committee, the second
level of the DRRS governance structure, is scheduled to be briefed in
April 2011. In our report, we also recommended that DOD conduct an
independent program risk assessment of DRRS, and use the findings in
our report and the risk assessment to decide how to redirect the
program structure, approach, funding, management, and oversight. The
Enterprise Planning and Investment Business Transformation Agency was
planning to begin its risk assessment in April 2011. However, as we
were finishing this review, the risk assessment was postponed and is
currently scheduled to begin in the fall of 2011. Until this
assessment is completed and presented to the DRRS Executive Committee
for any actions, OSD will not have the information needed to reach
consensus with the services and make any adjustments needed to achieve
interoperability.
Conclusions:
Army and Marine Corps unit readiness information has become
increasingly important as DOD has deployed units, or parts of units,
to provide combatant commanders with needed capabilities. Recent changes
to Army and Marine Corps readiness reporting guidance have improved
both the quantity and the objectivity of assigned mission capability
data available to Congress and DOD decision makers. However, some
readiness data are currently being reported in an inconsistent manner
that diminishes their value to decision makers. Furthermore, Army units
are reporting T-5 and C-5 as directed rather than their actual
training or actual overall mission assessment. Without visibility into
the actual training and overall readiness status of Army units
throughout the entire force generation cycle, decision makers in DOD
and Congress may have
limited information to determine which Army units have the
capabilities to respond to unexpected missions or combatant commander
requirements. Moreover, without additional clarity in guidance,
effective quality assurance reviews, or system mechanisms to prevent
the submission of inconsistent information, the Army and Marine Corps
cannot be assured they are providing decision makers within and
outside of DOD with timely and consistent readiness data.
Recommendations for Executive Action:
To increase the visibility over the capabilities of units in RESET, we
recommend that the Secretary of Defense direct the Secretary of the
Army in consultation with other system developers within the
enterprise to:
* Develop an alternative means of indicating which units are in RESET,
rather than using C-5 to flag those units.
To increase the timeliness and consistency of readiness information
and thus enhance the usefulness of this information to decision
makers, we recommend that the Secretary of Defense direct the
Secretary of the Army and Commandant of the Marine Corps to:
* Provide additional internal controls, which could include
clarifying policy guidance, increasing quality assurance reviews, or
putting system technical checks in place to prevent the submission of
data that do not comply with service readiness reporting requirements
(an illustrative sketch of such a technical check follows below).
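To illustrate the kind of system technical check this recommendation
envisions, the sketch below validates a hypothetical unit status
report before it is accepted. This is a minimal sketch in Python; the
field names, rating codes, consistency rule, and 30-day reporting
window are our assumptions for illustration and are not drawn from the
actual DRRS schema or service business rules.

from datetime import date, timedelta

# Assumed monthly reporting window; the services' actual timelines
# differ by component and are set in their reporting guidance.
REPORT_WINDOW_DAYS = 30

def validate_report(c_level, met_assessment, as_of, submitted):
    """Return a list of reasons a hypothetical unit status report
    should be held for correction rather than accepted."""
    problems = []
    if c_level not in range(1, 6):
        problems.append("C-level must be an integer from 1 to 5")
    if met_assessment not in {"Y", "Q", "N"}:
        problems.append("mission essential task rating must be "
                        "Y (Yes), Q (Qualified Yes), or N (No)")
    # Flag the illogical combination discussed in this report: serious
    # resource shortfalls (C-4) paired with an unqualified Yes on
    # mission essential tasks.
    if c_level == 4 and met_assessment == "Y":
        problems.append("C-4 rating is inconsistent with an "
                        "unqualified Yes; corrected ratings or "
                        "commander comments are required")
    # Reject reports submitted after the assumed window closes.
    if submitted - as_of > timedelta(days=REPORT_WINDOW_DAYS):
        problems.append("report submitted after the reporting "
                        "window closed")
    return problems

# Example: this report would be held on two grounds.
print(validate_report(4, "Y", date(2011, 1, 1), date(2011, 2, 15)))

A check of this kind could run at submission time, returning the list
of problems to the reporting unit so the data can be corrected or the
required commander comments added before the report enters the system.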
Agency Comments and Our Evaluation:
In written comments on a draft of this report, DOD did not concur with
our recommendations. Specifically, DOD did not concur with our
recommendation that the Secretary of Defense direct the Secretary of
the Army to develop an alternative means of indicating which units are
in RESET without using C-5 as a means to flag units in RESET. In its
comments, DOD stated that the use of the "C-5" flag is appropriate and
consistent as the readiness indicator for units in RESET. Further, DOD
stated that the Army is fully aware of the readiness needs for those
units in RESET, and both the Army and DOD enterprise have the
information required to understand the needs and capabilities of those
forces. DOD also noted that the DRRS enterprise provides
visibility into the capabilities of a unit at any phase of the force
rotation cycle, including RESET. Even if an Army unit is reporting a
C-5 assessment, DOD stated that DRRS provides an assessment of the
remaining unit capabilities through the mission essential task list
construct.
We recognize that different types of information beyond C-ratings are
reflected in the DRRS enterprise including mission essential task
assessments. As we noted in the report, DRRS is intended to capture
readiness data from multiple sources and to report relevant data to a
range of decision makers within DOD and the Congress. Specifically, the
enterprise reports assessments of both capabilities (as captured in
mission essential task ratings) and training and resources (as
captured in C-ratings and the associated personnel, equipment, and
training ratings). Decision makers use readiness data, including
C-ratings, to support operation plans, determine the readiness of units
to respond to unexpected contingencies, or analyze resource needs.
However, as a result of the Army's policy change, units in RESET no
longer report their actual C-level or their actual level of training
but rather are directed to report an overall C-5 rating and a T-5 in
training. Without actual C-ratings, decision makers do not have
complete information and now must rely solely on the units' subjective
mission essential task assessments as the means for evaluating their
core mission capabilities.
While we agree, as DOD suggests in its comments, that mission
essential task assessments are a valuable piece of readiness
information, they are not fully independent of C-ratings. To
illustrate, the Army
and Marine Corps have both recently provided their units with guidance
that clarifies the important complementary nature of C-ratings and
mission essential task assessments. Specifically, they are correlated
because a unit should take into account its resource levels in
assessing its ability to perform mission essential tasks. Furthermore,
our report shows that units do not always properly rate their ability
to perform mission essential tasks. For example, on a monthly basis
since January 2009, approximately 15 percent of C-4 reporting units
had mission essential task ratings that were, according to Army
guidance, inconsistent and illogical when compared to their resource
(C-level) ratings. Until Army units in RESET report their actual
C-ratings, decision makers will not have complete information on the
readiness status of units nor will they be able to compare mission
essential task ratings to actual C-ratings to see whether the ratings
are logical and consistent. For these reasons, we do not agree with
DOD's view that using C-5 to flag units in RESET, rather than
requiring units to report their actual readiness status, is an
appropriate and consistent readiness indicator. We therefore continue
to believe the Secretary of Defense should direct the Secretary of the
Army to develop an alternative means of indicating which units are in
RESET without relying on C-5 to flag them.
DOD also did not concur with our recommendation that the Secretary of
Defense direct the Secretary of the Army and the Commandant of the
Marine Corps to provide additional internal controls, which could
include clarifying policy guidance, increasing quality assurance
reviews, or putting system technical checks in place to prevent
submission of data that does not comply with service readiness
reporting requirements. DOD stated that internal controls are
adequate. Specifically, it noted that the Army is currently updating
its unit status reporting process and software applications, and that
these changes will serve to strengthen compliance, promote
consistency, and ensure uniformity of the system. It also noted that
the Marine Corps is completing a plan to modify policy and implement
procedures for improving compliance with readiness reporting ratings,
timelines, and data. During the course of our review, we briefed our
findings to the services, and we are aware that they
are in various stages of taking action. We have not had the
opportunity to evaluate the services' efforts; however, we believe the
description provided by DOD reflects the types of internal controls
covered under our recommendations.
DOD also noted that it did not agree with the report's statement in
the summary that the DRRS program does not have sufficient information
to achieve interoperability among the services and OSD. DOD stated
that the statement does not represent the routine and informed
decisions that are made across OSD and the services. It further stated
that a September 2010 technology assessment found that DRRS-Strategic
had no critical technology roadblocks to system integration and that
the system currently consumes data from the service-unique systems
while continuously working to improve transfer methods. DOD noted that
DRRS-Navy, DRRS-Army, and DRRS-Marine Corps will be able to transfer
data even more efficiently and effectively within the next 18 months.
As stated in our report, we specifically recognize that DRRS has
evolved into an enterprise that represents a family of service and
OSD computer information systems and databases. We further note that
the enterprise is intended to capture readiness data from multiple
sources and that DRRS-Army, DRRS-Marine Corps and DRRS-Navy systems
currently provide information and data to a number of other systems
within and outside of the enterprise. In a September 2009 report, we
identified a number of challenges facing DRRS and concluded that an
independent assessment was needed to assess the program's risk. We
recommended that DOD conduct this assessment and use the results to
redirect the program's approach, structure, and oversight. DOD
concurred with our recommendation and stated that the assessment would
be conducted by the middle of fiscal year 2010. Because of our prior
work and the fact that this assessment has not yet been done, we did
not, as part of the work reflected in this current report, perform a
technical review of the current state of interoperability. Rather, as
stated in the report, we addressed the steps DOD and the services have
continued to take to develop their respective systems since we last
reported in September 2009. Based on our work, we found that OSD and
the services have not reached consensus on various issues, including
the types of information, specific steps, and time frames for
successfully implementing interoperability and technical standards and
plans to increase the interoperability of the enterprise system. Given
that the DRRS Concept of Operations calls for a family of systems to
be developed and operated under a single framework to share
information requirements and data elements seamlessly across the
enterprise, achieving consensus on standards and other aspects needed
to achieve interoperability is critical. Because a risk assessment
includes assessing system vulnerabilities and identifying mitigation
solutions, we continue to believe it would produce information that
could assist DOD in reaching consensus with the services and in making
any adjustments needed to achieve interoperability.
The full text of DOD's written comments is reprinted in appendix II.
We are sending copies of this report to the appropriate congressional
committees, the Secretary of Defense, the Secretary of the Army, and
the Commandant of the Marine Corps. The report will also be available
at no charge on the GAO Web site at [hyperlink, http://www.gao.gov].
If you or your staffs have questions about this report, please contact
me at pickups@gao.gov or (202) 512-9619. Contact points for our
Offices of Congressional Relations and Public Affairs may be found on
the last page of this report. GAO staff who made key contributions to
this report are listed in appendix III.
Signed by:
Sharon L. Pickup:
Director, Defense Capabilities and Management:
List of Congressional Committees:
The Honorable Carl Levin:
Chairman:
The Honorable John McCain:
Ranking Member:
Committee on Armed Services:
United States Senate:
The Honorable Daniel K. Inouye:
Chairman:
The Honorable Thad Cochran:
Ranking Member:
Committee on Appropriations:
United States Senate:
The Honorable Howard McKeon:
Chairman:
The Honorable Adam Smith:
Ranking Member:
Committee on Armed Services:
House of Representatives:
The Honorable Harold Rogers:
Chairman:
The Honorable Norm Dicks:
Ranking Member:
Committee on Appropriations:
House of Representatives:
[End of section]
Appendix I: Scope and Methodology:
To assess the extent to which current readiness reporting requirements
have affected the content of readiness information provided to various
decision makers within and outside of the Department of Defense (DOD),
we interviewed officials from the Department of the Army Readiness
Division and the Headquarters Marine Corps Readiness Branch. We analyzed
Army and Marine Corps 2010 readiness reporting guidance, Army
Regulation 220-1 and Marine Corps Order 3000.13, and compared the
updated guidance to the previous versions of relevant Army and Marine
Corps readiness reporting guidance. Further, we compared the changes
in the services' guidance with DOD Directive 7730.65 and Chairman of
the Joint Chiefs of Staff Instruction 3401.02A to determine if the
service guidance aligned with DOD and Joint Chiefs of Staff readiness
reporting requirements. We also reviewed related readiness reporting
documents, such as Army and Marine Corps readiness briefings to DOD
and Congress. To assess the extent to which the readiness information
available to DOD (Office of the Secretary of Defense (OSD), Chairman
of the Joint Chiefs of Staff, and the services) and the Congress has
changed since 2009, we compared the currently available information to
the information that was previously provided through the statutorily
required reports (Quarterly Readiness Report to Congress and Joint
Forces Readiness Report). We also interviewed officials responsible
for submitting and overseeing readiness reports to determine how
available information has changed.
To assess the extent to which the Army and Marine Corps units have
consistently implemented their current readiness reporting guidance,
we first reviewed the data within each service's respective readiness
reporting systems and compared the data with system criteria--Army
Regulation 220-1 and Marine Corps Order 3000.13. Specifically, within
the Army Readiness Management System (the data output tool for DRRS-
Army) and the Marine Corps Readiness Management Output Tool (the data
output tool for DRRS-Marine Corps), we conducted queries of all
reporting units from January 2009 through January 2011. To further
assess inconsistencies between resource and training mission
assessments (the C-levels) and their capability assessments (Yes,
Qualified Yes, and No), we selected random samples of units from
subsets of the Army and Marine Corps units with potential
inconsistencies. Specifically, we selected a random sample of Army
units that reported C-4 and Yes, and a random sample of Marine Corps
units that reported a Qualified Yes in January 2011. The sample size
and total population of units for each sample are classified. For each
selected unit we reviewed commander comments contained in the
assessment reports and determined whether these comments addressed the
inconsistencies in the assessments. Based on these reviews, we
generated estimates and 95 percent confidence intervals that allow us
to generalize the results to the subsets of Army and Marine Corps
units with potential inconsistencies. We chose January 2009 as the
baseline for our queries because the Army began implementing
significant changes in December 2009. Because the Marine Corps did not
begin using DRRS-Marine Corps until May 2010, our queries of Marine
Corps unit data only provided output from May 2010 through January
2011. When we reviewed samples of data, we drew a statistical random
sample and computed our estimates with 95 percent confidence
intervals.
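As a rough illustration of the estimation described above, the
following Python sketch computes a sample proportion and a 95 percent
confidence interval using a normal approximation with a finite
population correction. Because the actual sample sizes and populations
are classified, the figures here are hypothetical, and GAO's
statisticians may have used a different interval method.

import math

# Hypothetical figures; the real sample sizes and populations are
# classified.
population = 400   # assumed subset of units with potential inconsistencies
sample_size = 60   # assumed number of unit reports reviewed
addressed = 9      # reports whose commander comments addressed the issue

p_hat = addressed / sample_size

# Standard error of a proportion, with a finite population correction
# because the sample is drawn without replacement from a finite subset.
fpc = math.sqrt((population - sample_size) / (population - 1))
se = math.sqrt(p_hat * (1 - p_hat) / sample_size) * fpc

# 95 percent confidence interval via the normal approximation (z = 1.96).
z = 1.96
lower = max(0.0, p_hat - z * se)
upper = min(1.0, p_hat + z * se)
print(f"estimate: {p_hat:.1%}, 95% CI: [{lower:.1%}, {upper:.1%}]")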
We assessed the reliability of the DRRS data presented in this report.
Specifically, the Army and Marine Corps provided information based on
data reliability assessment questions we provided, which included
information on an overview of the data, data collection processes and
procedures, data quality controls, and overall perceptions of data
quality. We received documentation about how the systems are
structured; a data dictionary that includes data element definitions,
descriptions, codes, and values; written procedures in place to ensure
that the appropriate information is collected for each category of
unit readiness; and specific guidelines on the correct classification
of readiness data taken into specific categories. Additionally, we
interviewed the Army Readiness Division and Marine Corps Readiness
Branch to obtain further clarification on data reliability. We
interviewed relevant officials at reporting units to discuss how the
data were collected and reported into the system. We also analyzed
system data for selected data fields. After assessing the data, we
determined that the data were sufficiently reliable for the purposes
of assessing the consistency of the implementation of the current
readiness reporting guidance, and we discuss our findings in the
report.
We met with officials from several installations and units to
complement our data analysis. In choosing which of the Army's over
6,000 reporting units and which of the Marine Corps' approximately 350
reporting units to review, we made a nonprobability selection of
installations with a variety of different unit levels in order to
maximize our coverage of units. For the Army, we reviewed the 10
largest installations, compiling data on the number of units present,
components represented, and Army Force Generation phases represented
at each, and then chose to visit 3 of the 68 reporting installations.
We also chose to visit 1 of 6 reporting Army National Guard
installations. In addition, we met with officials from Army National
Guard units located in Washington, D.C., because the units were close
to our office and the visits did not require any travel costs.
reporting Marine Corps installations and ranked them by number of
units present at each installation and components represented. We
chose to visit 2 of 15 Marine Corps installations. Within each
installation, our criteria for identifying units from which to obtain
information included command level, component type, force generation
status, core-level and core-mission task assessment, and assigned-
level and assigned-mission task assessment. Table 3 shows the units we
met with and their locations. During our unit visits, to gain a better
understanding of the readiness reporting process and how the readiness
reporting changes are being implemented, we interviewed the officials
responsible for inputting the unit's readiness information into DRRS-
Army and DRRS-Marine Corps as well as the officials who are
responsible for making the mission assessments and verifying the
information. These unit visits serve as examples, and information
about them is not meant to be generalized to all readiness reporting
processes and procedures.
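As a simple illustration of the installation ranking described above,
the Python sketch below orders hypothetical installations by the
number of units present and the components represented. The names and
counts are invented for illustration only and do not reflect the
actual installations we reviewed.

# Hypothetical installation data; names and counts are invented.
installations = [
    {"name": "Installation A", "units": 120, "components": 3},
    {"name": "Installation B", "units": 45, "components": 1},
    {"name": "Installation C", "units": 90, "components": 2},
]

# Rank by units present, breaking ties by components represented,
# so site visits cover as many units and components as possible.
ranked = sorted(installations,
                key=lambda site: (site["units"], site["components"]),
                reverse=True)
for site in ranked:
    print(site["name"], site["units"], site["components"])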
Table 3: Installations and Units Visited:
U.S. Army:
Installation: Fort Benning, Georgia; Number of units visited: 7.
1st Battalion, 15th Infantry Regiment;
3rd Heavy Brigade Combat Team, 3rd Infantry Division;
14th Combat Support Hospital;
60th Engineer Company; 63rd Engineer Company;
U.S. Army Garrison Fort Benning;
U.S. Army Marksmanship Unit.
Installation: Fort Pickett, Virginia; Number of units visited: 3.
183rd Regiment Regional Training Institute;
Maneuver Training Center;
Virginia Army National Guard G-3.
Installation: Fort Stewart, Georgia; Number of units visited: 10.
2nd Heavy Brigade Combat Team;
24th Ordnance Company;
139th Military Police Company;
226th Quartermaster Supply Company;
495th Movement Control Team;
514th Engineer Detachment;
U.S. Army Garrison Fort Stewart Directorate of Plans, Training,
Mobilization and Security;
U.S. Army Medical Department Activity Fort Stewart;
U.S. Army Reserve Element 188th Infantry Brigade;
U.S. Army Reserve Element 349th Regiment, Logistics Support Battalion.
Installation: Joint Base Lewis-McChord and Camp Murray, Washington;
Number of units visited: 8.
1st Battalion, 23rd Infantry Regiment;
4th Squadron, 6th Air Cavalry;
22nd Engineer Company;
23rd Chemical Battalion Headquarters Detachment;
56th Army Band;
702nd Brigade Support Battalion;
Joint Base Lewis-McChord Installation;
Washington National Guard 81st Heavy Brigade Combat Team.
Installation: D.C. Armory,[A] Washington, D.C.; Number of units
visited: 5.
104th Maintenance Detachment;
273rd Military Police Combat Support Team;
275th Military Police Company;
372nd Military Police Headquarters and Headquarters Detachment
Support Team;
547th Transportation Company.
Marine Corps:
Installation: Camp Pendleton, California; Number of units visited: 12.
1st Dental Battalion, 1st Marine Logistics Group;
1st Marine Expeditionary Force;
4th Light Armored Reconnaissance Battalion, 4th Marine Division
(Marine Corps Reserves);
5th Marine Regiment;
5th Marine Regiment Headquarters Company;
11th Marine Regiment, 1st Marine Division;
Headquarters Battalion, 1st Marine Division;
Headquarters, Marine Corps Installations West;
Headquarters, Marine Corps Base Camp Pendleton;
Marine Aircraft Group 39;
Marine Light Attack Helicopter Squadron 367;
Marine Wing Support Squadron 372.
Installation: Marine Corps Air Station Miramar, California; Number of
units visited: 8.
Marine Air Control Group 38;
Marine Corps Air Station Miramar;
Marine Fighter Attack Squadron 232;
Marine Heavy Helicopter Squadron 465;
Marine Medium Helicopter Squadron 163;
Marine Medium Tiltrotor Squadron 166;
Marine Wing Communications Squadron 38;
Marine Wing Support Group 37.
Source: GAO.
[A] The D.C. Armory is not a reporting installation, although we met
with D.C. Army National Guard units at that location.
[End of table]
To assess how system developments for DRRS-Army, DRRS-Marine Corps,
and DRRS-Strategic affected the Defense Readiness Reporting System
enterprise, we interviewed officials from the DRRS Implementation
Office who are responsible for the system development of the
enterprise and DRRS-Strategic. We also interviewed officials from the
Joint Staff who are responsible for assisting the Chairman in
executing his statutory readiness reporting responsibilities. Members
of the Joint Staff co-chair the DRRS governance structure at all
levels. We also interviewed Army and Marine Corps officials who are
responsible for their service-specific readiness reporting systems
that are part of the enterprise. Finally, we interviewed officials
from the Enterprise Planning and Investment Business Transformation
Agency, which is conducting the DRRS-Strategic risk assessment.
Further, we reviewed the 2010 DOD memorandum for DRRS Standards and
Technical Interface Specifications for Interoperability, the 2009 DRRS
Concept of Operations, and the DRRS Interim Implementation guidance
1.0, 2.0, 3.0, and 4.0. We also reviewed Army and Marine Corps
memoranda and plans implementing the requirements in the 2010 DOD
memorandum for DRRS Standards and Technical Interface Specifications
for Interoperability.
We conducted this performance audit from August 2010 to June 2011, in
accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe
that the evidence obtained provides a reasonable basis for our
findings and conclusions based on our audit objectives.
[End of section]
Appendix II: Comments from the Department of Defense:
Office Of The Under Secretary Of Defense:
Personnel And Readiness:
4000 Defense Pentagon:
Washington, D.C. 20301-4000
Ms. Sharon L. Pickup:
Director, Defense Capabilities and Management:
U.S. Government Accountability Office:
441 G Street N.W.
Washington, D.C. 20548:
Dear Ms. Pickup:
This is the Department of Defense (DoD) response to the GAO draft
report, GAO-11-526, Military Readiness: Army and Marine Corps
Reporting Provides Additional Data, but Actions Needed to Improve
Consistency, dated April 15, 2011 (GAO Code 351522).
Thank you for the opportunity to review this report and associated
recommendations. We disagree with the recommendation that the
Secretary of the Army should be directed to develop an alternative
means of indicating which units are in reset. The use of the "C-5"
flag is appropriate and consistent as the readiness indicator for
units in reset. Furthermore, the Army is fully aware of the readiness
needs for those units in reset, and both the Army and DoD enterprise
have the information required to understand the needs and capabilities
of those forces.
In fact, the Defense Readiness Reporting System (DRRS) enterprise
provides visibility into the capabilities of a unit at any phase of
the force rotation cycle, including reset. Even if an Army unit is
reporting a "C-5" assessment, DRRS provides an assessment of the
remaining unit capabilities through the mission essential task list
construct.
We also do not agree with the recommendation that the Army and the
Marine Corps need to provide additional internal controls. The Army is
currently updating its unit status reporting process and software
applications, and these changes will serve to strengthen compliance,
promote consistency, and ensure uniformity of the system. The Marine
Corps is also completing a plan to modify policy and implement
procedures for improving compliance with readiness reporting ratings,
timelines, and data.
Finally, we do not agree with the report's statement in the summary
that the DRRS program does not have sufficient information to achieve
interoperability among the Services and OSD. The statement does not
represent the routine and informed decisions that are made across OSD
and the Services. The September 2010 technology assessment, in
response to the NDAA FY 2010 Senate Report, found that DRRS-Strategic
has no critical technology roadblocks to systems integration. DRRS-S
currently consumes data from the Service unique systems while
continuously working to improve these transfer methods. To that end,
DRRS-Navy, DRRS-Army, and DRRS-Marine Corps will all be able to
transfer data even more efficiently and effectively within the next 18
months.
Signed by:
Samuel D. Kleinman:
Deputy Assistant Secretary of Defense Readiness:
[End of letter]
GAO Draft Report Dated April 15, 2011:
GAO-11-526 (GAO CODE 351522):
"Military Readiness: Army And Marine Corps Reporting Provides
Additional Data, But Actions Needed To Improve Consistency"
Department Of Defense Comments To The GAO Recommendations:
Recommendation 1: The GAO recommends that the Secretary of Defense
direct the Secretary of the Army to develop an alternative means of
indicating which units are in RESET without using C-5 as a means to
flag units in RESET. (See page 22/GAO Draft Report.)
DoD Response: Non-concur. Alternative means already exist to indicate
which units are in reset. The Army and DoD are currently able to see
the rotation status of Army units through the Defense Readiness
Reporting System. DRRS provides the status of existing capabilities of
a unit even if that unit is in reset.
Recommendation 2: The GAO recommends that the Secretary of Defense
direct the Secretary of the Army and the Commandant of the Marine
Corps to provide additional internal controls, which could include
clarifying policy guidance, increasing quality assurance reviews, or
putting system technical checks in place to prevent submission of data
that does not comply with service readiness reporting requirements.
(See page 23/GAO Draft Report.)
DoD Response: Non-concur. The Army is currently updating its unit
status reporting process and software applications, and these changes
will serve to strengthen compliance, promote consistency, and ensure
uniformity of the system. The Marine Corps is also completing a plan
to modify policy and implement procedures for improving compliance
with readiness reporting ratings, timelines, and data. The internal
controls are adequate.
[End of section]
Appendix III: GAO Contact and Staff Acknowledgments:
GAO Contact:
Sharon L. Pickup, (202) 512-9619 or pickups@gao.gov:
Staff Acknowledgments:
In addition to the contact named above, key contributors to this
report were Michael Ferren (Assistant Director), Jim Ashley, Randy De
Leon, Nicole Harms, Richard Powelson, Terry Richardson, Jodie Sandel,
Amie Steele, and Nicole Volchko.
[End of section]
Footnotes:
[1] H.R. Rep. No. 111-491, at 259 (2010). This report accompanied H.R.
5136.
[2] S. Rep. No. 111-201, at 119 (2010). This report accompanied S.
3454.
[3] Pub. L. No. 105-261, §373 (1998).
[4] The six geographic combatant commands--U.S. Central Command, U.S.
European Command, U.S. Northern Command, U.S. Pacific Command, U.S.
Southern Command, and U.S. Africa Command--are responsible for U.S.
military operations within their areas, and contingency planning and
commanding U.S. forces in their regions.
[5] Core missions are also referred to as primary missions and are the
wartime missions the unit was organized and designed to perform. The
Army recently replaced the phrase primary mission with "core
functions/design capabilities" which is used to indicate the full
spectrum of functions and capabilities the unit was designed to
perform.
[6] Assigned missions are also known as directed missions and
generally describe those missions assigned to units by operations
plans or operations orders. Assigned missions may or may not be the
same as the unit's core mission.
[7] Chairman of the Joint Chiefs of Staff Instruction 3401.02B, Force
Readiness Reporting (Sept. 21, 2010), states that units in C-5 status
may be capable of undertaking non-traditional, non-wartime related
missions.
[8] Army Regulation 220-1, Army Unit Status Reporting and Force
Registration-Consolidated Policies (Apr. 15, 2010).
[9] Marine Corps Order 3000.13, Marine Corps Readiness Reporting
Standard Operating Procedures (SOP) (July 30, 2010).
[10] Section 482 of Title 10 of the U.S. Code requires that the
Secretary of Defense, on a quarterly basis, submit to Congress a
report regarding military readiness.
[11] Section 117 of Title 10 of the U.S. Code requires the Chairman of
the Joint Chiefs of Staff to conduct, on a quarterly basis, a joint
readiness review to assess the capability of the armed forces to
execute their wartime missions based upon their posture at the time
the review is conducted and to submit a report containing the results
of each quarterly review to the congressional defense committees.
[12] Based on the current 12-month deployments and the Army's goal
for active units to spend twice as much time at home as deployed (24
months at home, for a 36-month cycle), units would spend a total of 6
months of their 36-month cycle in RESET, or approximately 16 percent.
Reserve component units have different goals for their time at home,
but also spend a relatively small portion of the cycle in RESET.
[13] Army Regulation 220-1, Unit Status Reporting (Mar. 16, 2006).
[14] 10 U.S.C. §482.
[15] Army Regulation 525-29, Army Force Generation (Mar. 14, 2011).
[16] The officials used form AAA-162, Unit Personnel Accountability
Report.
[17] The 95 percent confidence interval for this estimate ranges from
90 to 98 percent. The sample size and total population of units rated
as Qualified Yes are classified.
[18] DOD, Defense Readiness Reporting System Concept of Operations,
Version 3.0 (Jan. 22, 2009).
[19] GAO, Military Readiness: DOD Needs to Strengthen Management and
Oversight of the Defense Readiness Reporting System, [hyperlink,
http://www.gao.gov/products/GAO-09-518] (Washington, D.C.: Sept. 25,
2009).
[20] Office of the Under Secretary of Defense, Personnel and
Readiness, Memorandum: Defense Readiness Reporting System (DRRS)
Standards and Technical Interface Specifications for Interoperability
(Aug. 2, 2010).
[End of section]
GAO's Mission:
The Government Accountability Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the performance
and accountability of the federal government for the American people.
GAO examines the use of public funds; evaluates federal programs and
policies; and provides analyses, recommendations, and other assistance
to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its core
values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each
weekday, GAO posts newly released reports, testimony, and
correspondence on its Web site. To have GAO e-mail you a list of newly
posted products every afternoon, go to [hyperlink, http://www.gao.gov]
and select "E-mail Updates."
Order by Phone:
The price of each GAO publication reflects GAO's actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO's Web site,
[hyperlink, http://www.gao.gov/ordering.htm].
Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional
information.
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Congressional Relations:
Ralph Dawn, Managing Director, dawnr@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, D.C. 20548:
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548: