Information Technology
OMB Has Made Improvements to Its Dashboard, but Further Work Is Needed by Agencies and OMB to Ensure Data Accuracy
Gao ID: GAO-11-262 March 15, 2011
This is the accessible text file for GAO report number GAO-11-262
entitled 'Information Technology: OMB Has Made Improvements to Its
Dashboard, but Further Work Is Needed by Agencies and OMB to Ensure
Data Accuracy' which was released on March 15, 2011.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as
part of a longer term project to improve GAO products' accessibility.
Every attempt has been made to maintain the structural and data
integrity of the original printed product. Accessibility features,
such as text descriptions of tables, consecutively numbered footnotes
placed at the end of the file, and the text of agency comment letters,
are provided but may not exactly duplicate the presentation or format
of the printed version. The portable document format (PDF) file is an
exact electronic replica of the printed version. We welcome your
feedback. Please E-mail your comments regarding the contents or
accessibility features of this document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
[This report was revised on March 15, 2011, to correct Appendix IV.
This appendix initially and inadvertently contained an extra page,
which has been removed.]
United States Government Accountability Office:
GAO:
Report to Congressional Requesters:
March 2011:
Information Technology:
OMB Has Made Improvements to Its Dashboard, but Further Work Is Needed
by Agencies and OMB to Ensure Data Accuracy:
GAO-11-262:
GAO Highlights:
Highlights of GAO-11-262, a report to congressional requesters.
Why GAO Did This Study:
Each year the federal government spends billions of dollars on
information technology (IT) investments. Given the importance of
oversight, the Office of Management and Budget (OMB) established a
public Web site, referred to as the IT Dashboard, that provides
detailed information on about 800 federal IT investments, including
assessments of actual performance against cost and schedule targets
(referred to as ratings). In the second of a series of Dashboard
reviews, GAO was asked to (1) determine OMB's efforts to improve the
Dashboard and how it is using data from the Dashboard, and (2) examine
the accuracy of the Dashboard's cost and schedule performance ratings.
To do so, GAO analyzed documentation on OMB oversight efforts and
Dashboard improvement plans, compared the performance of 10 major
investments from five agencies with large IT budgets against the
ratings on the Dashboard, and interviewed OMB and agency officials.
What GAO Found:
Since GAO's first review, in July 2010, OMB has initiated several
efforts to increase the Dashboard's value as an oversight tool, and
has used the Dashboard's data to improve federal IT management. These
efforts include streamlining key OMB investment reporting tools,
eliminating manual monthly submissions, coordinating with agencies to
improve data, and improving the Dashboard's user interface. Recent
changes provide new views of historical data and rating changes. OMB
anticipates that these efforts will increase the reliability of the
data on the Dashboard. To improve IT management, OMB analysts use
Dashboard data to track investment changes and identify issues with
performance. OMB officials stated that they use these data to identify
poorly performing IT investments for review sessions by OMB and agency
leadership. OMB reported that these sessions and other management
reviews have resulted in a $3 billion reduction in life-cycle costs,
as of December 2010.
While the efforts above as well as initial actions taken to address
issues GAO identified in its prior review--such as OMB's updated
ratings calculations to factor in ongoing milestones to better reflect
current status--have contributed to data quality improvements,
performance data inaccuracies remain.
investments on the Dashboard did not always accurately reflect current
performance, which is counter to the Web site's purpose of reporting
near real-time performance. Specifically, GAO found that cost ratings
were inaccurate for six of the investments that GAO reviewed and
schedule ratings were inaccurate for nine. For example, the Dashboard
rating for a Department of Homeland Security investment reported
significant cost variances for 3 months in 2010; however, GAO's
analysis showed lesser variances from cost targets for the same
months. Conversely, a Department of Transportation investment was
reported as on schedule on the Dashboard, which does not reflect the
significant delays GAO has identified in recent work. These
inaccuracies can be attributed to weaknesses in how agencies report
data to the Dashboard, such as providing erroneous data submissions,
as well as limitations in how OMB calculates the ratings (see table).
Until the selected agencies and OMB resolve these issues, ratings will
often continue to be inaccurate and may not reflect current program
performance.
Table: Reasons for Agencies' Dashboard Rating Inaccuracies:
Department: Homeland Security;
Agency practices: Inconsistent program baselines: [Check];
Agency practices: Missing data submissions: [Check];
Agency practices: Erroneous data submissions: [Check];
Agency practices: Unreliable source data: [Empty];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Empty].
Department: Transportation;
Agency practices: Inconsistent program baselines: [Empty];
Agency practices: Missing data submissions: [Check];
Agency practices: Erroneous data submissions: [Empty];
Agency practices: Unreliable source data: [Empty];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Check].
Department: Treasury;
Agency practices: Inconsistent program baselines: [Check];
Agency practices: Missing data submissions: [Empty];
Agency practices: Erroneous data submissions: [Check];
Agency practices: Unreliable source data: [Check];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Empty].
Department: Veterans Affairs;
Agency practices: Inconsistent program baselines: [Check];
Agency practices: Missing data submissions: [Check];
Agency practices: Erroneous data submissions: [Check];
Agency practices: Unreliable source data: [Empty];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Empty].
Department: Social Security Administration;
Agency practices: Inconsistent program baselines: [Empty];
Agency practices: Missing data submissions: [Empty];
Agency practices: Erroneous data submissions: [Check];
Agency practices: Unreliable source data: [Empty];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Empty].
Source: Agency officials and GAO analysis of Dashboard data.
[End of table]
What GAO Recommends:
GAO is recommending that selected agencies take steps to improve the
accuracy and reliability of Dashboard information and OMB improve how
it rates investments relative to current performance and schedule
variance. Agencies generally concurred with the recommendations; OMB
did not concur with the first recommendation but concurred with the
second. GAO maintains that until OMB implements both, performance may
continue to be inaccurately represented on the Dashboard.
View [hyperlink, http://www.gao.gov/products/GAO-11-262] or key
components. For more information, contact David A. Powner at
(202) 512-9286 or pownerd@gao.gov.
[End of section]
Contents:
Letter:
Background:
OMB Has Multiple Efforts Under Way to Further Refine the Dashboard and
Uses the Dashboard to Improve IT Management:
Dashboard Ratings Did Not Always Reflect True Investment Cost and
Schedule Performance:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Objectives, Scope, and Methodology:
Appendix II: Selected Investment Descriptions:
Appendix III: Comments from the Department of Veterans Affairs:
Appendix IV: Comments from the Social Security Administration:
Appendix V: GAO Contact and Staff Acknowledgments:
Tables:
Table 1: Causes of Inaccurate Ratings on the Dashboard:
Table 2: Investment Management Details:
Figures:
Figure 1: Dashboard Cost and Schedule Ratings Scale:
Figure 2: Overall Performance Ratings of Major IT Investments on the
Dashboard:
Figure 3: Comparison of Selected Investments' Dashboard Cost Ratings
with Investment Cost Performance:
Figure 4: Comparison of Selected Investments' Dashboard Schedule
Ratings with Investment Schedule Performance:
Abbreviations:
C4ISR: Command, Control, Communications, Computers, Intelligence,
Surveillance & Reconnaissance:
CIO: chief information officer:
DHS: Department of Homeland Security:
DOT: Department of Transportation:
IT: information technology:
OMB: Office of Management and Budget:
SSA: Social Security Administration:
Treasury: Department of the Treasury:
USCIS: United States Citizenship and Immigration Services:
VA: Department of Veterans Affairs:
[End of section]
United States Government Accountability Office:
Washington, DC 20548:
March 15, 2011:
The Honorable Joseph I. Lieberman:
Chairman:
The Honorable Susan M. Collins:
Ranking Member:
Committee on Homeland Security and Governmental Affairs:
United States Senate:
The Honorable Thomas R. Carper:
Chairman:
Subcommittee on Federal Financial Management, Government Information,
Federal Services, and International Security:
Committee on Homeland Security and Governmental Affairs:
United States Senate:
Billions of taxpayer dollars are spent on information technology (IT)
investments each year; federal IT spending has now risen to an
estimated $79 billion for fiscal year 2011. During the past several
years, we have issued multiple reports and testimonies and made
numerous recommendations to the Office of Management and Budget (OMB)
to improve the transparency, oversight, and management of the federal
government's IT investments.[Footnote 1] As part of its response to
our prior work, OMB deployed a public Web site in June 2009, known as
the IT Dashboard, which provides detailed information on federal
agencies' major IT investments, including assessments of actual
performance against cost and schedule targets (referred to as ratings)
for approximately 800 major federal IT investments.[Footnote 2] The
Dashboard aims to improve the transparency and oversight of these
investments.
In July 2010, we completed our first review of the Dashboard and
reported that the cost and schedule ratings on OMB's Dashboard were
not always accurate because of limitations with OMB's calculations.
[Footnote 3] We recommended that OMB report to Congress on the effect
of its planned Dashboard calculation changes on the accuracy of
performance information and provide guidance to agencies that
standardizes milestone reporting.
This is the second report in our series of Dashboard reviews and
responds to your request that we (1) determine what efforts OMB has
under way to improve the Dashboard and the ways in which it is using
data from the Dashboard to improve IT management and (2) examine the
accuracy of the cost and schedule performance ratings on the Dashboard
for selected investments.
To address our first objective, we interviewed OMB officials and
analyzed supporting OMB guidance and documentation to determine the
efforts OMB has under way to improve the Dashboard and the ways in
which OMB is using the data to improve IT management.
To address our second objective, we selected five agencies--the
Departments of Homeland Security (DHS), Transportation (DOT), the
Treasury (Treasury), and Veterans Affairs (VA), as well as the Social
Security Administration (SSA)--and 10 investments to review.[Footnote
4] The five agencies account for 22 percent of the planned IT spending
for fiscal year 2011. The 10 investments selected for case study
represent about $1.27 billion in total planned spending in fiscal year
2011. We analyzed monthly cost and schedule performance reports and
program management documents for the 10 investments to assess program
performance against planned cost and schedule targets. We then
compared our analyses of investment performance against the
corresponding ratings on the Dashboard to determine if the ratings
were accurate. Additionally, we interviewed officials from OMB and the
agencies to obtain further information on their efforts to ensure the
accuracy of the data used to rate investment performance on the
Dashboard. We did not test the adequacy of the agency or contractor
cost-accounting systems. Our evaluation of these cost data was based
on the documentation the agencies provided.
We conducted this performance audit from July 2010 to March 2011 in
accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe
that the evidence obtained provides a reasonable basis for our
findings and conclusions based on our audit objectives. Further
details of our objectives, scope, and methodology are provided in
appendix I.
Background:
Each year, OMB and federal agencies work together to determine how
much the government plans to spend on IT investments and how these
funds are to be allocated. According to the President's Budget for
Fiscal Year 2011, the total planned spending on IT in fiscal year 2011
is an estimated $79.4 billion, a 1.2 percent increase from the fiscal
year 2010 budget level of $78.4 billion. OMB plays a key role in
helping federal agencies manage their investments by working with them
to better plan, justify, and determine how much they need to spend on
projects and how to manage approved projects.
To assist agencies in managing their investments, Congress enacted the
Clinger-Cohen Act of 1996, which requires OMB to establish processes
to analyze, track, and evaluate the risks and results of major capital
investments in information systems made by federal agencies and report
to Congress on the net program performance benefits achieved as a
result of these investments.[Footnote 5] Further, the act places
responsibility for managing investments with the heads of agencies and
establishes chief information officers (CIO) to advise and assist
agency heads in carrying out this responsibility. Another key law is
the E-Government Act of 2002,[Footnote 6] which requires OMB to report
annually to Congress on the status of e-government.[Footnote 7] In
these reports, referred to as Implementation of the E-Government Act
reports, OMB is to describe the administration's use of e-government
principles to improve government performance and the delivery of
information and services to the public.
To help carry out its oversight role, in 2003, OMB established the
Management Watch List, which included mission-critical projects that
needed to improve performance measures, project management, IT
security, or overall justification for inclusion in the federal
budget. Further, in August 2005, OMB established a High-Risk List,
which consisted of projects identified by federal agencies, with the
assistance of OMB, as requiring special attention from oversight
authorities and the highest levels of agency management.
Over the past several years, we have reported and testified on OMB's
initiatives to highlight troubled IT projects, justify investments,
and use project management tools.[Footnote 8] We have made multiple
recommendations to OMB and federal agencies to improve these
initiatives to further enhance the oversight and transparency of
federal projects. Among other things, we recommended that OMB develop
a central list of projects and their deficiencies and analyze that
list to develop governmentwide and agency assessments of the progress
and risks of the investments, identifying opportunities for continued
improvement.[Footnote 9] In addition, in 2006 we also recommended that
OMB develop a single aggregate list of high-risk projects and their
deficiencies and use that list to report to Congress on progress made
in correcting high-risk problems.[Footnote 10] As a result, OMB
started publicly releasing aggregate data on its Management Watch List
and disclosing the projects' deficiencies. Furthermore, OMB issued
governmentwide and agency assessments of the projects on the
Management Watch List and identified risks and opportunities for
improvement, including risk management and security.
OMB's Dashboard Publicizes Investment Details and Performance Status:
More recently, to further improve the transparency and oversight of
agencies' IT investments, and to address data quality issues, in June
2009, OMB publicly deployed a Web site, known as the IT Dashboard,
which replaced the Management Watch List and High-Risk List. It
displays federal agencies' cost, schedule, and performance data for
the approximately 800 major federal IT investments at 27 federal
agencies. According to OMB, these data are intended to provide a near
real-time perspective on the performance of these investments, as well
as a historical perspective. Further, the public display of these data
is intended to allow OMB; other oversight bodies, including Congress;
and the general public to hold the government agencies accountable for
results and progress.
The Dashboard was initially deployed in June 2009 based on each
agency's Exhibit 53 and Exhibit 300 submissions.[Footnote 11] After
the initial population of data, agency CIOs have been responsible for
updating cost, schedule, and performance fields on a monthly basis,
which is a major improvement from the quarterly reporting cycle OMB
previously used for the Management Watch List and High-Risk List.
For each major investment, the Dashboard provides performance ratings
on cost and schedule, a CIO evaluation, and an overall rating, which
is based on the cost, schedule, and CIO ratings. As of July 2010, the
cost rating was determined by a formula that calculates the amount by
which an investment's total actual costs deviate from the total
planned costs. Similarly, the schedule rating is the variance between
the investment's planned and actual progress to date. Figure 1
displays the rating scale and associated categories for cost and
schedule variations.
Figure 1: Dashboard Cost and Schedule Ratings Scale:
[Refer to PDF for image: horizontal bar graph]
Variance (percentage) from planned costs or schedule: 0-10;
Ratings: 8, 9, 10. Normal.
Variance (percentage) from planned costs or schedule: 10-30;
Ratings: 4, 5, 6, 7. Needs attention.
Variance (percentage) from planned costs or schedule: 30-50;
Ratings: 1, 2, 3. Significant concerns.
Variance (percentage) from planned costs or schedule: 50+;
Rating: 0. Significant concerns.
Source: GAO based on OMB's Dashboard.
[End of figure]
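The bands in figure 1 can be expressed as a small lookup. The sketch below is a minimal Python illustration, assuming that variance is the absolute percentage deviation of actuals from plan and that band edges fall on the lower bound; the report does not define the exact formula or how a specific 0-10 rating number is chosen within a band.

```python
def cost_variance_pct(planned, actual):
    """Percent by which total actual costs deviate from total planned
    costs (assumed formula, per the Dashboard's cost-rating description)."""
    return abs(actual - planned) / planned * 100

def rating_category(variance_pct):
    """Map a cost or schedule variance percentage to the category
    bands shown in figure 1 (band boundaries assumed to belong to
    the higher band)."""
    if variance_pct < 10:
        return "Normal"                # ratings 8-10
    if variance_pct < 30:
        return "Needs attention"       # ratings 4-7
    if variance_pct < 50:
        return "Significant concerns"  # ratings 1-3
    return "Significant concerns"      # rating 0
```

For example, an investment with planned costs of $100 million and actual costs of $125 million shows a 25 percent variance, which falls in the "Needs attention" band.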
Each major investment on the Dashboard also includes a rating
determined by the agency CIO, which is based on his or her evaluation
of the performance of each investment. The rating is expected to take
into consideration the following criteria: risk management,
requirements management, contractor oversight, historical performance,
and human capital. This rating is to be updated when new information
becomes available that would affect the assessment of a given
investment.
Last, the Dashboard calculates an overall rating for each major
investment. This overall rating is an average of the cost, schedule,
and CIO ratings, with each representing one-third of the overall
rating. However, when the CIO's rating is lower than both the cost and
schedule ratings, the CIO's rating will be the overall rating. Figure
2 shows the overall performance ratings of the 805 major investments
on the Dashboard as of March 2011.
Figure 2: Overall Performance Ratings of Major IT Investments on the
Dashboard:
[Refer to PDF for image: pie-chart]
Normal: 497 investments (62%);
Needs attention: 268 investments (33%);
Significant concerns: 40 investments (5%).
Source: OMB's Dashboard.
[End of figure]
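The overall-rating rule described above--an equal-weight average of the three ratings unless the CIO's rating is the lowest--can be sketched as follows. This is an illustrative reading of the text, not OMB's actual implementation.

```python
def overall_rating(cost, schedule, cio):
    """Overall Dashboard rating: the average of the cost, schedule,
    and CIO ratings, except that when the CIO rating is lower than
    both of the others it becomes the overall rating outright."""
    if cio < cost and cio < schedule:
        return cio
    return (cost + schedule + cio) / 3
```

Under this reading, overall_rating(9, 8, 4) returns 4 because the CIO rating is lower than both of the others, while overall_rating(6, 9, 9) returns 8.0, the plain average.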
Earned Value Management Provides Additional Insight on Program Cost
and Schedule:
To better manage IT investments, OMB issued guidance directing
agencies to develop comprehensive policies to ensure that their major
IT investments and high-risk development projects use earned value
management to manage their investments.[Footnote 12] Earned value
management is a technique that integrates the technical, cost, and
schedule parameters of a development contract and measures progress
against them. During the planning phase, a performance measurement
baseline is developed by assigning and scheduling budget resources for
defined work.[Footnote 13] As work is performed and measured against
the baseline, the corresponding budget value is "earned." Using this
earned value metric, cost and schedule variances, as well as cost and
time to complete estimates, can be determined and analyzed.
Without knowing the planned cost of completed work and work in
progress (i.e., the earned value), it is difficult to determine a
program's true status. Earned value supplies this key information,
which provides an objective view of program status and is necessary
for understanding the health of a program. As a result, earned value
management can alert program managers to potential problems sooner
than tracking expenditures alone would, thereby reducing the chance
and magnitude of cost overruns and schedule slippages. Moreover, earned
value management directly supports the institutionalization of key
processes for acquiring and developing systems and the ability to
effectively manage investments--areas that are often found to be
inadequate on the basis of our assessments of major IT investments.
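The earned value mechanics above reduce to two variance calculations. The sketch below uses the conventional earned value management definitions; the report describes the technique but does not spell out the formulas.

```python
def ev_variances(planned_value, earned_value, actual_cost):
    """Standard earned value management variances.
    planned_value: budgeted cost of work scheduled to date
    earned_value:  budgeted cost of work actually performed
    actual_cost:   actual cost of the work performed
    Negative values signal overruns or slippage."""
    cost_variance = earned_value - actual_cost        # < 0: over budget
    schedule_variance = earned_value - planned_value  # < 0: behind schedule
    return cost_variance, schedule_variance
```

For instance, ev_variances(100, 80, 95) returns (-15, -20): the program has spent $15 more than the value of the work performed and has completed $20 less work than planned.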
OMB Has Taken Steps to Address Prior GAO Recommendations on Improving
Dashboard Accuracy:
In July 2010, we reported that the cost and schedule ratings on OMB's
Dashboard were not always accurate for selected agencies.[Footnote 14]
Specifically, we found that several selected investments had notable
discrepancies in their cost or schedule ratings, the cost and schedule
ratings did not take into consideration current performance, and the
number of milestones (activities) reported by agencies varied widely.
[Footnote 15] We made a number of recommendations to OMB to better
ensure that the Dashboard provides meaningful ratings and accurate
investment data. In particular, we recommended that OMB report on its
planned Dashboard changes to improve the accuracy of performance
information and provide guidance to agencies that standardizes
activity reporting. OMB agreed with the two recommendations and
reported it had initiated work to address them.
OMB Has Multiple Efforts Under Way to Further Refine the Dashboard and
Uses the Dashboard to Improve IT Management:
Since our last report, OMB has initiated multiple efforts to increase
the Dashboard's value as a management and oversight tool, and has used
data in the Dashboard to improve the management of federal IT
investments. Specifically, OMB is focusing its efforts in four main
areas: streamlining key OMB investment reporting tools, eliminating
manual monthly submissions, coordinating with agencies to improve
data, and improving the user interface.
* OMB's plan to reform federal IT management commits OMB to
streamlining two of the Dashboard's sources of information--
specifically, the OMB Exhibits 53 and 300.[Footnote 16] OMB has
committed, by May 2011, to reconstruct the exhibits around distinct
data elements that drive value for agencies and provide the
information necessary for meaningful oversight. OMB anticipates that
these changes will also alleviate the reporting burden and increase
data accuracy, and that the revised exhibits will serve as its
authoritative management tools.
* According to OMB officials, the Dashboard no longer accepts manual
data submissions. Instead, the Dashboard allows only system-to-system
submissions. Officials explained that this update allows the Dashboard
to reject incomplete submissions and those that do not meet the
Dashboard's data validation rules. By eliminating direct manual
submissions, this effort is expected to improve the reliability of the
data shown on the Dashboard.
* Further, OMB officials stated that they work to improve the
Dashboard through routine interactions with agencies and IT portfolio
management tool vendors, training courses, working groups, and data
quality letters to agencies. Specifically, OMB officials stated that
they held 58 TechStat reviews (discussed later in this report), hosted
four online training sessions (recordings of which OMB officials
stated are also available online), collaborated with several Dashboard
working groups, and sent letters to agency CIOs identifying specific
data quality issues on the Dashboard that their agencies could
improve. Further, OMB officials explained that in December 2010, OMB
analysts informed agency CIOs about specific data quality issues and
provided analyses of agency data, a comparison of agency Dashboard
performance with that of the rest of the government, and expected
remedial actions. OMB anticipates these efforts will increase the
Dashboard's data reliability by ensuring that the agencies are aware
of and are working to address issues.
* Finally, OMB continues to improve the Dashboard's user interface.
For instance, in November 2010, OMB updated the Dashboard to provide
new views of historical data and rating changes and provide new
functionality allowing agencies to make corrections to activities and
performance metrics (conforming to rebaselining guidance[Footnote
17]). Officials also described a planned future update, which is
intended to contain updated budget data, display corrections and
changes made to activities, and reflect increased validation of agency-
submitted data. OMB anticipates these efforts will increase the
transparency and reliability of investment information on the
Dashboard by providing agencies and users additional ways to view
investment information and by improving validation of submitted data.
Additionally, OMB uses the Dashboard to improve the management of IT
investments. Specifically, OMB analysts are using the Dashboard's
investment trend data to track changes and identify issues with
investments' performance in a timely manner. OMB analysts also use the
Dashboard to identify data quality issues and drive improvements to
the data. The Federal CIO stated that the Dashboard has greatly
improved oversight capabilities compared with those of previously used
mechanisms, such as the annual capital asset plan and business case
(Exhibit 300) process. Additionally, according to OMB officials, the
Dashboard is one of the key sources of information that OMB analysts
use to identify IT investments that are experiencing performance
problems and to select them for a TechStat session--a review of
selected IT investments between OMB and agency leadership that is led
by the Federal CIO. OMB officials stated that, as of December 2010, 58
TechStat sessions had been held with federal agencies. According to
OMB, these sessions have enabled the government to improve or
terminate IT investments that are experiencing performance problems.
Information from the TechStat sessions and the Dashboard was used by
OMB to identify, halt, and review all federal financial IT systems
modernization projects. Furthermore, according to OMB, these sessions
and other OMB management reviews have resulted in a $3 billion
reduction in life-cycle costs, as of December 2010. OMB officials
stated that, as of that date, 11 investments had been reduced in
scope and 4 had been terminated as a result of these sessions. For
example,
* The TechStat on the Department of Housing and Urban Development's
Transformation Initiative investment found that the department lacked
the skills and resources necessary and would not be positioned to
succeed. As a result, the department agreed to reduce the number of
projects from 29 to 7 and to limit fiscal year 2010 funds for these 7
priority projects to $85.7 million (from the original $138 million).
* The TechStat on the National Archives and Records Administration's
Electronic Records Archives investment resulted in six corrective
actions, including halting fiscal year 2012 development funding
pending the completion of a strategic plan.
According to OMB officials, OMB and agency CIOs also used the
Dashboard data and TechStat sessions, in addition to other forms of
research (such as reviewing program documentation, news articles, and
inspector general reports), to identify 26 high-risk IT projects and,
in turn, coordinate with agencies to develop corrective actions for
these projects at TechStat sessions. For example, the Department of
the Interior is to establish incremental deliverables for its Incident
Management Analysis and Reporting System, which will accelerate
delivery of services that will help 6,000 law enforcement officers
protect the nation's natural resources and cultural monuments.
Dashboard Ratings Did Not Always Reflect True Investment Cost and
Schedule Performance:
While the efforts previously described are important steps toward
improving the quality of the information on the Dashboard, cost and
schedule performance data inaccuracies remain. The Dashboard's cost
and schedule ratings were not always reflective of the true
performance for selected investments from the five agencies in our
review. More specifically, while the Dashboard is intended to present
near real-time performance, the ratings did not always reflect the
current performance of these investments. Dashboard rating
inaccuracies were the result of weaknesses in agency practices, such
as the Dashboard not reflecting baseline changes and the reporting of
erroneous data, as well as limitations of the Dashboard's
calculations. Until the agencies submit complete, reliable, and timely
data to the Dashboard and OMB revises its Dashboard calculations,
performance ratings will continue to be inaccurate and may not reflect
current program performance.
Cost and Schedule Performance Was Not Always Accurately Depicted in
Dashboard Ratings:
Most of the Dashboard's cost ratings of the nine selected investments
did not match the results of our analyses over a 3-month
period.[Footnote 18] Specifically, four investments had inaccurate
ratings for 2 or more months, and two were inaccurate for 1 month,
while three investments were accurately depicted for all 3 months. For
example, Intelligent Disability's cost performance was rated "red" on
the Dashboard for July 2010 and "green" for August 2010, whereas our
analysis showed its current cost performance was "yellow" for those
months. Further, Medical Legacy's cost ratings were "red" on the
Dashboard for June through August 2010, while the department's
internal rating showed that the cost performance for 105 of the 107
projects that constitute the investment was "green" in August 2010;
similar ratings were also seen for June and July 2010. Overall, the
Dashboard's cost ratings generally showed poorer performance than our
assessments. Figure 3 shows the comparison of the selected
investments' Dashboard cost ratings with GAO's ratings based on
analysis of agency data for the months of June 2010 through August
2010.
Figure 3: Comparison of Selected Investments' Dashboard Cost Ratings
with Investment Cost Performance:
[Refer to PDF for image: illustrated table]
Agency: DHS;
Investment: C4ISR;
Dashboard: June 2010: Needs attention;
GAO: June 2010: Needs attention;
Dashboard: July 2010: Needs attention;
GAO: July 2010: Needs attention;
Dashboard: August 2010: Needs attention;
GAO: August 2010: Needs attention.
Agency: DHS;
Investment: USCIS-Transformation;
Dashboard: June 2010: Significant concerns;
GAO: June 2010: Needs attention;
Dashboard: July 2010: Significant concerns;
GAO: July 2010: Needs attention;
Dashboard: August 2010: Significant concerns;
GAO: August 2010: Needs attention.
Agency: DOT;
Investment: Automatic Dependent Surveillance-Broadcast;
Dashboard: June 2010: Significant concerns;
GAO: June 2010: Needs attention;
Dashboard: July 2010: Needs attention;
GAO: July 2010: Needs attention;
Dashboard: August 2010: Needs attention;
GAO: August 2010: Needs attention.
Agency: DOT;
Investment: En Route Automation Modernization;
Dashboard: June 2010: Normal;
GAO: June 2010: Normal;
Dashboard: July 2010: Normal;
GAO: July 2010: Normal;
Dashboard: August 2010: Normal;
GAO: August 2010: Normal.
Agency: SSA;
Investment: Disability Case Processing System;
Dashboard: June 2010: Needs attention;
GAO: June 2010: Needs attention;
Dashboard: July 2010: Needs attention;
GAO: July 2010: Needs attention;
Dashboard: August 2010: Needs attention;
GAO: August 2010: Needs attention.
Agency: SSA;
Investment: Intelligent Disability;
Dashboard: June 2010: Significant concerns;
GAO: June 2010: Significant concerns;
Dashboard: July 2010: Significant concerns;
GAO: July 2010: Needs attention;
Dashboard: August 2010: Normal;
GAO: August 2010: Needs attention.
Agency: Treasury;
Investment: Modernized e-File;
Dashboard: June 2010: Significant concerns;
GAO: June 2010: Needs attention;
Dashboard: July 2010: Needs attention;
GAO: July 2010: Needs attention;
Dashboard: August 2010: Needs attention;
GAO: August 2010: Needs attention.
Agency: Treasury;
Investment: Payment Application Modernization;
Dashboard: June 2010: Needs attention;
GAO: June 2010: Not applicable;
Dashboard: July 2010: Needs attention;
GAO: July 2010: Not applicable;
Dashboard: August 2010: Needs attention;
GAO: August 2010: Not applicable.
Agency: VA;
Investment: HealtheVet Core;
Dashboard: June 2010: Significant concerns;
GAO: June 2010: Normal;
Dashboard: July 2010: Significant concerns;
GAO: July 2010: Normal;
Dashboard: August 2010: Significant concerns;
GAO: August 2010: Normal.
Agency: VA;
Investment: Medical Legacy;
Dashboard: June 2010: Significant concerns;
GAO: June 2010: Normal;
Dashboard: July 2010: Significant concerns;
GAO: July 2010: Normal;
Dashboard: August 2010: Significant concerns;
GAO: August 2010: Normal.
Sources: OMB's Dashboard, agency data, and GAO analysis of agency data.
Note: For the Payment Application Modernization investment, we
determined that the underlying cost and schedule performance data were
unreliable and thus did not evaluate this investment.
[End of figure]
Regarding schedule, most of the Dashboard's ratings of the nine
selected investments did not match the results of our analyses over a
3-month period. Specifically, seven investments had inaccurate ratings
for 2 or more months, and two were inaccurate for 1 month. For
example, Automatic Dependent Surveillance-Broadcast's schedule
performance was rated "green" on the Dashboard in July 2010, but our
analysis showed its current performance was "yellow" that month.
Additionally, the "green" schedule ratings for En Route Automation
Modernization did not represent how this program is actually
performing. Specifically, we recently reported that the program is
experiencing significant schedule delays,[Footnote 19] and the CIO
evaluation of the program on the Dashboard has indicated schedule
delays since February 2010. As with the cost ratings, the Dashboard's
schedule ratings generally showed poorer performance than our
assessments. Figure 4 shows the comparison of the selected
investments' Dashboard schedule ratings with GAO's ratings based on
analysis of agency data for the months of June 2010 through August
2010.
Figure 4: Comparison of Selected Investments' Dashboard Schedule
Ratings with Investment Schedule Performance:
[Refer to PDF for image: illustrated table]
Agency: DHS;
Investment: C4ISR;
Dashboard: June 2010: Needs attention;
GAO: June 2010: Normal;
Dashboard: July 2010: Needs attention;
GAO: July 2010: Normal;
Dashboard: August 2010: Significant concerns;
GAO: August 2010: Needs attention.
Agency: DHS;
Investment: USCIS-Transformation;
Dashboard: June 2010: Normal;
GAO: June 2010: Needs attention;
Dashboard: July 2010: Normal;
GAO: July 2010: Needs attention;
Dashboard: August 2010: Normal;
GAO: August 2010: Needs attention.
Agency: DOT;
Investment: Automatic Dependent Surveillance-Broadcast;
Dashboard: June 2010: Needs attention;
GAO: June 2010: Needs attention;
Dashboard: July 2010: Normal;
GAO: July 2010: Needs attention;
Dashboard: August 2010: Needs attention;
GAO: August 2010: Needs attention.
Agency: DOT;
Investment: En Route Automation Modernization;
Dashboard: June 2010: Normal;
GAO: June 2010: Significant concerns;
Dashboard: July 2010: Normal;
GAO: July 2010: Significant concerns;
Dashboard: August 2010: Normal;
GAO: August 2010: Significant concerns.
Agency: SSA;
Investment: Disability Case Processing System;
Dashboard: June 2010: Needs attention;
GAO: June 2010: Normal;
Dashboard: July 2010: Needs attention;
GAO: July 2010: Normal;
Dashboard: August 2010: Needs attention;
GAO: August 2010: Normal.
Agency: SSA;
Investment: Intelligent Disability;
Dashboard: June 2010: Normal;
GAO: June 2010: Needs attention;
Dashboard: July 2010: Normal;
GAO: July 2010: Normal;
Dashboard: August 2010: Normal;
GAO: August 2010: Normal.
Agency: Treasury;
Investment: Modernized e-File;
Dashboard: June 2010: Significant concerns;
GAO: June 2010: Needs attention;
Dashboard: July 2010: Needs attention;
GAO: July 2010: Needs attention;
Dashboard: August 2010: Needs attention;
GAO: August 2010: Needs attention.
Agency: Treasury;
Investment: Payment Application Modernization;
Dashboard: June 2010: Needs attention;
GAO: June 2010: Normal;
Dashboard: July 2010: Normal;
GAO: July 2010: Normal;
Dashboard: August 2010: Normal;
GAO: August 2010: Needs attention.
Agency: VA;
Investment: HealtheVet Core;
Dashboard: June 2010: Needs attention;
GAO: June 2010: Normal;
Dashboard: July 2010: Needs attention;
GAO: July 2010: Normal;
Dashboard: August 2010: Needs attention;
GAO: August 2010: Normal.
Agency: VA;
Investment: Medical Legacy;
Dashboard: June 2010: Needs attention;
GAO: June 2010: Normal;
Dashboard: July 2010: Needs attention;
GAO: July 2010: Normal;
Dashboard: August 2010: Needs attention;
GAO: August 2010: Normal.
Sources: OMB's Dashboard, agency data, and GAO analysis of agency data.
Note: For the Payment Application Modernization investment, we
determined that the underlying cost and schedule performance data were
unreliable and thus did not evaluate this investment.
[End of figure]
Dashboard Rating Inaccuracies Are a Result of Weaknesses in Agencies'
Practices and Limitations with OMB's Calculations:
OMB guidance, as of June 2010, states that agencies are responsible
for maintaining consistency between the data in their internal systems
and the data on the Dashboard.[Footnote 20] Furthermore, the guidance
states that agency CIOs should update their evaluation on the
Dashboard as soon as new information becomes available that affects
the assessment of a given investment. According to our assessment of
the nine selected investments, agencies did not always follow this
guidance. In particular, there were four primary weaknesses in agency
practices that resulted in inaccurate cost and schedule ratings on the
Dashboard: the investment baseline on the Dashboard was not reflective
of the investment's actual baseline, agencies did not report data to
the Dashboard, agencies reported erroneous data, and unreliable earned
value data were reported to the Dashboard. In addition, two
limitations of OMB's Dashboard calculations contributed to ratings
inaccuracies: a lack of emphasis on current performance and an
understatement of schedule variance. Table 1 shows the causes of
inaccurate ratings for the selected investments.
Table 1: Causes of Inaccurate Ratings on the Dashboard:
Agency: DHS;
Investment: C4ISR;
Agency practices: Inconsistent program baseline: [Empty];
Agency practices: Missing data submissions: [Check];
Agency practices: Erroneous data submissions: [Check];
Agency practices: Unreliable source data: [Empty];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Empty].
Agency: DHS;
Investment: USCIS-Transformation;
Agency practices: Inconsistent program baseline: [Check];
Agency practices: Missing data submissions: [Empty];
Agency practices: Erroneous data submissions: [Check];
Agency practices: Unreliable source data: [Empty];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Empty].
Agency: DOT;
Investment: Automatic Dependent Surveillance-Broadcast;
Agency practices: Inconsistent program baseline: [Empty];
Agency practices: Missing data submissions: [Check];
Agency practices: Erroneous data submissions: [Empty];
Agency practices: Unreliable source data: [Empty];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Empty].
Agency: DOT;
Investment: En Route Automation Modernization;
Agency practices: Inconsistent program baseline: [Empty];
Agency practices: Missing data submissions: [Empty];
Agency practices: Erroneous data submissions: [Empty];
Agency practices: Unreliable source data: [Empty];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Check].
Agency: SSA;
Investment: Disability Case Processing System;
Agency practices: Inconsistent program baseline: [Empty];
Agency practices: Missing data submissions: [Empty];
Agency practices: Erroneous data submissions: [Check];
Agency practices: Unreliable source data: [Empty];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Empty].
Agency: SSA;
Investment: Intelligent Disability;
Agency practices: Inconsistent program baseline: [Empty];
Agency practices: Missing data submissions: [Empty];
Agency practices: Erroneous data submissions: [Check];
Agency practices: Unreliable source data: [Empty];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Empty].
Agency: Treasury;
Investment: Payment Application Modernization;
Agency practices: Inconsistent program baseline: [Empty];
Agency practices: Missing data submissions: [Empty];
Agency practices: Erroneous data submissions: [Empty];
Agency practices: Unreliable source data: [Check];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Empty].
Agency: Treasury;
Investment: Modernized e-File;
Agency practices: Inconsistent program baseline: [Check];
Agency practices: Missing data submissions: [Empty];
Agency practices: Erroneous data submissions: [Check];
Agency practices: Unreliable source data: [Empty];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Empty].
Agency: VA;
Investment: HealtheVet Core;
Agency practices: Inconsistent program baseline: [Check];
Agency practices: Missing data submissions: [Empty];
Agency practices: Erroneous data submissions: [Check];
Agency practices: Unreliable source data: [Empty];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Empty].
Agency: VA;
Investment: Medical Legacy;
Agency practices: Inconsistent program baseline: [Empty];
Agency practices: Missing data submissions: [Check];
Agency practices: Erroneous data submissions: [Check];
Agency practices: Unreliable source data: [Empty];
Dashboard calculations: Current performance not emphasized: [Check];
Dashboard calculations: Schedule variance understated: [Empty].
Agency: Total;
Agency practices: Inconsistent program baseline: 3;
Agency practices: Missing data submissions: 3;
Agency practices: Erroneous data submissions: 7;
Agency practices: Unreliable source data: 1;
Dashboard calculations: Current performance not emphasized: 10;
Dashboard calculations: Schedule variance understated: 1.
Source: Agency officials and GAO analysis of Dashboard data.
[End of table]
* Inconsistent program baseline: Three of the selected investments
reported baselines on the Dashboard that did not match the actual
baselines tracked by the agencies. Agency officials responsible for
each of these investments acknowledged this issue. For example,
according to Modernized e-File officials, the investment was in the
process of a rebaseline in June 2010; thus, officials were unable to
update the baseline on the Dashboard until July 2010. For another
investment--HealtheVet Core--officials stated that it was stopped in
August, and thus the HealtheVet Core baseline on the Dashboard is
incorrect. As such, the CIO investment evaluation should have been
updated to reflect that the investment was stopped. In June 2010, OMB
issued new guidance on rebaselining, which stated that agencies should
update investment baselines on the Dashboard within 30 days of
internal approval of a baseline change and that this update will be
considered notification to OMB.[Footnote 21] However, agencies still
must go through their internal processes to approve a new baseline,
and during this process the baseline on the Dashboard will be
inaccurate. As such, investment CIO ratings should disclose that
performance data on the Dashboard are unreliable because of baseline
changes. However, the CIO evaluation ratings for these investments did
not include such information. Without proper disclosure of pending
baseline changes and resulting data reliability weaknesses, OMB and
other external oversight groups will not have the appropriate
information to make informed decisions about these investments.
* Missing data submissions: Three investments did not upload complete
and timely data submissions to the Dashboard. For example, DHS
officials did not submit data to the Dashboard for the C4ISR
investment from June through August 2010. According to DHS officials,
C4ISR investment officials did not provide data for DHS to upload for
these months. Further compounding the performance rating issues of
this investment is that in March 2010, inaccurate data were submitted
for nine of its activities; these data were not corrected until
September 2010. Until officials submit complete, accurate, and timely
data to the Dashboard, performance ratings may continue to be
inaccurate.
* Erroneous data submissions: Seven investments reported erroneous
data to the Dashboard. For example, SSA submitted start dates for
Intelligent Disability and Disability Case Processing System
activities that had not actually started yet. SSA officials stated
that, because of SSA's internal processes, their start dates always
correspond to the beginning of the fiscal year. In addition, according
to a Treasury official, Internal Revenue Service officials for the
Modernized e-File investment provided inaccurate data for the
investment's "actual percent complete" fields for some activities.
Until officials submit accurate data to the Dashboard, performance
ratings may continue to be inaccurate.
* Unreliable source data: Treasury's Payment Application Modernization
investment used unreliable earned value data as the sole source of
data on the Dashboard. This raises questions about the accuracy of
the investment's performance ratings reported on the Dashboard.
Investment officials stated that they have taken steps to address
weaknesses with the earned value management system and are currently
evaluating other adjustments to investment management processes.
However, without proper disclosure about data reliability in the CIO
assessment, OMB and other external oversight groups will not have the
appropriate information to make informed decisions about this
investment.
Additionally, two limitations in the Dashboard's method for
calculating ratings contributed to inaccuracies:
* Current performance calculation: The Dashboard is intended to
represent near real-time performance information on all major IT
investments, as previously discussed. To OMB's credit, in July 2010,
it updated the Dashboard's cost and schedule calculations to include
both ongoing and completed activities in order to accomplish this.
However, the performance of ongoing activities is combined with the
performance of completed activities, which can mask recent
performance. As such, the cost and schedule performance ratings on the
Dashboard may not always reflect current performance. Until OMB
updates the Dashboard's cost and schedule calculations to focus on
current performance, the performance ratings may not reflect
performance problems that the investments are presently facing, and
OMB and agencies are thus missing an opportunity to identify solutions
to such problems.
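The masking effect can be sketched with a simple hypothetical calculation; the activities and variance figures below are invented for illustration and do not represent the Dashboard's actual formula.

```python
# Hypothetical sketch: averaging cost variances across completed and ongoing
# activities can hide a recent overrun. All figures are invented.

def pct_variance(planned, actual):
    """Cost variance as a percentage of planned cost (positive = overrun)."""
    return (actual - planned) / planned * 100

# (planned, actual) cost pairs: four activities finished on budget,
# while the one ongoing activity is running 40 percent over.
completed = [(100, 100), (200, 200), (150, 150), (250, 250)]
ongoing = [(100, 140)]

all_activities = completed + ongoing
cumulative = sum(pct_variance(p, a) for p, a in all_activities) / len(all_activities)
current = sum(pct_variance(p, a) for p, a in ongoing) / len(ongoing)

print(f"cumulative variance: {cumulative:.0f}%")   # 8% -- looks healthy
print(f"ongoing-only variance: {current:.0f}%")    # 40% -- the real picture
```

In this sketch, a single ongoing activity running 40 percent over cost is diluted to an 8 percent cumulative variance by four long-completed, on-budget activities, illustrating how a combined calculation can obscure a current problem.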
* Schedule variance calculation: Another contributing factor to
certain schedule inaccuracies is that OMB's schedule calculation for
in-progress activities understates the schedule variance for
activities that are overdue. Specifically, OMB's schedule calculation
does not recognize the full variance of an overdue activity until it
has actually completed. For example, as of September 13, 2010, the
Dashboard reported a 21-day schedule variance for an En Route
Automation Modernization activity that was actually 256 days overdue.
Until OMB updates its in-progress schedule calculation to be more
reflective of the actual schedule variance of ongoing activities,
schedule ratings for these activities may be understated.
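The En Route Automation Modernization example above can be reproduced with invented dates; the mechanism shown here (crediting a stale forecast finish date rather than the elapsed delay) is one plausible way such an understatement arises, not OMB's documented formula.

```python
from datetime import date

# Hypothetical sketch of how an in-progress schedule calculation can
# understate slippage. Dates are invented to mirror the report's example;
# the forecast-based rule is an assumption, not OMB's documented formula.

planned_end = date(2009, 12, 31)   # planned completion of the activity
forecast_end = date(2010, 1, 21)   # stale forecast completion date
today = date(2010, 9, 13)          # activity still not complete

# A forecast-based calculation credits only the forecast slippage...
forecast_variance = (forecast_end - planned_end).days   # 21 days

# ...while the activity is actually this far past its planned finish.
elapsed_variance = (today - planned_end).days           # 256 days
```

Under this assumed rule, the reported variance stays at 21 days until the activity actually finishes, at which point the full 256-day slippage would finally appear.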
Conclusions:
The Dashboard has enhanced OMB's and agency CIOs' oversight of federal
IT investments. Among other things, performance data from the
Dashboard are being used to identify poorly performing investments for
executive leadership review sessions. Since establishing the
Dashboard, OMB has continuously refined it, with multiple improvement
efforts planned or under way to enhance data quality and Dashboard
usability.
However, the quality of the agency data reported to the Dashboard
continues to be a challenge. Specifically, the cost and schedule
ratings on the Dashboard were not always accurate in depicting current
program performance for most of the selected investments, which is
counter to OMB's goal to report near real-time performance. The
Dashboard rating inaccuracies were due, in part, to weaknesses in
agencies' practices and limitations in OMB's calculations. More
specifically, the agency practices--including the inconsistency
between Dashboard and program baselines, reporting of erroneous data,
and unreliable source data--and OMB's formulas to track current
performance have collectively impaired data quality. Until agencies
provide more reliable data and OMB improves the calculations of the
ratings on the Dashboard, the accuracy of the ratings will continue to
be in question and the ratings may not reflect current program
performance.
Recommendations for Executive Action:
To better ensure that the Dashboard provides accurate cost and
schedule performance ratings, we are making 11 recommendations to
the heads of each of the five selected agencies. Specifically, we are
recommending that:
* The Secretary of the Department of Homeland Security direct the CIO
to:
- ensure that investment data submissions include complete and
accurate investment information for all required fields;
- comply with OMB's guidance on updating the CIO rating as soon as new
information becomes available that affects the assessment of a given
investment, including when an investment is in the process of a
rebaseline; and:
- work with C4ISR officials to comply with OMB's guidance on updating
investment cost and schedule data on the Dashboard at least monthly.
* The Secretary of the Department of Transportation direct the CIO to
work with Automatic Dependent Surveillance-Broadcast officials to
comply with OMB's guidance on updating investment cost and schedule
data on the Dashboard at least monthly.
* The Secretary of the Department of the Treasury direct the CIO to:
- comply with OMB's guidance on updating the CIO rating as soon as new
information becomes available that affects the assessment of a given
investment, including when an investment is in the process of a
rebaseline;
- work with Modernized e-File officials to report accurate actual
percent complete data for each of the investment's activities; and:
- work with Payment Application Modernization officials to disclose
the extent of this investment's data reliability issues in the CIO
rating assessment on the Dashboard.
* The Secretary of the Department of Veterans Affairs direct the CIO
to:
- comply with OMB's guidance on updating the CIO rating as soon as new
information becomes available that affects the assessment of a given
investment, including when an investment is in the process of a
rebaseline;
- work with Medical Legacy officials to comply with OMB's guidance on
updating investment cost and schedule data on the Dashboard at least
monthly; and:
- ensure Medical Legacy investment data submitted to the Dashboard are
consistent with the investment's internal performance information.
* The Commissioner of the Social Security Administration direct the
CIO to ensure that data submissions to the Dashboard include accurate
investment information for all required fields.
In addition, to better ensure that the Dashboard provides meaningful
ratings and reliable investment data, we are recommending that the
Director of OMB direct the Federal CIO to take the following two
actions:
* develop cost and schedule rating calculations that better reflect
current investment performance and:
* update the Dashboard's schedule calculation for in-progress
activities to more accurately represent the variance of ongoing,
overdue activities.
Agency Comments and Our Evaluation:
We provided a draft of our report to the five agencies in our review
and to OMB. In commenting on the draft, four agencies generally
concurred with our recommendations. One agency, the Department of
Transportation, agreed to consider our recommendation. OMB agreed with
one of our recommendations and disagreed with the other. In addition,
OMB raised concerns about the methodology used in our report. Agencies
also provided technical comments, which we incorporated as
appropriate. Each agency's comments are discussed in more detail below.
* In e-mail comments on a draft of the report, DHS's Departmental
Audit Liaison stated that the department concurred with our
recommendations.
* In e-mail comments, DOT's Director of Audit Relations stated that
DOT would consider our recommendation; however, he also stated that
the department disagreed with the way its investments were portrayed
in the draft. Specifically, department officials stated that our
assessment was not reasonable because our methodology only
incorporated the most recent 6 months of performance rather than using
cumulative investment performance. As discussed in this report,
combining the performance of ongoing and completed activities can mask
recent performance. As such, we maintain that our methodology is a
reasonable means of deriving near real-time performance, which the
Dashboard is intended to represent.
* In oral comments, Treasury's Chief Architect stated that the
department generally concurred with our recommendations and added that
the department would work to update its Dashboard ratings for the two
selected investments.
* In written comments, VA's Chief of Staff stated that the department
generally concurred with our recommendations and agreed with our
conclusions. Further, he outlined the department's planned process
improvements to address the weaknesses identified in this report. VA's
comments are reprinted in appendix III.
* In written comments, SSA's Deputy Chief of Staff stated that the
Administration agreed with our recommendation and had taken corrective
actions intended to prevent future data quality errors. SSA's comments
are reprinted in appendix IV.
Officials from OMB's Office of E-Government & Information Technology
provided the following oral comments on the draft:
* OMB officials agreed with our recommendation to update the
Dashboard's schedule calculation for in-progress activities to more
accurately represent the variance of ongoing, overdue activities.
These officials stated that the agency has long-term plans to update
the Dashboard's calculations, which they believe will provide a
solution to the concern identified in this report.
* OMB officials disagreed with our recommendation to develop cost and
schedule rating calculations that better reflect current investment
performance. According to OMB, real-time performance is always
reflected in the ratings since current investment performance data are
uploaded to the Dashboard on a monthly basis.
Regarding OMB's comments, our point is not that performance data on
the Dashboard are infrequently updated, but that the use of historical
data going back to an investment's inception can mask more recent
performance. For this reason, current investment performance may not
always be as apparent as it should be, as this report has shown. Until
the agency places less emphasis on the historical data factored into
the Dashboard's calculations, it will be passing up an opportunity to
more efficiently and effectively identify and oversee investments that
either currently are or soon will be experiencing problems.
* OMB officials also described the agency's plans for enhancing
Dashboard data quality and performance calculations. According to OMB,
plans were developed in February 2011 with stakeholders from other
agencies to standardize the reporting structure for investment
activities. Further, OMB officials said that their plans also call for
the Dashboard's performance calculations to be updated to more
accurately reflect activities that are delayed. In doing so, OMB
stated that agencies will be expected to report new data elements
associated with investment activities. Additionally, OMB officials
noted that new agency requirements associated with these changes will
be included in key OMB guidance (Circular A-11) no later than
September 2011.
OMB officials also raised two concerns regarding our methodology.
Specifically,
* OMB stated that our reliance on earned value data as the primary
source for determining investment performance was questionable. These
officials stated that, on the basis of their experience collecting
earned value data, the availability and quality of these data vary
significantly across agencies. As such, according to these officials,
OMB developed its Dashboard cost and schedule calculations to avoid
relying on earned value data.
We acknowledge that the quality of earned value data can vary. As
such, we took steps to ensure that the data we used were reliable
enough to evaluate the ratings on the Dashboard, and discounted the
earned value data of one of the selected investments after determining
its data were insufficient for our needs. While we are not critical of
OMB's decision to develop its own method for calculating performance
ratings, we maintain that our use of earned value data is sound.
Furthermore, earned value data were not the only source for our
analysis; we also based our findings on other program management
documentation, such as inspector general reports and internal
performance management system performance ratings, as discussed in
appendix I.
* OMB also noted that, because we used earned value data to determine
investment performance, our ratings were not comparable to the ratings
on the Dashboard. Specifically, OMB officials said that the Dashboard
requires reporting of all activities under an investment, including
government resources or operations and maintenance activities. OMB
further said that this is more comprehensive than earned value data,
which only account for contractor-led development activities.
We acknowledge and support the Dashboard's requirement for a
comprehensive accounting of investment performance. Further, we agree
that earned value data generally only cover development work
associated with the investments (thus excluding other types of work,
such as planning and operations and maintenance). For this reason, as
part of our methodology, we specifically selected investments for
which the majority of the work being performed was development work.
We did this because earned value management is a proven technique for
providing objective quantitative data on program performance, and
alternative approaches do not always provide a comparable substitute
for such data. Additionally, as discussed above, we did not base our
analysis solely upon earned value data, but evaluated other available
program performance documentation to ensure that we captured
performance for the entire investment. As such, we maintain that the
use of earned value data (among other sources) and the comparison of
selected investments' Dashboard ratings with our analyses resulted in
a fair assessment.
We are sending copies of this report to interested congressional
committees; the Secretaries of the Departments of Homeland Security,
Transportation, the Treasury, and Veterans Affairs, as well as the
Commissioner of the Social Security Administration; and other
interested parties. In addition, the report will be available at no
charge on GAO's Web site at [hyperlink, http://www.gao.gov].
If you or your staff have any questions on the matters discussed in
this report, please contact me at (202) 512-9286 or pownerd@gao.gov.
Contact points for our Offices of Congressional Relations and Public
Affairs may be found on the last page of this report. GAO staff who
made major contributions to this report are listed in appendix V.
Signed by:
David A. Powner:
Director, Information Technology Management Issues:
[End of section]
Appendix I: Objectives, Scope, and Methodology:
Our objectives were to (1) determine what efforts the Office of
Management and Budget (OMB) has under way to improve the Dashboard and
the ways in which it is using data from the Dashboard to improve
information technology (IT) management and (2) examine the accuracy of
the cost and schedule performance ratings on OMB's Dashboard.
To address the first objective, we examined related OMB guidance and
documentation to determine the ongoing and planned improvements OMB
has made to the Dashboard and discussed these improvements with OMB
officials. Additionally, we evaluated OMB documentation of current and
planned efforts to oversee and improve the management of IT
investments and the Dashboard, such as memos detailing the results of
investment management review sessions, and interviewed OMB officials
regarding these efforts.
To address the second objective, we selected 5 agencies and 10
investments to review. To select these agencies and investments, we
first identified the 12 agencies with the largest IT budgets as
reported in OMB's fiscal year 2011 Exhibit 53. This list of agencies
was narrowed down to 10 because 2 agencies did not have enough
investments that met our criteria (as defined in the following text).
[Footnote 22] We then excluded agencies that were assessed in our
previous review of the Dashboard.[Footnote 23] As a result, we
selected the Departments of Homeland Security (DHS), Transportation
(DOT), the Treasury, and Veterans Affairs (VA), and the Social
Security Administration (SSA). In selecting the specific investments
at each agency, we identified the 10 largest investments that,
according to the fiscal year 2011 budget, were spending more than half
of their budget on IT development, modernization, and enhancement
work. To narrow this list, we excluded investments whose four
different Dashboard ratings (overall, cost, schedule, and chief
information officer) were generally "red" because they were likely
already receiving significant scrutiny. We then selected 2 investments
per agency. As part of this selection process, we considered the
following: investments that use earned value management techniques to
monitor cost and schedule performance, and investments whose four
different Dashboard ratings appeared to be in conflict (e.g., cost and
schedule ratings were "green," yet the overall rating was "red"). The
10 final investments were DHS's U.S. Citizenship and Immigration
Service (USCIS)-Transformation program and U.S. Coast Guard-Command,
Control, Communications, Computers, Intelligence, Surveillance &
Reconnaissance (C4ISR) program; DOT's Automatic Dependent Surveillance-
Broadcast system and En Route Automation Modernization system;
Treasury's Modernized e-File system and Payment Application
Modernization investment; VA's HealtheVet Core and Medical Legacy
investments; and SSA's Disability Case Processing System and
Intelligent Disability program. The 5 agencies account for 22 percent
of the planned IT spending for fiscal year 2011. The 10 investments
selected for case study represent about $1.27 billion in total planned
spending in fiscal year 2011.
To assess the accuracy of the cost and schedule performance ratings on
the Dashboard, we evaluated earned value data of 7 of the selected
investments to determine their current cost and schedule performances
and compared them with the performance ratings on the Dashboard.
[Footnote 24] The investment earned value data were contained in
contractor earned value management performance reports obtained from
the programs. To perform the current performance analysis, we averaged
the cost and schedule variances over the last 6 months and compared
the averages with the performance ratings on the Dashboard. To assess
the accuracy of the cost data, we compared them with data from other
available supporting program documents, including program management
reports and inspector general reports; electronically tested the data
to identify obvious problems with completeness or accuracy; and
interviewed agency and program officials about the earned value
management systems. For the purposes of this report, we determined
that the cost data for these 7 investments were sufficiently reliable.
For the 3 remaining investments, we did not use earned value data
because the investments either did not measure performance using
earned value management or the earned value data were determined to be
insufficiently reliable.[Footnote 25] Instead, we used other program
documentation, such as inspector general reports and internal
performance management system performance ratings, to assess the
accuracy of the cost and schedule ratings on the Dashboard. We did not
test the adequacy of the agency or contractor cost-accounting systems.
Our evaluation of these cost data was based on what we were told by
each agency and the information it could provide.
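The variance analysis described above can be illustrated with a short sketch. This is not GAO's actual computation; the cost and schedule variance formulas are the standard earned value management definitions, and the percentage conventions (denominators vary across EVM references) and the sample data are assumptions for illustration only.

```python
from statistics import mean

def variances_pct(ev, ac, pv):
    """Standard EVM variances, expressed as percentages.
    CV% = (EV - AC) / EV; SV% = (EV - PV) / PV.
    (Denominator conventions differ among EVM references;
    these are common choices, assumed here for illustration.)"""
    cv = 100.0 * (ev - ac) / ev
    sv = 100.0 * (ev - pv) / pv
    return cv, sv

def six_month_average(monthly):
    """Average the monthly cost and schedule variance percentages
    over the most recent six reporting periods, mirroring the
    6-month averaging described in the methodology."""
    recent = monthly[-6:]
    cvs, svs = zip(*(variances_pct(m["ev"], m["ac"], m["pv"])
                     for m in recent))
    return mean(cvs), mean(svs)

# Illustrative (invented) data: an investment running over cost
# and behind schedule in every recent month.
data = [{"ev": 100.0, "ac": 110.0, "pv": 105.0}] * 8
avg_cv, avg_sv = six_month_average(data)
print(round(avg_cv, 1), round(avg_sv, 1))  # -10.0 -4.8
```

The averaged percentages would then be compared against the rating thresholds used by the Dashboard to judge whether a "green," "yellow," or "red" rating reflects current performance.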
We also interviewed officials from OMB and the selected agencies and
reviewed OMB guidance to obtain additional information on OMB's and
agencies' efforts to ensure the accuracy of the data used to rate
investment performance on the Dashboard. We used the information
provided by OMB and agency officials to identify the factors
contributing to inaccurate cost and schedule performance ratings on
the Dashboard.
We conducted this performance audit from July 2010 to March 2011 at
the selected agencies' offices in the Washington, D.C., metropolitan
area. Our work was done in accordance with generally accepted
government auditing standards. Those standards require that we plan
and perform the audit to obtain sufficient, appropriate evidence to
provide a reasonable basis for our findings and conclusions based on
our audit objectives. We believe that the evidence obtained provides a
reasonable basis for our findings and conclusions based on our audit
objectives.
[End of section]
Appendix II: Selected Investment Descriptions:
Below are descriptions of each of the selected investments that are
included in this review.
Department of Homeland Security:
USCIS-Transformation:
USCIS-Transformation is a bureauwide program to move from a paper-
based filing system to a centralized, consolidated, electronic
adjudication filing system.
C4ISR:
The C4ISR Common Operating Picture collects and fuses relevant
information for Coast Guard commanders to allow them to efficiently
exercise authority, while directing and monitoring all assigned forces
and first responders, across the range of Coast Guard operations.
Department of Transportation:
Automatic Dependent Surveillance-Broadcast:
The Automatic Dependent Surveillance-Broadcast system is intended to
be an underlying technology in the Federal Aviation Administration's
plan to transform air traffic control from the current radar-based
system to a satellite-based system. The Automatic Dependent
Surveillance-Broadcast system is to bring the precision and
reliability of satellite-based surveillance to the nation's skies.
En Route Automation Modernization:
The En Route Automation Modernization system is to replace the current
computer system used at the Federal Aviation Administration's high-
altitude en route centers. The current system is considered the
backbone of the nation's airspace system and processes flight radar
data, provides communications, and generates display data to air
traffic controllers.
Department of the Treasury:
Modernized e-File:
The current Modernized e-File system is a Web-based platform that
supports electronic tax returns and annual information returns for
large corporations and certain tax-exempt organizations, as well as
individual Form 1040 and other schedules and supporting forms.
[Footnote 26] This system is being updated to include the electronic
filing of the more than 120 remaining 1040 forms and schedules.
Combining these efforts is intended to streamline tax return filing
processes and reduce the costs associated with paper tax returns.
Payment Application Modernization:
The Payment Application Modernization investment is an effort to
modernize the current mainframe-based software applications that are
used to disburse approximately 1 billion federal payments annually.
The existing payment system is a configuration of numerous software
applications that generate check, wire transfer, and Automated
Clearing House payments for federal program agencies, including the
Social Security Administration, Internal Revenue Service, Department
of Veterans Affairs, and others.
Department of Veterans Affairs:
HealtheVet Core:
HealtheVet Core was a set of initiatives to improve health care
delivery, provide the platform for health information sharing, and
update outdated technology. The investment was to support veterans,
their beneficiaries, and providers by advancing the use of health care
information and leading edge IT to provide a patient-centric,
longitudinal, computable health record. According to department
officials, the HealtheVet Core investment was "stopped" in August 2010.
Medical Legacy:
The Medical Legacy program is an effort to provide software
applications necessary to maintain and modify the department's
Veterans Health Information Systems and Technology Architecture.
Social Security Administration:
Disability Case Processing System:
The Disability Case Processing System is intended to provide common
functionality and consistency to support the business processes of
each state's Disability Determination Services. Ultimately, it is to
provide analysis functionality, integrate health IT, improve case
processing, simplify maintenance, and reduce infrastructure growth
costs.
Intelligent Disability:
The Intelligent Disability program is intended to reduce the backlog
of disability claims, develop an electronic case processing system,
and support efficiencies in the claims process.
Table 2 provides additional details for each of the selected
investments in our review.
Table 2: Investment Management Details:
Agency: DHS;
Bureau: U.S. Coast Guard;
Investment name: C4ISR;
Investment start date: 06/30/2004;
Investment end date: 08/31/2029;
Prime contractor/developer: Integrated Coast Guard Systems.
Agency: DHS;
Bureau: Citizenship and Immigration Service;
Investment name: USCIS-Transformation;
Investment start date: 10/01/2007;
Investment end date: 09/30/2022;
Prime contractor/developer: IBM.
Agency: DOT;
Bureau: Federal Aviation Administration;
Investment name: Automatic Dependent Surveillance-Broadcast;
Investment start date: 01/03/2006;
Investment end date: 09/30/2035;
Prime contractor/developer: ITT.
Agency: DOT;
Bureau: Federal Aviation Administration;
Investment name: En Route Automation Modernization;
Investment start date: 10/01/2000;
Investment end date: 09/30/2020;
Prime contractor/developer: Lockheed Martin.
Agency: Treasury;
Bureau: Internal Revenue Service;
Investment name: Modernized e-File;
Investment start date: 08/2002;
Investment end date: 09/30/2020;
Prime contractor/developer: Computer Sciences Corporation and IBM.
Agency: Treasury;
Bureau: Financial Management Service;
Investment name: Payment Application Modernization;
Investment start date: 10/01/2005;
Investment end date: 09/30/2014;
Prime contractor/developer: Federal Reserve Bank of Kansas City.
Agency: VA;
Bureau: Agencywide;
Investment name: HealtheVet Core;
Investment start date: 10/01/2008;
Investment end date: 08/2010;
Prime contractor/developer: Numerous.
Agency: VA;
Bureau: Agencywide;
Investment name: Medical Legacy;
Investment start date: 10/01/2008;
Investment end date: No end date;
Prime contractor/developer: Numerous.
Agency: SSA;
Bureau: Agencywide;
Investment name: Disability Case Processing System;
Investment start date: 10/01/2008;
Investment end date: 09/30/2016;
Prime contractor/developer: SSA.
Agency: SSA;
Bureau: Agencywide;
Investment name: Intelligent Disability;
Investment start date: 10/01/2006;
Investment end date: 09/30/2016;
Prime contractor/developer: SSA.
Source: OMB's Dashboard and data from program officials.
[End of table]
[End of section]
Appendix III: Comments from the Department of Veterans Affairs:
Department Of Veterans Affairs:
Washington DC 20420:
February 1, 2011:
Mr. David A. Powner:
Director:
Information Technology Management Issues:
U.S. Government Accountability Office:
441 G Street, NW:
Washington, DC 20548:
Dear Mr. Powner:
The Department of Veterans Affairs (VA) has reviewed the Government
Accountability Office's (GAO) draft report, "Information Technology:
OMB Has Made Improvements to Its Dashboard, but Further Work Needed to
Ensure Data Accuracy" (GAO-11-262), and generally agrees with GAO's
conclusions and concurs with GAO's recommendations to the Department.
The enclosure specifically addresses GAO's recommendations. VA
appreciates the opportunity to comment on your draft report.
Sincerely,
Signed by:
John R. Gingrich:
Chief of Staff:
Enclosure:
[End of letter]
Enclosure:
Department of Veterans Affairs (VA) Comments to Government
Accountability Office (GAO) Draft Report:
Information Technology: OMB Has Made Improvements to Its Dashboard, but
Further Work Needed to Ensure Data Accuracy (GAO-11-262):
GAO Recommendations: To better ensure that the Dashboard provides
accurate cost and schedule performance ratings, we are recommending
that the Secretary of the Department of Veterans Affairs direct the
CIO to:
Recommendation 1: Comply with OMB's guidance on updating the CIO
rating as soon as new information becomes available that impacts the
assessment of a given investment, including when an investment is in
the process of a rebaseline.
VA Response: Concur. The Office of Information and Technology (OIT)
will comply with OMB's June 2010 guidance on updating baseline changes
to the Dashboard within 30 days of internal approval. Efforts are
underway to synchronize the projects posted to the OMB Dashboard with
VA's internal Program Management and Accountability System (PMAS).
PMAS tracks the current status of IT projects within VA, and tracks
changes to baselines approved by the CIO. The two investments
identified in this report, HealtheVet Core and Medical Legacy, consist
of many projects, each with differing status. Our revisions to the OMB
Dashboard will extract the necessary information from PMAS, so that
our reporting is consistent. Target Completion Date: April 1, 2011.
Recommendation 2: Work with Medical Legacy officials to comply with
OMB's guidance on updating investment cost and schedule data on the
Dashboard at least monthly.
VA Response: Concur. OIT will comply with OMB's guidance on updating
investment cost and schedule data on the Dashboard at least every 30
days. During the period of GAO's review, VA was reprioritizing much of
its IT workload and responsibilities, and projects were undergoing
constant change. The reprioritization has been completed and it should
be easier for VA to update the OMB Dashboard on a regular basis.
Target Completion Date: April 1, 2011.
Recommendation 3: Ensure Medical Legacy investment data submitted to the
Dashboard are consistent with the investment's internal performance
information.
VA Response: Concur. Unfortunately, we do not have an automated
interface between the OMB Dashboard and PMAS. OIT will have to develop
the means to extract PMAS data into the xml file format required by
OMB. Until this is completed, updating the OMB Dashboard for several
hundred projects on a monthly basis will continue to be manual. Target
Completion Date: April 1, 2011.
[End of section]
Appendix IV: Comments from the Social Security Administration:
Social Security:
Office of the Commissioner:
Social Security Administration:
Baltimore, MD 21235-0001:
Mr. David Powner:
Director, Information Technology Management Issues:
United States Government Accountability Office:
441 G Street, NW:
Washington, D.C. 20548:
Dear Mr. Powner:
Thank you for the opportunity to review your draft report. Our
response is enclosed.
If you have any questions, please contact me or have your staff
contact Chris Molander, Senior Advisor, Audit Management and Liaison
Staff, at (410) 965-7401.
Sincerely,
Signed by:
Dean S. Landis:
Deputy Chief of Staff:
Enclosure:
[End of letter]
Social Security Administration Comments On The Government
Accountability Office Draft Report, "Information Technology: OMB Has
Made Improvements To Its Dashboard, But Further Work Needed To Ensure
Data Accuracy" (GAO-11-262):
Thank you for the opportunity to review the subject report. We offer
the following comments.
Response To Recommendation:
You provide one recommendation for the Social Security Administration:
"... direct the Chief Information Officer to ensure that data
submissions to the Dashboard include accurate investment information
for all required fields."
Response:
We agree and recognize that in the past we have had data quality
issues on the Dashboard. We have taken corrective actions to prevent
future errors.
Other Comments:
Page 10, footnote 12:
"OMB, Memorandum for Chief Information Officers: Improving Information
Technology (IT) Project Planning and Execution, M-05-23 (Washington,
D.C.: August 4, 2005)."
Comment:
The referenced OMB guidance is obsolete. OMB memorandum M-05-23
mandated use of American National Standards Institute (ANSI)-compliant
Earned Value Management (EVM) for managing and measuring projects.
OMB M-10-27, dated June 28, 2010, replaced OMB M-05-23 and requires
ANSI-compliant EVM for managing major contracts, but stops short of
mandating it for government efforts. The memorandum only requires use
of "a performance management system." We use ANSI-compliant procedures
and formulas for managing and measuring our major projects, but most
other agencies and the Federal IT Dashboard do not.
Page 26, seventh bullet:
* "Develop cost and schedule rating calculations that better reflect
current investment performance."
Comment:
While this is a good recommendation, it stops short of recommending
ANSI compliant EVM formulas and does not address the issue of OMB and
the various agencies using different formulas for calculating
variances. We believe that government-wide standards, preferably ANSI,
would improve the quality of Dashboard reporting.
[End of section]
Appendix V: GAO Contact and Staff Acknowledgments:
GAO Contact:
David A. Powner at (202) 512-9286 or pownerd@gao.gov:
Staff Acknowledgments:
In addition to the contact named above, the following staff also made
key contributions to this report: Carol Cha, Assistant Director;
Shannin O'Neill, Assistant Director; Alina Johnson; Emily Longcore;
Lee McCracken; and Kevin Walsh.
[End of section]
Footnotes:
[1] GAO, Information Technology: OMB's Dashboard Has Increased
Transparency and Oversight, but Improvements Needed, [hyperlink,
http://www.gao.gov/products/GAO-10-701] (Washington, D.C.: July 16,
2010); Information Technology: Management and Oversight of Projects
Totaling Billions of Dollars Need Attention, [hyperlink,
http://www.gao.gov/products/GAO-09-624T] (Washington, D.C.: Apr. 28,
2009); Information Technology: OMB and Agencies Need to Improve
Planning, Management, and Oversight of Projects Totaling Billions of
Dollars, [hyperlink, http://www.gao.gov/products/GAO-08-1051T]
(Washington, D.C.: July 31, 2008); Information Technology: Further
Improvements Needed to Identify and Oversee Poorly Planned and
Performing Projects, [hyperlink,
http://www.gao.gov/products/GAO-07-1211T] (Washington, D.C.: Sept. 20,
2007); Information Technology: Improvements Needed to More Accurately
Identify and Better Oversee Risky Projects Totaling Billions of
Dollars, [hyperlink, http://www.gao.gov/products/GAO-06-1099T]
(Washington, D.C.: Sept. 7, 2006); Information Technology: Agencies
and OMB Should Strengthen Processes for Identifying and Overseeing
High Risk Projects, [hyperlink,
http://www.gao.gov/products/GAO-06-647] (Washington, D.C.: June 15,
2006).
[2] "Major IT investment" means a system or an acquisition requiring
special management attention because it has significant importance to
the mission or function of the agency, a component of the agency, or
another organization; is for financial management and obligates more
than $500,000 annually; has significant program or policy
implications; has high executive visibility; has high development,
operating, or maintenance costs; is funded through other than direct
appropriations; or is defined as major by the agency's capital
planning and investment control process.
[3] [hyperlink, http://www.gao.gov/products/GAO-10-701]. The five
departments included in this review were the Departments of
Agriculture, Defense, Energy, Health and Human Services, and Justice.
[4] The 10 investments are DHS's Transformation program at United
States Citizenship and Immigration Service (USCIS) and Command,
Control, Communications, Computers, Intelligence, Surveillance &
Reconnaissance (C4ISR); DOT's Automatic Dependent Surveillance-
Broadcast and En Route Automation Modernization; Treasury's Modernized
e-File and Payment Application Modernization; VA's HealtheVet Core and
Medical Legacy; and SSA's Disability Case Processing System and
Intelligent Disability. See appendix II for descriptions of each
investment.
[5] 40 U.S.C. § 11302(c).
[6] 44 U.S.C. § 3606.
[7] Generally speaking, e-government refers to the use of IT,
particularly Web-based Internet applications, to enhance the access to
and delivery of government information and service to citizens, to
business partners, to employees, and among agencies at all levels of
government.
[8] [hyperlink, http://www.gao.gov/products/GAO-09-624T]; GAO,
Information Technology: Treasury Needs to Better Define and Implement
Its Earned Value Management Policy, [hyperlink,
http://www.gao.gov/products/GAO-08-951] (Washington, D.C.: Sept. 22,
2008); GAO-07-1211T; GAO-06-1099T; [hyperlink,
http://www.gao.gov/products/GAO-06-647]; Information Technology: OMB
Can Make More Effective Use of Its Investment Reviews, [hyperlink,
http://www.gao.gov/products/GAO-05-276] (Washington, D.C.: Apr. 15,
2005); and Air Traffic Control: FAA Uses Earned Value Techniques to
Help Manage Information Technology Acquisitions, but Needs to Clarify
Policy and Strengthen Oversight, [hyperlink,
http://www.gao.gov/products/GAO-08-756] (Washington, D.C.: July 18,
2008).
[9] [hyperlink, http://www.gao.gov/products/GAO-05-276].
[10] [hyperlink, http://www.gao.gov/products/GAO-06-647].
[11] Exhibit 53s list all of the IT investments and their associated
costs within a federal organization. An Exhibit 300 is also called the
Capital Asset Plan and Business Case. It is used to justify resource
requests for major IT investments and is intended to enable an agency
to demonstrate to its own management, as well as to OMB, that a major
investment is well planned.
[12] OMB, Memorandum for Chief Information Officers: Information
Technology Investment Baseline Management Policy, M-10-27 (Washington,
D.C.: June 28, 2010).
[13] A performance measurement baseline represents the cumulative
value of the planned work over time and represents the formal plan for
accomplishing all work in a certain time and at a specific cost.
[14] [hyperlink, http://www.gao.gov/products/GAO-10-701]. The agencies
in this review included the Departments of Agriculture, Defense,
Energy, Health and Human Services, and Justice.
[15] For the purposes of OMB's Dashboard, activities are used to
measure cost and schedule performance and represent one level of the
investment's work breakdown structure, generally level 3.
[16] OMB, 25 Point Implementation Plan to Reform Federal Information
Technology Management (Washington, D.C., 2010).
[17] OMB, BY 2012 IT Investment Submission Guidelines & Instructions,
(Washington, D.C.: Sept. 13, 2010). This guidance supplements previous
OMB rebaselining guidance contained in OMB's M-10-27.
[18] Treasury's Payment Application Modernization investment was not
included in our analysis because the underlying cost and schedule
performance data were not sufficiently reliable. Specifically, an
independent verification and validation assessment of Payment
Application Modernization's earned value management system, completed
in January 2010, found that the system (the primary source of data
reported to the Dashboard) did not adequately meet Treasury's
standards.
[19] GAO, NextGen Air Transportation System: FAA's Metrics Can Be Used
to Report on Status of Individual Programs, but Not of Overall NextGen
Implementation or Outcomes, [hyperlink,
http://www.gao.gov/products/GAO-10-629] (Washington, D.C.: July 27,
2010).
[20] M-10-27.
[21] M-10-27.
[22] We excluded the Department of Commerce and the National
Aeronautics and Space Administration.
[23] [hyperlink, http://www.gao.gov/products/GAO-10-701]. The agencies
in this review were the Departments of Agriculture, Defense, Energy,
Health and Human Services, and Justice.
[24] The 7 investments are DHS's Transformation program at USCIS and
C4ISR; DOT's Automatic Dependent Surveillance-Broadcast and En Route
Automation Modernization; Treasury's Modernized e-File; and SSA's
Disability Case Processing System and Intelligent Disability.
[25] The 3 investments are Treasury's Payment Application
Modernization and VA's HealtheVet Core and Medical Legacy. During the
course of our review, VA indicated that earned value management was
not used at the agency; however, we kept these two investments in our
review because the department was able to provide comparable
performance information for evaluation.
[26] The Form 1040 is the Internal Revenue Service's form for U.S.
individual income tax returns.
[End of section]
GAO's Mission:
The Government Accountability Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the performance
and accountability of the federal government for the American people.
GAO examines the use of public funds; evaluates federal programs and
policies; and provides analyses, recommendations, and other assistance
to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its core
values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each
weekday, GAO posts newly released reports, testimony, and
correspondence on its Web site. To have GAO e-mail you a list of newly
posted products every afternoon, go to [hyperlink, http://www.gao.gov]
and select "E-mail Updates."
Order by Phone:
The price of each GAO publication reflects GAO's actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO's Web site,
[hyperlink, http://www.gao.gov/ordering.htm].
Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional
information.
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Congressional Relations:
Ralph Dawn, Managing Director, dawnr@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, D.C. 20548:
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548: