IT Dashboard
Accuracy Has Improved, and Additional Efforts Are Under Way to Better Inform Decision Making
GAO ID: GAO-12-210, November 7, 2011
Each year the federal government spends billions of dollars on information technology (IT) investments. Given the importance of program oversight, the Office of Management and Budget (OMB) established a public website, referred to as the IT Dashboard, that provides detailed information on about 800 federal IT investments, including assessments of actual performance against cost and schedule targets (referred to as ratings). According to OMB, these data are intended to provide both a near-real-time and historical perspective of performance. In the third of a series of Dashboard reviews, GAO was asked to examine the accuracy of the Dashboard's cost and schedule performance ratings. To do so, GAO compared the performance of eight major investments undergoing development from four agencies with large IT budgets (the Departments of Commerce, the Interior, and State, as well as the General Services Administration) against the corresponding ratings on the Dashboard, and interviewed OMB and agency officials.
Since GAO's first report in July 2010, the accuracy of investment ratings has improved because of OMB's refinement of the Dashboard's cost and schedule calculations. Most of the Dashboard's cost and schedule ratings for the eight selected investments were accurate; however, they did not sufficiently emphasize recent performance for informed oversight and decision making.

(1) Cost ratings were accurate for four of the investments that GAO reviewed, and schedule ratings were accurate for seven. In general, the number of discrepancies found in GAO's reviews has decreased. In each case where GAO found rating discrepancies, the Dashboard's ratings showed poorer performance than GAO's assessment. Reasons for inaccurate Dashboard ratings included missing or incomplete agency data submissions, erroneous data submissions, and inconsistent investment baseline information. In all cases, the selected agencies found and corrected these inaccuracies in subsequent Dashboard data submissions. Such continued diligence by agencies to report complete and timely data will help ensure that the Dashboard's performance ratings are accurate. In the case of the General Services Administration, officials did not disclose that performance data on the Dashboard were unreliable for one investment because of an ongoing baseline change. Without proper disclosure of pending baseline changes, OMB and other external oversight bodies may not have the appropriate information needed to make informed decisions.

(2) While the Dashboard's cost and schedule ratings provide a cumulative view of performance, they did not emphasize current performance--which is needed to meet OMB's goal of reporting near-real-time performance. GAO's past work has shown cost and schedule performance information from the most recent 6 months to be a reliable benchmark for providing a near-real-time perspective on investment status. By combining recent and historical performance, the Dashboard's ratings may mask the current status of the investment, especially for lengthy acquisitions. GAO found that this discrepancy between cumulative and current performance ratings was reflected in two of the selected investments. For example, a Department of the Interior investment's Dashboard cost rating indicated normal performance from December 2010 through March 2011, whereas GAO's analysis of current performance showed that cost performance needed attention for those months. If fully implemented, OMB's recent and ongoing changes to the Dashboard, including new cost and schedule rating calculations and updated investment baseline reporting, should address this issue. These Dashboard changes could be important steps toward improving insight into current performance and the utility of the Dashboard for effective executive oversight. GAO plans to evaluate the new version of the Dashboard once it is publicly available in 2012.

GAO is recommending that the General Services Administration disclose on the Dashboard when one of its investments is in the process of a rebaseline. Since GAO previously recommended that OMB improve how it rates investments relative to current performance, it is not making further recommendations. The General Services Administration agreed with the recommendation. OMB provided technical comments, which GAO incorporated as appropriate.
Recommendations
Our recommendations from this work are listed below with a Contact for more information. Status will change from "In process" to "Open," "Closed - implemented," or "Closed - not implemented" based on our follow-up work.
Director:
David A. Powner
Team:
Government Accountability Office: Information Technology
Phone:
(202) 512-9286
GAO-12-210, IT Dashboard: Accuracy Has Improved, and Additional Efforts Are Under Way to Better Inform Decision Making
This is the accessible text file for GAO report number GAO-12-210
entitled 'IT Dashboard: Accuracy Has Improved, and Additional Efforts
Are Under Way to Better Inform Decision Making' which was released on
December 7, 2011.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as
part of a longer term project to improve GAO products' accessibility.
Every attempt has been made to maintain the structural and data
integrity of the original printed product. Accessibility features,
such as text descriptions of tables, consecutively numbered footnotes
placed at the end of the file, and the text of agency comment letters,
are provided but may not exactly duplicate the presentation or format
of the printed version. The portable document format (PDF) file is an
exact electronic replica of the printed version. We welcome your
feedback. Please E-mail your comments regarding the contents or
accessibility features of this document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
United States Government Accountability Office:
GAO:
Report to Congressional Requesters:
November 2011:
IT Dashboard:
Accuracy Has Improved, and Additional Efforts Are Under Way to Better
Inform Decision Making:
GAO-12-210:
GAO Highlights:
Highlights of GAO-12-210, a report to congressional requesters.
Why GAO Did This Study:
Each year the federal government spends billions of dollars on
information technology (IT) investments. Given the importance of
program oversight, the Office of Management and Budget (OMB)
established a public website, referred to as the IT Dashboard, that
provides detailed information on about 800 federal IT investments,
including assessments of actual performance against cost and schedule
targets (referred to as ratings). According to OMB, these data are
intended to provide both a near-real-time and historical perspective
of performance. In the third of a series of Dashboard reviews, GAO was
asked to examine the accuracy of the Dashboard's cost and schedule
performance ratings. To do so, GAO compared the performance of eight
major investments undergoing development from four agencies with large
IT budgets (the Departments of Commerce, the Interior, and State, as
well as the General Services Administration) against the corresponding
ratings on the Dashboard, and interviewed OMB and agency officials.
What GAO Found:
Since GAO's first report in July 2010, the accuracy of investment
ratings has improved because of OMB's refinement of the Dashboard's
cost and schedule calculations. Most of the Dashboard's cost and
schedule ratings for the eight selected investments were accurate;
however, they did not sufficiently emphasize recent performance for
informed oversight and decision making.
* Cost ratings were accurate for four of the investments that GAO
reviewed, and schedule ratings were accurate for seven. In general,
the number of discrepancies found in GAO's reviews has decreased. In
each case where GAO found rating discrepancies, the Dashboard's
ratings showed poorer performance than GAO's assessment. Reasons for
inaccurate Dashboard ratings included missing or incomplete agency
data submissions, erroneous data submissions, and inconsistent
investment baseline information. In all cases, the selected agencies
found and corrected these inaccuracies in subsequent Dashboard data
submissions. Such continued diligence by agencies to report complete
and timely data will help ensure that the Dashboard's performance
ratings are accurate. In the case of the General Services
Administration, officials did not disclose that performance data on
the Dashboard were unreliable for one investment because of an ongoing
baseline change. Without proper disclosure of pending baseline
changes, OMB and other external oversight bodies may not have the
appropriate information needed to make informed decisions.
* While the Dashboard's cost and schedule ratings provide a cumulative
view of performance, they did not emphasize current performance--which
is needed to meet OMB's goal of reporting near-real-time performance.
GAO's past work has shown cost and schedule performance information
from the most recent 6 months to be a reliable benchmark for providing
a near-real-time perspective on investment status. By combining recent
and historical performance, the Dashboard's ratings may mask the
current status of the investment, especially for lengthy acquisitions.
GAO found that this discrepancy between cumulative and current
performance ratings was reflected in two of the selected investments.
For example, a Department of the Interior investment's Dashboard cost
rating indicated normal performance from December 2010 through March
2011, whereas GAO's analysis of current performance showed that cost
performance needed attention for those months. If fully implemented,
OMB's recent and ongoing changes to the Dashboard, including new cost
and schedule rating calculations and updated investment baseline
reporting, should address this issue. These Dashboard changes could be
important steps toward improving insight into current performance and
the utility of the Dashboard for effective executive oversight. GAO
plans to evaluate the new version of the Dashboard once it is publicly
available in 2012.
What GAO Recommends:
GAO is recommending that the General Services Administration disclose
on the Dashboard when one of its investments is in the process of a
rebaseline. Since GAO previously recommended that OMB improve how it
rates investments relative to current performance, it is not making
further recommendations. The General Services Administration agreed
with the recommendation. OMB provided technical comments, which GAO
incorporated as appropriate.
View [hyperlink, http://www.gao.gov/products/GAO-12-210] or key
components. For more information, contact David A. Powner at (202) 512-
9286 or pownerd@gao.gov.
[End of section]
Contents:
Letter:
Background:
Most Dashboard Ratings Were Accurate, but Did Not Emphasize Recent
Performance:
Conclusions:
Recommendation for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Objective, Scope, and Methodology:
Appendix II: Selected Investment Descriptions:
Appendix III: Comments from the Department of Commerce:
Appendix IV: Comments from the General Services Administration:
Appendix V: GAO Contact and Staff Acknowledgments:
Tables:
Table 1: Assessment of Selected Investments' Cost and Schedule Ratings:
Table 2: Causes of Inaccurate Ratings on the Dashboard:
Table 3: Investment Management Details:
Figures:
Figure 1: Dashboard Cost and Schedule Ratings Scale:
Figure 2: Overall Performance Ratings of Major IT Investments on the
Dashboard:
Abbreviations:
CIO: chief information officer:
GSA: General Services Administration:
IT: information technology:
OMB: Office of Management and Budget:
[End of section]
United States Government Accountability Office:
Washington, DC 20548:
November 7, 2011:
Congressional Requesters:
Billions of taxpayer dollars are spent on information technology (IT)
investments each year; federal IT spending reported to the Office of
Management and Budget (OMB) totaled approximately $79 billion in
fiscal year 2011. During the past several years, we have issued
multiple reports and testimonies and made numerous recommendations to
OMB to improve the transparency, oversight, and management of the
federal government's IT investments.[Footnote 1] In June 2009, OMB
deployed a public website, known as the IT Dashboard, which provides
detailed information on federal agencies' major IT investments,
including assessments of actual performance against cost and schedule
targets (referred to as ratings) for approximately 800 major federal
IT investments.[Footnote 2] The Dashboard aims to improve the
transparency and oversight of these investments.
In July 2010, we completed our first review of the Dashboard and
reported that the cost and schedule ratings were not always accurate
because of limitations with OMB's calculations.[Footnote 3] We
recommended that OMB report to Congress on the effect of its planned
Dashboard calculation changes on the accuracy of performance
information and provide guidance to agencies that standardizes
activity reporting.
In March 2011, we completed our second review of the Dashboard and
again reported that the cost and schedule ratings were not always
accurate. Specifically, this was due to weaknesses in agencies'
practices and limitations with OMB's calculations.[Footnote 4] We
recommended that selected agencies take steps to improve the accuracy
and reliability of Dashboard information and that OMB improve how it
rates investments relative to current performance and schedule
variance.
This is the third report in our series of Dashboard reviews and
responds to your request that we examine the accuracy of the cost and
schedule performance ratings on the Dashboard for selected
investments. To accomplish this objective, we reviewed 4 agencies with
large IT budgets--the Departments of Commerce, the Interior, and
State, as well as the General Services Administration (GSA)--after
excluding the 10 agencies included in the first two Dashboard reviews.
[Footnote 5] The 4 agencies account for about 7 percent of IT spending
for fiscal year 2011. We then selected eight major investments
undergoing development, which represent about $486 million in total
spending for fiscal year 2011. We analyzed monthly cost and schedule
performance reports, program management documents, and operational
analyses for the eight investments to assess program performance. We
then compared our analyses of investment performance against the
corresponding ratings on the Dashboard to determine if the ratings
were accurate. Additionally, we interviewed officials from OMB and the
agencies to obtain further information on their efforts to ensure the
accuracy of the data used to rate investment performance on the
Dashboard. We did not test the adequacy of the agency or contractor
cost-accounting systems. Our evaluation of these cost data was based
on the documentation the agencies provided.
We conducted this performance audit from February 2011 to November
2011 in accordance with generally accepted government auditing
standards. Those standards require that we plan and perform the audit
to obtain sufficient, appropriate evidence to provide a reasonable
basis for our findings and conclusions based on our audit objective.
We believe that the evidence obtained provides a reasonable basis for
our findings and conclusions based on our audit objective. Further
details of our objective, scope, and methodology are provided in
appendix I.
Background:
Each year, OMB and federal agencies work together to determine how
much the government plans to spend on IT investments and how these
funds are to be allocated. In fiscal year 2011, government IT spending
reported to OMB totaled approximately $79 billion. OMB plays a key
role in helping federal agencies manage their investments by working
with them to better plan, justify, and determine how much they need to
spend on projects and how to manage approved projects.
To assist agencies in managing their investments, Congress enacted the
Clinger-Cohen Act of 1996, which requires OMB to establish processes
to analyze, track, and evaluate the risks and results of major capital
investments in information systems made by federal agencies and report
to Congress on the net program performance benefits achieved as a
result of these investments.[Footnote 6] Further, the act places
responsibility for managing investments with the heads of agencies and
establishes chief information officers (CIO) to advise and assist
agency heads in carrying out this responsibility. The Clinger-Cohen
Act strengthened the requirements of the Paperwork Reduction Act of
1995, which established agency responsibility for maximizing value and
assessing and managing the risks of major information systems
initiatives.[Footnote 7] The Paperwork Reduction Act also requires
that OMB develop and oversee policies, principles, standards, and
guidelines for federal agency IT functions, including periodic
evaluations of major information systems.[Footnote 8] Another key law
is the E-Government Act of 2002, which requires OMB to report annually
to Congress on the status of e-government.[Footnote 9] In these
reports, referred to as Implementation of the E-Government Act
reports, OMB is to describe the administration's use of e-government
principles to improve government performance and the delivery of
information and services to the public.
To help carry out its oversight role, in 2003, OMB established the
Management Watch List, which included mission-critical projects that
needed to improve performance measures, project management, IT
security, or overall justification for inclusion in the federal
budget. Further, in August 2005, OMB established a High-Risk List,
which consisted of projects identified by federal agencies, with the
assistance of OMB, as requiring special attention from oversight
authorities and the highest levels of agency management.
Over the past several years, we have reported and testified on OMB's
initiatives to highlight troubled IT projects, justify investments,
and use project management tools.[Footnote 10] We have made multiple
recommendations to OMB and federal agencies to improve these
initiatives to further enhance the oversight and transparency of
federal projects. Among other things, we recommended that OMB develop
a central list of projects and their deficiencies and analyze that
list to develop governmentwide and agency assessments of the progress
and risks of the investments, identifying opportunities for continued
improvement.[Footnote 11] In addition, in 2006 we also recommended
that OMB develop a single aggregate list of high-risk projects and
their deficiencies and use that list to report to Congress on progress
made in correcting high-risk problems.[Footnote 12] As a result, OMB
started publicly releasing aggregate data on its Management Watch List
and disclosing the projects' deficiencies. Furthermore, OMB issued
governmentwide and agency assessments of the projects on the
Management Watch List and identified risks and opportunities for
improvement, including in the areas of risk management and security.
OMB's Dashboard Publicizes Investment Details and Performance Status:
More recently, to further improve the transparency and oversight of
agencies' IT investments, in June 2009, OMB publicly deployed a
website, known as the IT Dashboard, which replaced the Management
Watch List and High-Risk List. It displays federal agencies' cost,
schedule, and performance data for the approximately 800 major federal
IT investments at 27 federal agencies. According to OMB, these data
are intended to provide a near-real-time perspective on the
performance of these investments, as well as a historical perspective.
Further, the public display of these data is intended to allow OMB;
other oversight bodies, including Congress; and the general public to
hold the government agencies accountable for results and progress.
The Dashboard was initially deployed in June 2009 based on each
agency's exhibit 53 and exhibit 300 submissions.[Footnote 13] After
the initial population of data, agency CIOs have been responsible for
updating cost, schedule, and performance fields on a monthly basis,
which is a major improvement from the quarterly reporting cycle OMB
previously used for the Management Watch List and High-Risk List.
For each major investment, the Dashboard provides performance ratings
on cost and schedule, a CIO evaluation, and an overall rating, which
is based on the cost, schedule, and CIO ratings. As of July 2010, the
cost rating is determined by a formula that calculates the amount by
which an investment's total actual costs deviate from the total
planned costs. Similarly, the schedule rating is the variance between
the investment's planned and actual progress to date. Figure 1
displays the rating scale and associated categories for cost and
schedule variations.
Figure 1: Dashboard Cost and Schedule Ratings Scale:
[Refer to PDF for image: illustration]
Normal:
Variance from planned costs or schedule: 0-10%;
Rating: 10-7.
Needs attention:
Variance from planned costs or schedule: 10-30%;
Rating: 7-3.
Significant concerns:
Variance from planned costs or schedule: 30-50+%;
Rating: 3-0.
Source: GAO based on OMB's Dashboard.
[End of figure]
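The scale in figure 1 can be read as a simple mapping from percent
variance to a numeric rating and category. The following Python sketch
illustrates one such reading; it is not OMB's published formula, and
the linear interpolation within each band is an assumption made only
for illustration.

def dashboard_rating(variance_pct: float) -> tuple[float, str]:
    """Return an illustrative (rating, category) pair for a percent variance."""
    bands = [
        (0.0, 10.0, 10.0, 7.0, "Normal"),
        (10.0, 30.0, 7.0, 3.0, "Needs attention"),
        (30.0, 50.0, 3.0, 0.0, "Significant concerns"),
    ]
    v = abs(variance_pct)
    for low, high, top, bottom, category in bands:
        if v <= high:
            fraction = (v - low) / (high - low)  # position within the band
            return round(top - fraction * (top - bottom), 1), category
    return 0.0, "Significant concerns"  # variance beyond 50 percent

if __name__ == "__main__":
    for pct in (5, 12, 35, 60):
        print(pct, dashboard_rating(pct))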
Each major investment on the Dashboard also includes a rating
determined by the agency CIO, which is based on his or her evaluation
of the performance of each investment. The rating is expected to take
into consideration the following criteria: risk management,
requirements management, contractor oversight, historical performance,
and human capital. This rating is to be updated when new information
becomes available that would affect the assessment of a given
investment.
Last, the Dashboard calculates an overall rating for each major
investment. This overall rating is an average of the cost, schedule,
and CIO ratings, with each representing one-third of the overall
rating. However, when the CIO's rating is lower than both the cost and
schedule ratings, the CIO's rating will be the overall rating. Figure
2 shows the overall performance ratings of the 797 major investments
on the Dashboard as of August 2011.
Figure 2: Overall Performance Ratings of Major IT Investments on the
Dashboard:
[Refer to PDF for image: pie-chart]
Normal: 67%; $22.7 billion; 536 investments;
Needs attention: 29%; $16.1 billion; 228 investments;
Significant concerns: 4%; $1.8 billion; 33 investments.
Source: OMB's Dashboard.
[End of figure]
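The overall-rating rule described just before figure 2 (an average of
the cost, schedule, and CIO ratings, except that a CIO rating lower
than both of the others becomes the overall rating) can be summarized
in a brief sketch. The function name and inputs are illustrative
assumptions, not the Dashboard's actual implementation.

def overall_rating(cost: float, schedule: float, cio: float) -> float:
    """Combine the three ratings as described in the report."""
    if cio < cost and cio < schedule:
        return cio  # a CIO rating lower than both others prevails
    return (cost + schedule + cio) / 3.0  # otherwise each counts one-third

if __name__ == "__main__":
    print(overall_rating(9.0, 8.0, 4.0))  # CIO lowest, so overall is 4.0
    print(overall_rating(5.0, 8.0, 9.0))  # simple average, about 7.3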
OMB Has Taken Steps to Address Prior GAO Recommendations on Improving
Dashboard Accuracy:
We have previously reported that the cost and schedule ratings on
OMB's Dashboard were not always accurate for selected agencies.
* In July 2010, we reviewed investments at the Departments of
Agriculture, Defense, Energy, Health and Human Services, and Justice,
and found that the cost and schedule ratings on the Dashboard were not
accurate for 4 of 8 selected investments and that the ratings did not
take into consideration current performance; specifically, the ratings
calculations factored in only completed activities.[Footnote 14] We
also found that there were large inconsistencies in the number of
investment activities that agencies report on the Dashboard. In the
report, we recommended that OMB report on the effect of planned
changes to the Dashboard and provide guidance to agencies to
standardize activity reporting. We further recommended that the
selected agencies comply with OMB's guidance to standardize activity
reporting. OMB and the Department of Energy concurred with our
recommendations, while the other selected agencies provided no
comments. In July 2010, OMB updated the Dashboard's cost and schedule
calculations to include both ongoing and completed activities.
* In March 2011, we reported that agencies and OMB need to do more to
ensure the Dashboard's data accuracy.[Footnote 15] Specifically, we
reviewed investments at the Departments of Homeland Security,
Transportation, the Treasury, and Veterans Affairs, and the Social
Security Administration. We found that cost ratings were inaccurate
for 6 of 10 selected investments and schedule ratings were inaccurate
for 9 of 10. We also found that weaknesses in agency and OMB practices
contributed to the inaccuracies on the Dashboard; for example,
agencies had uploaded erroneous data, and OMB's ratings did not
emphasize current performance. We therefore recommended that the
selected agencies provide complete and accurate data to the Dashboard
on a monthly basis and ensure that the CIOs' ratings of investments
disclose issues that could undermine the accuracy of investment data.
Further, we recommended that OMB improve how it rates investments
related to current performance and schedule variance. The selected
agencies generally concurred with our recommendation. OMB disagreed
with the recommendation to change how it reflects current investment
performance in its ratings because Dashboard data are updated on a
monthly basis. However, we maintained that current investment
performance may not always be as apparent as it should be; while data
are updated monthly, the ratings include historical data, which can
mask more recent performance.
Most Dashboard Ratings Were Accurate, but Did Not Emphasize Recent
Performance:
Most of the cost and schedule ratings on the Dashboard were accurate,
but did not provide sufficient emphasis on recent performance to
inform oversight and decision making. Performance rating discrepancies
were largely due to missing or incomplete data submissions from the
agencies. However, we generally found fewer such discrepancies than in
previous reviews, and in all cases the selected agencies found and
corrected these inaccuracies in subsequent submissions. In the case of
GSA, officials did not disclose that performance data on the Dashboard
were unreliable for one investment because of an ongoing baseline
change. Without proper disclosure of pending baseline changes, the
Dashboard will not provide the appropriate insight into investment
performance needed for near-term decision making. Additionally,
because of the Dashboard's ratings calculations, the current
performance for certain investments was not as apparent as it should
be for near-real-time reporting purposes. If fully implemented, OMB's
recent and ongoing changes to the Dashboard, including new cost and
schedule rating calculations and updated investment baseline
reporting, should address this issue. These Dashboard changes could be
important steps toward improving insight into current performance and
the utility of the Dashboard for effective executive oversight.
Most Cumulative Performance Ratings Were Accurate:
In general, the number of discrepancies we found in our reviews of
selected investments has decreased since July 2010. According to our
assessment of the eight selected investments, half had accurate cost
ratings and nearly all had accurate schedule ratings on the
Dashboard.[Footnote 16] Table 1 shows our assessment of the selected
investments during a 6-month period from October 2010 through March
2011.
Table 1: Assessment of Selected Investments' Cost and Schedule Ratings:
Agency: Commerce;
Investment: Advanced Weather Interactive Processing System;
Cost inaccuracies: No;
Schedule inaccuracies: No.
Agency: Commerce;
Investment: Geostationary Operational Environmental Satellite--Series
R Ground Segment;
Cost inaccuracies: No;
Schedule inaccuracies: No.
Agency: Interior;
Investment: Financial and Business Management System;
Cost inaccuracies: Yes;
Schedule inaccuracies: No.
Agency: Interior;
Investment: Land Satellites Data System;
Cost inaccuracies: Yes;
Schedule inaccuracies: Yes.
Agency: GSA;
Investment: Regional Business Application;
Cost inaccuracies: No;
Schedule inaccuracies: No.
Agency: GSA;
Investment: System for Tracking and Administering Real Property/Realty
Services;
Cost inaccuracies: Yes;
Schedule inaccuracies: No.
Agency: State;
Investment: Global Foreign Affairs Compensation System;
Cost inaccuracies: Yes;
Schedule inaccuracies: No.
Agency: State;
Investment: Integrated Logistics Management System;
Cost inaccuracies: No;
Schedule inaccuracies: No.
Source: GAO analysis of OMB's Dashboard and agency data.
[End of table]
As shown above, the Dashboard's cost ratings for four of the eight
selected investments were accurate, and four did not match the results
of our analyses during the period from October 2010 through March
2011. Specifically,
* State's Global Foreign Affairs Compensation System and Interior's
Land Satellites Data System investments had inaccurate cost ratings
for at least 5 months,
* GSA's System for Tracking and Administering Real Property/Realty
Services was inaccurate for 3 months, and:
* Interior's Financial and Business Management System was inaccurate
for 2 months.
In all of these cases, the Dashboard's cost ratings showed poorer
performance than our assessments. For example, State's Global Foreign
Affairs Compensation System investment's cost performance was rated
"yellow" (i.e., needs attention) in October and November 2010, and
"red" (i.e., significant concerns) from December 2010 through March
2011, whereas our analysis showed its cost performance was "green"
(i.e., normal) during those months. Additionally, GSA's System for
Tracking and Administering Real Property/Realty Services investment's
cost performance was rated "yellow" from October 2010 through December
2010, while our analysis showed its performance was "green" for those
months.
Regarding schedule, the Dashboard's ratings for seven of the eight
selected investments matched the results of our analyses over this
same 6-month period, while the ratings for one did not. Specifically,
Interior's Land Satellites Data System investment's schedule ratings
were inaccurate for 2 months; its schedule performance on the
Dashboard was rated "yellow" in November and December 2010, whereas
our analysis showed its performance was "green" for those months. As
with cost, the Dashboard's schedule ratings for this investment for
these 2 months showed poorer performance than our assessment.
There were three primary reasons for the inaccurate cost and schedule
Dashboard ratings described above: agencies did not report data to the
Dashboard or uploaded incomplete submissions, agencies reported
erroneous data to the Dashboard, and the investment baseline on the
Dashboard was not reflective of the investment's actual baseline (see
table 2).
Table 2: Causes of Inaccurate Ratings on the Dashboard:
Agency: Interior;
Investment: Financial and Business Management System;
Missing or incomplete data submissions: [Check];
Erroneous data submissions: [Empty];
Inconsistent program baseline: [Empty].
Agency: Interior;
Investment: Land Satellites Data System;
Missing or incomplete data submissions: [Check];
Erroneous data submissions: [Check];
Inconsistent program baseline: [Empty].
Agency: GSA;
Investment: System for Tracking and Administering Real Property/Realty
Services;
Missing or incomplete data submissions: [Check];
Erroneous data submissions: [Empty];
Inconsistent program baseline: [Check].
Agency: State;
Investment: Global Foreign Affairs Compensation System;
Missing or incomplete data submissions: [Check];
Erroneous data submissions: [Empty];
Inconsistent program baseline: [Empty].
Agency: Total;
Missing or incomplete data submissions: 4;
Erroneous data submissions: 1;
Inconsistent program baseline: 1.
Source: Agency officials and GAO analysis of Dashboard data.
[End of table]
* Missing or incomplete data submissions: Four selected investments
did not upload complete and timely data submissions to the Dashboard.
For example, State officials did not upload data for one of the Global
Foreign Affairs Compensation System investment's activities from
October 2010 through December 2010. According to a State official, the
department's investment management system was not properly set to
synchronize all activity data with the Dashboard. The official stated
that this issue was corrected in December 2010.
* Erroneous data submissions: One selected investment--Interior's Land
Satellites Data System--reported erroneous data to the Dashboard.
Specifically, Interior officials mistakenly reported certain
activities as fully complete rather than partially complete in data
submissions from September 2010 through December 2010. Agency
officials acknowledged the error and stated that they submitted
correct data in January and February 2011 after they realized there
was a problem.
* Inconsistent investment baseline: One selected investment--GSA's
System for Tracking and Administering Real Property/Realty Services--
reported a baseline on the Dashboard that did not match the actual
baseline tracked by the agency. In June 2010, OMB issued new guidance
on rebaselining, which stated that agencies should update investment
baselines on the Dashboard within 30 days of internal approval of a
baseline change and that this update will be considered notification
to OMB.[Footnote 17] The GSA investment was rebaselined internally in
November 2010, but the baseline on the Dashboard was not updated until
February 2011. GSA officials stated that they submitted the rebaseline
information to the Dashboard in January 2011 and thought that it had
been successfully uploaded; however, in February 2011, officials
realized that the new baseline was not on the Dashboard. GSA officials
successfully uploaded the rebaseline information in late February 2011.
Additionally, OMB's guidance states that agency CIOs should update the
CIO evaluation on the Dashboard as soon as new information becomes
available that affects the assessment of a given investment. During an
agency's internal process to update an investment baseline, the
baseline on the Dashboard will not be reflective of the current state
of the investment; thus, investment CIO ratings should disclose such
information. However, the CIO evaluation ratings for GSA's System for
Tracking and Administering Real Property/Realty Services investment
did not provide such a disclosure. Without proper disclosure of
pending baseline changes and resulting data reliability weaknesses,
OMB and other external oversight groups will not have the appropriate
information to make informed decisions about these investments.
In all of the instances where we identified inaccurate cost or
schedule ratings, agencies had independently recognized that there was
a problem with their Dashboard reporting practices and taken steps to
correct them. Such continued diligence by agencies to report accurate
and timely data will help ensure that the Dashboard's performance
ratings are accurate.
Dashboard Ratings Did Not Always Highlight Current Performance:
According to OMB, the Dashboard is intended to provide a near-real-
time perspective on the performance of all major IT investments.
Furthermore, our work has shown cost and schedule performance
information from the most recent 6 months to be a reliable benchmark
for providing this perspective on investment status.[Footnote 18] This
benchmark for current performance provides information needed by OMB
and agency executive management to inform near-term budgetary
decisions, to obtain early warning signs of impending schedule delays
and cost overruns, and to ensure that actions taken to reverse
negative performance trends are timely and effective. The use of such
a benchmark is also consistent with OMB's exhibit 300 guidelines,
which specify that project activities should be broken into segments
of 6 months or less.
In contrast, the Dashboard's cost and schedule ratings calculations
reflect a more cumulative view of investment performance dating back
to the inception of the investment. Thus, a rating for a given month
is based on information from the entire history of each investment.
While a historical perspective is important for measuring performance
over time relative to original cost and schedule targets, this
information may be dated for near-term budget and programmatic
decisions. Moreover, combining more recent and historical performance
can mask the current status of the investment. As more time elapses,
the impact of this masking effect will increase because current
performance becomes a relatively smaller factor in an investment's
cumulative rating.
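A small numeric example helps show this masking effect. The following
sketch uses invented monthly cost figures, not data from any of the
selected investments, to contrast a cumulative variance with a rolling
6-month variance.

planned = [100] * 36                 # 36 months of planned cost
actual = [100] * 30 + [115] * 6      # on plan for 30 months, then 15% over

def pct_variance(planned_costs, actual_costs):
    """Percent by which actual costs exceed planned costs."""
    return 100.0 * (sum(actual_costs) - sum(planned_costs)) / sum(planned_costs)

cumulative = pct_variance(planned, actual)        # about 2.5 percent
recent = pct_variance(planned[-6:], actual[-6:])  # 15 percent
print(f"cumulative: {cumulative:.1f}%  last 6 months: {recent:.1f}%")

In this invented case, a cumulative rating would remain in the "normal"
band even though the most recent 6 months show an overrun that "needs
attention."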
In addition to our assessment of cumulative investment performance (as
reflected in the Dashboard ratings), we determined whether the ratings
were also reflective of current performance. Our analysis showed that
two selected investments had a discrepancy between cumulative and
current performance ratings. Specifically,
* State's Global Foreign Affairs Compensation System investment's
schedule performance was rated "green" on the Dashboard from October
2010 through March 2011, whereas our analysis showed its current
performance was "yellow" for most of that time. From a cumulative
perspective, the Dashboard's ratings for this investment were accurate
(as previously discussed in this report); however, these take into
account activities dating back to 2003.
* Interior's Financial and Business Management System investment's
cost performance was rated "green" on the Dashboard from December 2010
through March 2011; in contrast, our analysis showed its current
performance was "yellow" for those months. The Dashboard's cost
ratings accurately reflected cumulative cost performance from 2003
onward.
Further analysis of the Financial and Business Management System's
schedule performance ratings on the Dashboard showed that because of
the amount of historical performance data factored into its ratings as
of July 2011, it would take a minimum schedule variance of 9 years on
the activities currently under way in order to change its rating from
"green" to "yellow," and a variance of more than 30 years before
turning "red."
We have previously recommended to OMB that it develop cost and
schedule Dashboard ratings that better reflect current investment
performance.[Footnote 19] At that time, OMB disagreed with the
recommendation, stating that real-time performance is always reflected
in the ratings since current investment performance data are uploaded
to the Dashboard on a monthly basis.
However, in September 2011, officials from OMB's Office of E-
Government & Information Technology stated that changes designed to
improve insight into current performance on the Dashboard have either
been made or are under way. If OMB fully implements these actions, the
changes should address our recommendation. Specifically,
* New project-level reporting: In July 2011, OMB issued new guidance
to agencies regarding the information that is to be reported to the
Dashboard.[Footnote 20] In particular, beginning in September 2011,
agencies are required to report data to the Dashboard at a detailed
project level, rather than at the investment level previously
required. Further, the guidance emphasizes that ongoing work
activities should be broken up and reported in increments of 6 months
or less.
* Updated investment baseline reporting: OMB officials stated that
agencies are required to update existing investment baselines to
reflect planned fiscal year 2012 activities, as well as data from the
last quarter of fiscal year 2011 onward. OMB officials stated that
historical investment data that are currently on the Dashboard will be
maintained, but plans have yet to be finalized on how these data may
be displayed on the new version of the Dashboard.
* New cost and schedule ratings calculations: OMB officials stated
that work is under way to change the Dashboard's cost and schedule
ratings calculations. Specifically, officials said that the new
calculations will emphasize ongoing work and reflect only development
efforts, not operations and maintenance activities. In combination
with the first action on defining 6-month work activities, the
calculations should result in ratings that better reflect current
performance.
OMB plans for the new version of the Dashboard to be fully viewable by
the public upon release of the President's Budget for fiscal year
2013. Once OMB implements these changes, they could be significant
steps toward improving insight into current investment performance on
the Dashboard. We plan to evaluate the new version of the Dashboard
once it is publicly available in 2012.
Conclusions:
Since our first review in July 2010, the accuracy of investment
ratings on the Dashboard has improved because of OMB's refinement of
its cost and schedule calculations, and the number of discrepancies
found in our reviews has decreased. While rating inaccuracies continue
to exist, for the discrepancies we identified, the Dashboard's ratings
generally showed poorer performance than our assessments. Reasons for
inaccurate Dashboard ratings included missing or incomplete agency
data submissions, erroneous data submissions, and inconsistent
investment baseline information. In all cases, the selected agencies
detected the discrepancies and corrected them in subsequent Dashboard
data submissions. However, in GSA's case, officials did not disclose
that performance data on the Dashboard were unreliable for one
investment because of an ongoing baseline change.
Additionally, the Dashboard's ratings calculations reflect cumulative
investment performance--a view that is important but does not meet
OMB's goal of reporting near-real-time performance. Our IT investment
management work has shown a 6-month view of performance to be a
reliable benchmark for current performance, as well as a key component
of informed executive decisions about the budget and program. OMB's
Dashboard changes could be important steps toward improving insight
into current performance and the utility of the Dashboard for
effective executive oversight.
Recommendation for Executive Action:
To better ensure that the Dashboard provides accurate cost and
schedule performance ratings, we are recommending that the
Administrator of GSA direct its CIO to comply with OMB's guidance
related to Dashboard data submissions by updating the CIO rating for a
given GSA investment as soon as new information becomes available that
affects the assessment, including when an investment is in the process
of a rebaseline. Because we have previously made recommendations
addressing the development of Dashboard ratings calculations that
better reflect current performance, we are not making additional
recommendations to OMB at this time.
Agency Comments and Our Evaluation:
We provided a draft of our report to the five agencies selected for
our review and to OMB. In written comments on the draft, Commerce's
Acting Secretary concurred with our findings. Also in written
comments, GSA's Administrator stated that GSA agreed with our finding
and recommendation and would take appropriate action. Letters from
these agencies are reprinted in appendixes III and IV. In addition, we
received oral comments from officials from OMB's Office of E-
Government & Information Technology and written comments via e-mail
from an Audit Liaison from Interior. These comments were technical in
nature and we incorporated them as appropriate. OMB and Interior
neither agreed nor disagreed with our findings. Finally, an Analyst
from Education and a Senior Management Analyst from State indicated
via e-mail that they had no comments on the draft.
As agreed with your offices, unless you publicly announce the contents
of this report earlier, we plan no further distribution until 30 days
from the report date. At that time, we will send copies of this report
to interested congressional committees; the Director of OMB; the
Secretaries of Commerce, Education, the Interior, and State; the
Administrator of GSA; and other interested parties. In addition, the
report will be available at no charge on GAO's website at [hyperlink,
http://www.gao.gov].
If you or your staff have any questions on the matters discussed in
this report, please contact me at (202) 512-9286 or pownerd@gao.gov.
Contact points for our Offices of Congressional Relations and Public
Affairs may be found on the last page of this report. GAO staff who
made major contributions to this report are listed in appendix V.
Signed by:
David A. Powner:
Director, Information Technology Management Issues:
List of Requesters:
The Honorable Joseph I. Lieberman:
Chairman:
The Honorable Susan M. Collins:
Ranking Member:
Committee on Homeland Security and Governmental Affairs:
United States Senate:
The Honorable Thomas R. Carper:
Chairman:
The Honorable Scott P. Brown:
Ranking Member:
Subcommittee on Federal Financial Management, Government Information,
Federal Services, and International Security:
Committee on Homeland Security and Governmental Affairs:
United States Senate:
The Honorable Ben Quayle:
House of Representatives:
[End of section]
Appendix I: Objective, Scope, and Methodology:
Our objective was to examine the accuracy of the cost and schedule
performance ratings on the Dashboard for selected investments. We
selected 5 agencies and 10 investments to review. To select these
agencies and investments, we used the Office of Management and
Budget's (OMB) fiscal year 2011 exhibit 53 to identify 6 agencies with
the largest information technology (IT) budgets, after excluding the
10 agencies included in our first two Dashboard reviews.[Footnote 21]
We then excluded the National Aeronautics and Space Administration
because it did not have enough investments that met our selection
criteria. As a result, we selected the Departments of Commerce,
Education, the Interior, and State, as well as the General Services
Administration (GSA).
In selecting the specific investments at each agency, we identified
the largest investments that, according to the fiscal year 2011
budget, were spending at least 25 percent of their budget on IT
development, modernization, and enhancement work. To narrow this list,
we excluded investments that, according to the fiscal year 2011
budget, were in the planning phase or were infrastructure-related. We
then selected the top 2 investments per agency.[Footnote 22] The 10
final investments were Commerce's Geostationary Operational
Environmental Satellite--Series R Ground Segment project and Advanced
Weather Interactive Processing System, Education's Integrated Partner
Management system and National Student Loan Data System, Interior's
Financial and Business Management System and Land Satellites Data
System, State's Global Foreign Affairs Compensation System and
Integrated Logistics Management System, and GSA's Regional Business
Application and System for Tracking and Administering Real
Property/Realty Services.
To assess the accuracy and currency of the cost and schedule
performance ratings on the Dashboard, we evaluated, where available,
agency or contractor documentation related to cost and schedule
performance for 8 of the selected investments to determine their
cumulative and current cost and schedule performance and compared our
ratings with the performance ratings on the Dashboard.[Footnote 23]
The analyzed investment performance-related documentation included
program management reports, internal performance management system
performance ratings, earned value management data, investment
schedules, system requirements, and operational analyses.[Footnote 24]
* To determine cumulative cost performance, we weighted our cost
performance ratings based on each investment's percentage of
development spending (represented in our analysis of the program
management reports and earned value data) and steady-state spending
(represented in our evaluation of the operational analysis), and
compared our weighted ratings with the cost performance ratings on the
Dashboard. To evaluate earned value data, we determined cumulative
cost variance for each month from October 2010 through March 2011. To
assess the accuracy of the cost data, we electronically tested the
data to identify obvious problems with completeness or accuracy, and
interviewed agency and program officials about the earned value
management systems. We did not test the adequacy of the agency or
contractor cost-accounting systems. Our evaluation of these cost data
was based on what we were told by each agency and the information it
could provide. (A minimal sketch of this weighting approach appears
after this list.)
* To determine cumulative schedule performance, we analyzed
requirements documentation to determine whether investments were on
schedule in implementing planned requirements. To perform the schedule
analysis of the earned value data, we determined the investment's
cumulative schedule variance for each month from October 2010 through
March 2011.
* To determine both current cost and schedule performance, we
evaluated investment data from the most recent 6 months of performance
for each month from October 2010 through March 2011.
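A minimal sketch of the weighting approach described in the first
bullet above follows. All function names and figures are hypothetical
and are not drawn from the agencies' actual spending or ratings.

def weighted_cost_rating(dev_rating: float, steady_rating: float,
                         dev_spending: float, steady_spending: float) -> float:
    """Weight each phase's rating by its share of total spending."""
    total = dev_spending + steady_spending
    return (dev_rating * dev_spending + steady_rating * steady_spending) / total

if __name__ == "__main__":
    # e.g., 40 percent of spending in development, 60 percent in steady state
    print(weighted_cost_rating(6.0, 9.0, dev_spending=40.0, steady_spending=60.0))
    # prints 7.8, which would then be compared with the Dashboard's cost rating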
We were not able to assess the cost or schedule performance of 2
selected investments, Education's Integrated Partner Management
investment and National Student Loan Data System investment. During
the course of our review, we determined that the department did not
establish a validated performance baseline for the Integrated Partner
Management investment until March 2011. Therefore, the underlying cost
and schedule performance data for the time frame we analyzed were not
sufficiently reliable. We also determined during our review that the
department recently rescoped development work on the National Student
Loan Data System investment and did not have current, representative
performance data available.
Further, we interviewed officials from OMB and the selected agencies
to obtain additional information on agencies' efforts to ensure the
accuracy of the data used to rate investment performance on the
Dashboard. We used the information provided by agency officials to
identify the factors contributing to inaccurate cost and schedule
performance ratings on the Dashboard.
We conducted this performance audit from February 2011 to November
2011 at the selected agencies' offices in the Washington, D.C.,
metropolitan area. Our work was done in accordance with generally
accepted government auditing standards. Those standards require that
we plan and perform the audit to obtain sufficient, appropriate
evidence to provide a reasonable basis for our findings and
conclusions based on our audit objective. We believe that the evidence
obtained provides a reasonable basis for our findings and conclusions
based on our audit objective.
[End of section]
Appendix II: Selected Investment Descriptions:
Below are descriptions of each of the selected investments that are
included in this review.
Department of Commerce:
Advanced Weather Interactive Processing System:
The Advanced Weather Interactive Processing System is used to ingest,
analyze, forecast, and disseminate operational weather data.
Enhancements currently being implemented to the system are intended to
improve the system's infrastructure and position the National Weather
Service to meet future requirements.
Geostationary Operational Environmental Satellite--Series R Ground
Segment:
The Geostationary Operational Environmental Satellite--Series R Ground
Segment includes the development of key systems needed for the on-
orbit operation of the next generation of geostationary operational
environmental satellites, receipt and processing of information, and
distribution of satellite data products to users.
Department of Education:
Integrated Partner Management:
The Integrated Partner Management investment is to replace five legacy
applications and provide, in one solution, improved eligibility,
enrollment, and oversight processes for schools, lenders, federal and
state agencies, and other entities that administer financial aid to
help students pay for higher education.
National Student Loan Data System:
The National Student Loan Data System includes continued operations
and maintenance of an application that manages the integration of data
regarding student aid applicants and recipients. The investment also
includes a development portion that is intended to ensure that
reporting and data collection processes are in place to efficiently
determine partner eligibility to participate in higher education
financial aid programs, and ensure only eligible students receive
loans, grants, or work study awards.
Department of the Interior:
Financial and Business Management System:
The Financial and Business Management System is an enterprisewide
system that is intended to replace most of the department's
administrative systems, including budget, acquisitions, financial
assistance, core finance, personal and real property, and enterprise
management information systems.
Land Satellites Data System:
The Land Satellites Data System investment includes the continued
operation of Landsat satellites and the IT-related costs for the
ground system that captures, archives, processes, and distributes data
from land-imaging satellites. The development efforts under way are
intended to enable the U.S. Geological Survey to continue to capture,
archive, process, and deliver images of the earth's surface to
customers.
Department of State:
Global Foreign Affairs Compensation System:
The Global Foreign Affairs Compensation System is intended to enable
the department to replace six obsolete legacy systems with a single
system better suited to support the constant change of taxation and
benefits requirements in more than 180 countries, and to help the
department make accurate and timely payments to its diverse workforce
and retired Foreign Service officers.
Integrated Logistics Management System:
The Integrated Logistics Management System is the department's
enterprisewide supply chain management system. It is intended to be
the backbone of the department's logistics infrastructure and provide
for requisition, procurement, distribution, transportation, receipt,
asset management, mail, diplomatic pouch, and tracking of goods and
services both domestically and overseas.
General Services Administration:
Regional Business Application:
The Regional Business Application includes three systems that are
intended to provide a means to transition from a semi-automated to an
integrated acquisition process, and provide tools to expedite the
processing of customer funding documents and vendor invoices.
System for Tracking and Administering Real Property/Realty Services:
The System for Tracking and Administering Real Property/Realty
Services investment includes continued operations of a transaction
processor that supports space management, revenue generation, and
budgeting. The investment also includes development of a new system
that is intended to simplify user administration and reporting, and
improve overall security.
Table 3 provides additional details for each of the selected
investments in our review.
Table 3: Investment Management Details:
Agency: Commerce;
Bureau: National Oceanic and Atmospheric Administration;
Investment name: Advanced Weather Interactive Processing System;
Investment start date: 10/1/2001;
Investment end date: 9/30/2017;
Prime contractor/developer: Raytheon.
Agency: Commerce;
Bureau: National Oceanic and Atmospheric Administration;
Investment name: Geostationary Operational Environmental Satellite--
Series R Ground Segment;
Investment start date: 10/1/2006;
Investment end date: 9/30/2028;
Prime contractor/developer: Harris Corporation.
Agency: Education;
Bureau: Office of Federal Student Aid;
Investment name: Integrated Partner Management;
Investment start date: 9/30/2003;
Investment end date: 11/15/2018;
Prime contractor/developer: Digital Management, Inc.
Agency: Education;
Bureau: Office of Federal Student Aid;
Investment name: National Student Loan Data System;
Investment start date: 10/1/2001;
Investment end date: 7/13/2016;
Prime contractor/developer: Briefcase Systems.
Agency: Interior;
Bureau: Agencywide;
Investment name: Financial and Business Management System;
Investment start date: 10/1/2003;
Investment end date: 9/30/2030;
Prime contractor/developer: IBM.
Agency: Interior;
Bureau: U.S. Geological Survey;
Investment name: Land Satellites Data System;
Investment start date: 10/1/2010;
Investment end date: 9/30/2019;
Prime contractor/developer: SGT.
Agency: GSA;
Bureau: Supply and Technology Activities;
Investment name: Regional Business Application;
Investment start date: 10/1/2000;
Investment end date: 9/30/2013;
Prime contractor/developer: Tech Flow, Inc.
Agency: GSA;
Bureau: Real Property Activities;
Investment name: System for Tracking and Administering Real
Property/Realty Services;
Investment start date: 10/1/2002;
Investment end date: 9/30/2016;
Prime contractor/developer: QinetiQ North America.
Agency: State;
Bureau: Agencywide;
Investment name: Global Foreign Affairs Compensation System;
Investment start date: 10/1/2003;
Investment end date: 9/30/2015;
Prime contractor/developer: STG.
Agency: State;
Bureau: Agencywide;
Investment name: Integrated Logistics Management System;
Investment start date: 1/1/1998;
Investment end date: 9/30/2016;
Prime contractor/developer: Accenture.
Source: OMB's Dashboard and data from program officials.
[End of table]
[End of section]
Appendix III: Comments from the Department of Commerce:
United States Department of Commerce:
The Secretary of Commerce:
Washington DC 20230:
October 14, 2011:
Mr. David A. Powner:
Director, Information Technology Management:
U.S. Government Accountability Office:
Washington, DC 20548:
Dear Mr. Powner:
Thank you for the opportunity to comment on the draft report from the
U.S. Government Accountability Office (GAO) entitled IT Dashboard:
Accuracy Has Improved, and Additional Efforts Are Under Way to Better
Inform Decision Making (GAO-12-28). [Now GAO-12-210]
We appreciate that GAO concurs that the cost and schedule performance
ratings for the Department of Commerce's Advanced Weather Interactive
Processing System and Geostationary Operational Environmental
Satellite--Series R Ground Segment are reported accurately in the
Office of Management and Budget's Information Technology Dashboard. We
recognize that GAO made no recommendations that are directed to the
Department of Commerce and concur with GAO's findings.
If you have questions regarding the Department of Commerce's response,
please contact Lisa Westerback in the Office of the Chief Information
Officer at (202) 482-0694.
Sincerely,
Signed by:
Acting Secretary Rebecca M. Blank:
[End of section]
Appendix IV: Comments from the General Services Administration:
GSA:
The Administrator:
U.S. General Services Administration:
1275 First Street, NE:
Washington, DC 20417:
Telephone: (202) 501-0800:
Fax: (202) 219-1243:
October 19, 2011:
The Honorable Gene L. Dodaro:
Comptroller General of the United States:
U.S. Government Accountability Office:
Washington, DC 20548:
Dear Mr. Dodaro:
The U.S. General Services Administration (GSA) appreciates the
opportunity to review and comment on the draft report, "IT Dashboard:
Accuracy Has Improved, and Additional Efforts Are Under Way to Better
Inform Decision Making" (GAO-12-28). [Now GAO-12-210]
The U.S. Government Accountability Office recommends that the GSA
Administrator comply with the Office of Management and Budget's
guidance on updating the Chief Information Officer rating as soon as
new information becomes available that affects the assessment of a
given investment, including when an investment is in the process of a
rebaseline.
We agree with the finding and recommendation and will take appropriate
action. If you have any questions or concerns, please do not hesitate
to contact me. Staff inquiries may be directed to Mr. Rodney P. Emery,
Associate Administrator for Congressional and Intergovernmental
Affairs. He can be reached at (202) 501-0563.
Sincerely,
Signed by:
Martha Johnson:
Administrator:
cc: Mr. David A. Powner, Director, Information Technology Management
Issues, U.S. Government Accountability Office:
[End of section]
Appendix V: GAO Contact and Staff Acknowledgments:
GAO Contact:
David A. Powner at (202) 512-9286 or pownerd@gao.gov:
Staff Acknowledgments:
In addition to the contact named above, the following staff also made
key contributions to this report: Carol Cha, Assistant Director; Emily
Longcore; Lee McCracken; Karl Seifert; and Kevin Walsh.
[End of section]
Footnotes:
[1] GAO, Information Technology: Continued Attention Needed to
Accurately Report Federal Spending and Improve Management, [hyperlink,
http://www.gao.gov/products/GAO-11-831T] (Washington, D.C.: July 14,
2011); Information Technology: Continued Improvements in Investment
Oversight and Management Can Yield Billions in Savings, [hyperlink,
http://www.gao.gov/products/GAO-11-511T] (Washington, D.C.: Apr. 12,
2011); Information Technology: OMB Has Made Improvements to Its
Dashboard, but Further Work Is Needed by Agencies and OMB to Ensure
Data Accuracy, [hyperlink, http://www.gao.gov/products/GAO-11-262]
(Washington, D.C.: Mar. 15, 2011); Information Technology: OMB's
Dashboard Has Increased Transparency and Oversight, but Improvements
Needed, [hyperlink, http://www.gao.gov/products/GAO-10-701]
(Washington, D.C.: July 16, 2010); Information Technology: Management
and Oversight of Projects Totaling Billions of Dollars Need Attention,
[hyperlink, http://www.gao.gov/products/GAO-09-624T] (Washington,
D.C.: Apr. 28, 2009); and Information Technology: OMB and Agencies
Need to Improve Planning, Management, and Oversight of Projects
Totaling Billions of Dollars, [hyperlink,
http://www.gao.gov/products/GAO-08-1051T] (Washington, D.C.: July 31,
2008).
[2] "Major IT investment" means a system or an acquisition requiring
special management attention because it has significant importance to
the mission or function of the agency, a component of the agency, or
another organization; is for financial management and obligates more
than $500,000 annually; has significant program or policy
implications; has high executive visibility; has high development,
operating, or maintenance costs; is funded through other than direct
appropriations; or is defined as major by the agency's capital
planning and investment control process.
[3] GAO-10-701. The five departments included in this review were the
Departments of Agriculture, Defense, Energy, Health and Human
Services, and Justice.
[4] GAO-11-262. The five agencies included in this review were the
Departments of Homeland Security, Transportation, the Treasury, and
Veterans Affairs, as well as the Social Security Administration.
[5] Initially, we had also selected two investments from the
Department of Education; however, these investments were subsequently
dropped, as detailed in appendix I. The remaining eight investments
were Commerce's Advanced Weather Interactive Processing System and
Geostationary Operational Environmental Satellite--Series R Ground
Segment investment, Interior's Financial and Business Management
System and Land Satellites Data System investment, State's Global
Foreign Affairs Compensation System and Integrated Logistics
Management System, and GSA's Regional Business Application and System
for Tracking and Administering Real Property/Realty Services. See
appendix II for descriptions of each investment.
[6] 40 U.S.C. § 11302(c).
[7] 44 U.S.C. § 3506(h)(5).
[8] 44 U.S.C. § 3504(h)(1).
[9] 44 U.S.C. § 3606. Generally speaking, e-government refers to the
use of IT, particularly web-based Internet applications, to enhance
the access to and delivery of government information and services to
the public and among agencies at all levels of government.
[10] [hyperlink, http://www.gao.gov/products/GAO-09-624T]; GAO,
Information Technology: Treasury Needs to Better Define and Implement
Its Earned Value Management Policy, [hyperlink,
http://www.gao.gov/products/GAO-08-951] (Washington, D.C.: Sept. 22,
2008); and Air Traffic Control: FAA Uses Earned Value Techniques to
Help Manage Information Technology Acquisitions, but Needs to Clarify
Policy and Strengthen Oversight, [hyperlink,
http://www.gao.gov/products/GAO-08-756] (Washington, D.C.: July 18,
2008).
[11] GAO, Information Technology: OMB Can Make More Effective Use of
Its Investment Reviews, [hyperlink,
http://www.gao.gov/products/GAO-05-276] (Washington, D.C.: Apr. 15,
2005).
[12] GAO, Information Technology: Agencies and OMB Should Strengthen
Processes for Identifying and Overseeing High Risk Projects,
[hyperlink, http://www.gao.gov/products/GAO-06-647] (Washington, D.C.:
June 15, 2006).
[13] Exhibit 53s list all of the IT investments and their associated
costs within a federal organization. An exhibit 300 is also called the
Capital Asset Plan and Business Case. It is used to justify resource
requests for major IT investments and is intended to enable an agency
to demonstrate to its own management, as well as to OMB, that a major
investment is well planned.
[14] [hyperlink, http://www.gao.gov/products/GAO-10-701].
[15] [hyperlink, http://www.gao.gov/products/GAO-11-262].
[16] Of the 10 selected investments, we were unable to assess the
performance of the 2 investments from Education: Integrated Partner
Management and National Student Loan Data System. In the first case,
the department had not yet established a validated baseline against
which to measure performance. In the second case, the department had
recently rescoped planned development work and did not have current,
representative performance data available. See appendix I for details.
[17] OMB, Memorandum for Chief Information Officers: Information
Technology Investment Baseline Management Policy, M-10-27 (Washington,
D.C.: June 28, 2010).
[18] GAO, Investment Management: IRS Has a Strong Oversight Process
but Needs to Improve How It Continues Funding Ongoing Investments,
[hyperlink, http://www.gao.gov/products/GAO-11-587] (Washington, D.C.:
July 20, 2011); GAO Cost Estimating and Assessment Guide: Best
Practices for Developing and Managing Capital Program Costs,
[hyperlink, http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.:
March 2009); and Information Technology: Treasury Needs to Strengthen
Its Investment Board Operations and Oversight, [hyperlink,
http://www.gao.gov/products/GAO-07-865] (Washington, D.C.: July 23,
2007).
[19] [hyperlink, http://www.gao.gov/products/GAO-11-262].
[20] OMB, FY13 Guidance for Exhibit 300a-b (July 2011).
[21] GAO-10-701 and GAO-11-262. The agencies reviewed in GAO-10-701
were the Departments of Agriculture, Defense, Energy, Health and Human
Services, and Justice. The agencies reviewed in GAO-11-262 were the
Departments of Homeland Security, Transportation, the Treasury, and
Veterans Affairs, and the Social Security Administration.
[22] For the Department of Commerce, we excluded two of its top
investments because one had been recently completed and the other had
significant funding uncertainty as a result of a continuing resolution.
[23] We were unable to assess the cost or schedule performance of the
two selected Education investments, as discussed later.
[24] Earned value management is a technique that integrates the
technical, cost, and schedule parameters of a development contract and
measures progress against them.
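For illustration only (this sketch is not part of the GAO report), the standard earned value management variances and performance indices can be computed from a contract's planned value, earned value, and actual cost. The figures and variable names below are hypothetical.

    # Illustrative earned value management (EVM) calculation in Python; the
    # figures and names are hypothetical and not drawn from this report.
    def evm_metrics(planned_value, earned_value, actual_cost):
        """Return the standard EVM variances and performance indices."""
        cost_variance = earned_value - actual_cost        # negative: over cost
        schedule_variance = earned_value - planned_value  # negative: behind schedule
        cost_performance_index = earned_value / actual_cost
        schedule_performance_index = earned_value / planned_value
        return (cost_variance, schedule_variance,
                cost_performance_index, schedule_performance_index)

    # Example: $10.0 million of work planned, $9.0 million of work earned,
    # and $9.5 million actually spent to date (all hypothetical).
    cv, sv, cpi, spi = evm_metrics(10.0, 9.0, 9.5)
    print(f"CV = {cv:+.1f}M, SV = {sv:+.1f}M, CPI = {cpi:.2f}, SPI = {spi:.2f}")
    # Indices below 1.0 indicate cost overruns or schedule slippage.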
[End of section]
GAO's Mission:
The Government Accountability Office, the audit, evaluation, and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the
performance and accountability of the federal government for the
American people. GAO examines the use of public funds; evaluates
federal programs and policies; and provides analyses, recommendations,
and other assistance to help Congress make informed oversight, policy,
and funding decisions. GAO's commitment to good government is
reflected in its core values of accountability, integrity, and
reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO's website [hyperlink, http://www.gao.gov]. Each
weekday afternoon, GAO posts on its website newly released reports,
testimony, and correspondence. To have GAO e-mail you a list of newly
posted products, go to [hyperlink, http://www.gao.gov] and select "E-mail
Updates."
Order by Phone:
The price of each GAO publication reflects GAO's actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black
and white. Pricing and ordering information is posted on GAO's
website, [hyperlink, http://www.gao.gov/ordering.htm].
Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional
information.
Connect with GAO:
Connect with GAO on Facebook, Flickr, Twitter, and YouTube.
Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts.
Visit GAO on the web at www.gao.gov.
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm];
E-mail: fraudnet@gao.gov;
Automated answering system: (800) 424-5454 or (202) 512-7470.
Congressional Relations:
Ralph Dawn, Managing Director, dawnr@gao.gov, (202) 512-4400
U.S. Government Accountability Office, 441 G Street NW, Room 7125
Washington, DC 20548.
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149
Washington, DC 20548.