GAO-05-276, Information Technology: OMB Can Make More Effective Use of Its Investment Reviews
This is the accessible text file for GAO report number GAO-05-276
entitled 'Information Technology: OMB Can Make More Effective Use of
Its Investment Reviews' which was released on April 15, 2005.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to Congressional Requesters:
April 2005:
Information Technology:
OMB Can Make More Effective Use of Its Investment Reviews:
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-276]:
GAO Highlights:
Highlights of GAO-05-276, a report to congressional requesters.
Why GAO Did This Study:
For the President's Budget for Fiscal Year 2005, the Office of
Management and Budget (OMB) stated that of the nearly 1,200 major
information technology (IT) projects in the budget, it had placed
approximately half--621 projects, representing about $22 billion--on a
Management Watch List, composed of mission-critical projects with
identified weaknesses. GAO was asked to describe and assess OMB's
processes for (1) placing projects on its Management Watch List and (2)
following up on corrective actions established for projects on the list.
What GAO Found:
For the fiscal year 2005 budget, OMB developed processes and criteria
for including IT investments on its Management Watch List. In doing so,
it identified opportunities to strengthen investments and promote
improvements in IT management. However, it did not develop a single,
aggregate list identifying the projects and their weaknesses. Instead,
OMB officials told GAO that to identify IT projects with weaknesses,
individual OMB analysts used scoring criteria that the office
established for evaluating the justifications for funding that federal
agencies submit for major projects. These analysts, each of whom is
typically responsible for several federal agencies, were then
responsible for maintaining information on these projects. To derive
the total number of projects on the list that OMB reported for fiscal
year 2005, OMB polled its individual analysts and compiled the result.
However, OMB officials told GAO that they did not compile a list that
identified the specific projects and their identified weaknesses. The
officials added that they did not construct a single list because they
did not see such an activity as necessary. Thus, OMB has not fully
exploited the opportunity to use the list as a tool for analyzing IT
investments on a governmentwide basis.
OMB had not developed a structured, consistent process for deciding how
to follow up on corrective actions that its individual analysts asked
agencies to take to address weaknesses associated with projects on its
Management Watch List. According to OMB officials, decisions on follow-
up and monitoring of progress were typically made by the staff with
responsibility for reviewing individual agency budget submissions,
depending on the staff's insights into agency operations and
objectives. Because it did not consistently require or monitor follow-
up activities, OMB did not know whether the project risks that it
identified through its Management Watch List were being managed
effectively, potentially leaving resources at risk of being committed
to poorly planned and managed projects. In addition, because it did not
consistently monitor the follow-up performed on projects on the
Management Watch List, OMB could not readily tell GAO which of the 621
projects received follow-up attention. Thus, OMB was not using its
Management Watch List as a tool in setting priorities for improving IT
investments on a governmentwide basis and focusing attention where it
was most needed.
What GAO Recommends:
To enable OMB to take advantage of potential benefits of using its
Management Watch List as a tool for analyzing, setting priorities, and
following up on IT projects, GAO is making recommendations to OMB aimed
at more effective development and use of its Management Watch List.
In commenting on a draft of this report, OMB did not agree that an
aggregated list, as recommended by GAO, is necessary for adequate
oversight and management, because it uses other information and
processes for this purpose. However, GAO continues to believe that an
aggregated list would contribute to OMB's ability to analyze IT
investments governmentwide and track progress in addressing
deficiencies.
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-276]:
To view the full product, including the scope and methodology, click on
the link above. For more information, contact David Powner at (202) 512-
9286 or [Hyperlink, pownerd@gao.gov.]:
Contents:
Letter:
Results in Brief:
Background:
Objectives, Scope, and Methodology:
OMB Established Processes and Criteria for Identifying Weak Projects,
but It Did Not Use an Aggregate List to Perform Its Analysis or
Oversight:
OMB's Follow-up on Projects Was Inconsistent, and Follow-up Activities
Were Not Tracked Centrally:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix:
Appendix I: Comments from the Office of Management and Budget:
Abbreviations:
IT: information technology:
OIRA: Office of Information and Regulatory Affairs:
OMB: Office of Management and Budget:
RMO: Resource Management Office:
Letter April 15, 2005:
The Honorable Tom Davis:
Chairman:
Committee on Government Reform:
House of Representatives:
The Honorable Adam H. Putnam:
House of Representatives:
The President's Budget for Fiscal Year 2005 identified approximately
$60 billion for information technology (IT) projects. In that budget,
the Office of Management and Budget (OMB) stated that, of approximately
1,200 major IT projects, about half--621 projects, representing about
$22 billion--were on a "Management Watch List." This information was
reiterated in testimony in March 2004,[Footnote 1] during which OMB
officials stated that the list consisted of mission-critical projects
that needed to improve performance measures, project management, and IT
security. OMB identified weaknesses in these three areas, among others,
in its analysis of the business cases that agencies submitted to
justify project funding. The officials added that the fiscal year 2005
budget process required agencies to successfully correct project
weaknesses and business case deficiencies; otherwise, OMB would limit
agencies' spending on new starts and other developmental activities.
This report responds to your request that we describe and assess OMB's
processes for (1) placing projects on its Management Watch List and (2)
following up on corrective actions established for projects on the
list. To accomplish these objectives, we reviewed and analyzed OMB's
policy and budget guidance for fiscal year 2005 and interviewed OMB
officials (further details on our objectives, scope, and methodology
are provided following the background section).
Results in Brief:
For the fiscal year 2005 budget, OMB developed processes and criteria
for including IT projects (investments) on its Management Watch List.
In doing so, it identified opportunities to strengthen investments and
promote improvements in IT management. However, OMB did not develop a
single, aggregate list identifying the projects and their weaknesses.
Instead, OMB officials told us that individual OMB analysts used
scoring criteria established in the office's Circular A-11 for
evaluating the justifications for funding (known as exhibit 300s) that
are submitted by federal agencies. OMB assigned individual analysts on
its staff, each of whom is typically responsible for several federal
agencies, to maintain information on the listed IT projects for their
respective agencies. To derive the total of 621 projects on the list
that OMB reported for fiscal year 2005, OMB polled its individual
analysts and compiled the numbers. OMB officials told us that they did
not construct a single
list of projects meeting their watch list criteria because they did not
see such an activity as necessary for performing OMB's predominant
mission: to assist in overseeing the preparation of the federal budget
and to supervise agency budget administration. Thus, OMB did not
exploit the opportunity to use the list as a tool for analyzing IT
investments on a governmentwide basis, limiting its ability to identify
and report on the full set of IT investments requiring corrective
actions.
OMB did not develop a structured, consistent process for deciding how
to follow up on corrective actions that it asked agencies to take to
address weaknesses associated with projects on the Management Watch
List. According to OMB officials, decisions on follow-up and monitoring
of specific projects were typically made by the OMB staff with
responsibility for reviewing individual agency budget submissions,
depending on the staff's insights into agency operations and
objectives. Because it did not consistently monitor the follow-up
performed, OMB could not tell us which of the 621 projects received
follow-up attention, and it did not know whether the specific project
risks that it identified through its Management Watch List were being
managed effectively. This approach could leave resources at risk of
being committed to poorly planned and managed projects. Thus, OMB was
not using its Management Watch List as a tool for improving IT
investments on a governmentwide basis and focusing attention where it
was most needed.
To enable OMB to take advantage of the potential benefits of using the
Management Watch List as a tool for analyzing and following up on IT
investments, we are recommending that OMB develop a centralized
capability for creating and monitoring its Management Watch List,
including developing and using criteria for prioritizing the IT
projects on the list and appropriate follow-up activities, and that it
use the prioritized list for reporting to the Congress as part of its
statutory reporting responsibilities.
In commenting on a draft of this report, OMB's Administrator of the
Office of E-Government and Information Technology expressed
appreciation for our review of OMB's use of its Management Watch List.
However, the Administrator disagreed with our assessment that an
aggregated governmentwide list is necessary to perform adequate
oversight and management, and that OMB does not know whether risks are
being addressed. According to the Administrator, OMB has more than
adequate knowledge of agency project planning and uses other means to
assess project performance. Nonetheless, based on OMB's inability to
easily report which of the 621 investments on the Management Watch List
remained deficient or how much of the $22 billion cited in the
President's Budget remained at risk, we continue to believe that an
aggregate list would facilitate OMB's ability to track progress.
Background:
According to OMB, its predominant mission is to assist the President in
overseeing the preparation of the federal budget and to supervise
budget administration in executive branch agencies. In helping to
formulate the President's spending plans, OMB is responsible for
evaluating the effectiveness of agency programs, policies, and
procedures; assessing competing funding demands among agencies; and
setting funding priorities. OMB also is to ensure that agency reports,
rules, testimony, and proposed legislation are consistent with the
President's budget and with administration policies.
In addition, OMB is responsible for overseeing and coordinating the
administration's procurement, financial management, information, and
regulatory policies. In each of these areas, OMB's role is to help
improve administrative management, to develop better performance
measures and coordinating mechanisms, and to reduce unnecessary burden
on the public.
To drive improvement in the implementation and management of IT
projects, the Congress enacted the Clinger-Cohen Act in 1996 to further
expand the responsibilities of OMB and the agencies under the Paperwork
Reduction Act.[Footnote 2] The act requires that agencies engage in
capital planning and performance- and results-based management. OMB is
required by the Clinger-Cohen Act to establish processes to analyze,
track, and evaluate the risks and results of major capital investments
in information systems made by executive agencies. OMB is also required
to report to the Congress on the net program performance benefits
achieved as a result of major capital investments in information
systems that are made by executive agencies.[Footnote 3]
In response to the Clinger-Cohen Act and other statutes, OMB developed
section 300 of Circular A-11. This section provides policy for
planning, budgeting, acquisition, and management of federal capital
assets and instructs agencies on budget justification and reporting
requirements for major IT investments.[Footnote 4] Section 300 defines
the budget exhibit 300, also called the Capital Asset Plan and Business
Case, as a document that agencies submit to OMB to justify resource
requests for major IT investments. The exhibit 300 consists of two
parts: the first is required of all assets; the second applies only to
information technology. Among other things, the exhibit 300 requires
agencies to provide information summarizing spending and funding plans;
performance goals and measures; project management plans, goals, and
progress; and security plans and progress. This reporting mechanism, as
part of the budget formulation and review process, is intended to
enable an agency to demonstrate to its own management, as well as OMB,
that it has employed the disciplines of good project management,
developed a strong business case for the investment, and met other
Administration priorities in defining the cost, schedule, and
performance goals proposed for the investment. The types of information
included in the exhibit 300, among other things, are to help OMB and
the agencies identify and correct poorly planned or performing
investments (i.e., investments that are behind schedule, over budget,
or not delivering expected results) and real or potential systemic
weaknesses in federal information resource management (e.g., project
manager qualifications).
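To make the preceding description concrete, the following sketch shows
one way the kinds of information an exhibit 300 summarizes could be
represented as a simple record. It is illustrative only: the Python
class and field names below are hypothetical and do not reproduce the
actual exhibit 300 format.

# Hypothetical, simplified record of the kinds of information an
# exhibit 300 (Capital Asset Plan and Business Case) summarizes, per
# the description above. Field names are illustrative, not OMB's schema.
from dataclasses import dataclass, field

@dataclass
class Exhibit300Summary:
    agency: str
    investment_name: str
    spending_and_funding_plans: dict = field(default_factory=dict)
    performance_goals_and_measures: list = field(default_factory=list)
    project_management_plan: str = ""
    security_plan_and_progress: str = ""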
According to OMB's description of its processes, agencies' exhibit 300
business cases are reviewed by OMB analysts from its four statutory
offices--Offices of E-Government and Information Technology (e-Gov),
Information and Regulatory Affairs (OIRA), Federal Financial
Management, and Federal Procurement Policy--and its Resource Management
Offices (RMO). In addition to other responsibilities under various
statutes, e-Gov and OIRA develop and oversee the implementation of
governmentwide policies in the areas of IT, information policy,
privacy, and statistical policy. OIRA and e-Gov analysts also carry out
economic and related analyses, including reviewing exhibit 300s. Each
of about 12 analysts is responsible for overseeing IT projects for a
specific agency or (more commonly) several agencies.
OMB's RMOs are staffed with program examiners, whose responsibility is
to develop and support the President's Budget and Management Agenda.
RMOs work as liaisons between federal agencies and the presidency. In
formulating the budget, they evaluate agency requests for funding and
evaluate agency management and financial practices. RMOs also evaluate
and make recommendations to the President when agencies seek new
legislation or the issuance of Presidential executive orders that would
help agencies to fulfill their organizational objectives.
According to OMB officials, the OIRA and e-Gov analysts, along with RMO
program examiners, evaluate agency exhibit 300 business cases as part
of the development of the President's Budget. The results of this
review are provided to agencies through what is called the "passback"
process. That is, OMB passes the requests back to agencies with its
evaluation, which identifies any areas requiring remediation.
The final step in the budget process, occurring after the Congress has
appropriated funds, is apportionment, through which OMB formally
controls agency spending. According to the Antideficiency Act, before
the agency may spend its funding resources, appropriations must be
apportioned by periods within the fiscal year (typically by quarters)
or among the projects to be undertaken.[Footnote 5] Although
apportionment is a procedure required to allow agencies to access their
appropriated funds, OMB can also use apportionment to impose conditions
on agency spending, such as changes in agency practices; it is one of
several mechanisms that the Clinger-Cohen Act authorizes OMB to use to
enforce an agency head's accountability for the agency's IT
investments.[Footnote 6]
The President's Budget for Fiscal Year 2005 included about 1,200 IT
projects, totaling about $60 billion. Of this total number of projects,
OMB reported in the budget that slightly over half--621 projects,
representing about $22 billion--were on a Management Watch List.
According to OMB's March 2004 testimony, this list consisted of mission-
critical projects that needed to improve performance measures, project
management, IT security, or overall justification. OMB officials
described this assessment as based on evaluations of exhibit 300s
submitted to justify inclusion in the budget. According to OMB's
testimony, the fiscal year 2005 budget required agencies to
successfully correct identified project weaknesses and business case
deficiencies; otherwise, they risked OMB placing limits on their
spending. OMB officials testified in March 2004 that they would enforce
these corrective actions through the apportionment process.
OMB continued its use of a Management Watch List in the recently
released President's Budget for Fiscal Year 2006. The President's
Budget for Fiscal Year 2006 includes 1,087 IT projects, totaling about
$65 billion. Of this total number of projects, OMB reported in the
budget that 342 projects, representing about $15 billion, are on the
fiscal year 2006 Management Watch List.
Objectives, Scope, and Methodology:
Our objectives were to describe and assess OMB's processes for (1)
placing projects on its Management Watch List and (2) following up on
corrective actions established for projects on the list.
To examine OMB's processes for developing the list, we requested a copy
of the Management Watch List; we reviewed related OMB policy guidance,
including its Circular A-11 and Capital Programming Guide, as well as
the Analytical Perspectives for the President's Budget submissions for
fiscal years 2005 and 2006; and we interviewed OMB analysts and their
managers, including the Deputy Administrator of OIRA and the Chief of
the Information Technology and Policy Branch, to identify the processes
and criteria they have in place to determine which IT projects to
include on the Management Watch List.
To examine OMB's follow-up procedures on corrective actions established
for IT projects on the list, we reviewed related policy guidance,
including section 300 of Circular A-11 and OMB's Capital Programming
Guide. We analyzed OMB's apportionment documentation, specifically the
Standard Form 132 (Apportionment and Reapportionment Schedule), which
documented special apportionments that specified conditions that had to
be met before the agencies could receive funds. In addition, we
interviewed OMB officials and analysts and reviewed testimony and laws
affecting the management of IT investments, such as the Clinger-Cohen
Act.
We conducted our work at OMB headquarters in Washington, D.C., from
August 2004 through March 2005, in accordance with generally accepted
government auditing standards.
OMB Established Processes and Criteria for Identifying Weak Projects,
but It Did Not Use an Aggregate List to Perform Its Analysis or
Oversight:
According to OMB officials, including the Deputy Administrator of OIRA
and the Chief of the Information Technology and Policy Branch, OMB
staff identified projects for the Management Watch List through their
evaluation of the exhibit 300s that agencies submit for major IT
projects as part of the budget development process. This evaluation is
carried out as part of OMB's responsibility for helping to ensure that
investments of public resources are justified and that public resources
are wisely invested.
The OMB officials added that their analysts evaluate agency exhibit
300s by assigning scores to each exhibit 300 based on guidance
presented in OMB Circular A-11.[Footnote 7] According to this circular,
the purpose of the scoring is to ensure that agency planning and
management of capital assets are consistent with OMB policy and
guidance.
As described in Circular A-11, the scoring of a business case consists
of individual scoring for 10 categories, as well as a total composite
score of all the categories. The 10 categories are:
* acquisition strategy,
* project (investment) management,
* enterprise architecture,
* alternatives analysis,
* risk management,
* performance goals,
* security and privacy,
* performance-based management system (including the earned value
management system[Footnote 8]),
* life-cycle costs formulation, and:
* support of the President's Management Agenda.
According to Circular A-11, scores range from 1 to 5, with 5 indicating
investments whose business cases provided the best justification and 1
the least. For investments with average scores of 3 or below, OMB may
ask agencies for remediation plans to address weaknesses in their
business cases.
OMB officials said that, for fiscal year 2005, an IT project was placed
on the Management Watch List if its exhibit 300 business case received
a total composite score of 3 or less, or if it received a score of 3 or
less in the areas of performance goals, performance-based management
systems, or security and privacy, even if its overall score was a 4 or
5. OMB reported that agencies with weaknesses in these three areas were
to submit remediation plans addressing the weaknesses.
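To illustrate the placement rule just described, the sketch below
expresses it as a small Python function. The function and data
structure are hypothetical (OMB's scoring was performed by analysts,
not software), and the composite score is treated here as a simple
average of the ten category scores for illustration.

# Illustrative sketch of the fiscal year 2005 Management Watch List
# placement rule described above. Names and structures are hypothetical.

CRITICAL_CATEGORIES = (
    "performance goals",
    "performance-based management system",
    "security and privacy",
)

def on_watch_list(category_scores):
    """Return True if a business case meets the watch list criteria.

    category_scores: dict mapping each of the ten Circular A-11 scoring
    categories to a score from 1 (weakest) to 5 (strongest).
    """
    composite = sum(category_scores.values()) / len(category_scores)
    if composite <= 3:  # weak overall justification
        return True
    # Even a business case scoring 4 or 5 overall is placed on the list
    # if it scores 3 or less in any of the three critical categories.
    return any(category_scores[c] <= 3 for c in CRITICAL_CATEGORIES)

Under these criteria, for example, a business case with a strong
overall score but a score of 3 or less on security and privacy would
still be placed on the list.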
According to OMB management, individual analysts were responsible for
evaluating projects and determining which projects met the criteria to
be on the Management Watch List for their assigned agencies. To derive
the total number of projects on the list that were reported for fiscal
year 2005, OMB polled the individual analysts and compiled the numbers.
OMB officials said that they did not aggregate these projects into a
single list describing projects and their weaknesses. According to
these officials, they did not construct a single list of projects
meeting their watch list criteria because they did not see such an
activity as necessary in performing OMB's predominant mission: to
assist in overseeing the preparation of the federal budget and to
supervise agency budget administration. Further, OMB officials stated
that the limited number of analysts involved enabled them to explore
governmentwide issues using ad hoc queries and to develop approaches to
address systemic problems without the use of an aggregate list. They
pointed at successes in improving IT management, such as better
compliance with security requirements, as examples of the effectiveness
of their current approach.
Nevertheless, OMB has not fully exploited the opportunity to use its
Management Watch List as a tool for analyzing IT investments on a
governmentwide basis. According to the Clinger-Cohen Act, OMB is
required to establish processes to analyze, track, and evaluate the
risks and results of major IT capital investments by executive
agencies, which aggregation of the Management Watch List would
facilitate. Without aggregation, the list's visibility was limited at
more senior levels of OMB, constraining its ability to conduct analysis
of IT investments on a governmentwide basis and limiting its ability to
identify and report on the full set of IT investments requiring
corrective actions.
OMB's Follow-up on Projects Was Inconsistent, and Follow-up Activities
Were Not Tracked Centrally:
OMB did not develop a structured, consistent process or criteria for
deciding how to follow up on corrective actions that it asked agencies
to take to address weaknesses associated with projects on the
Management Watch List. Instead, OMB officials, including the Deputy
Administrator of OIRA and the Chief of the Information Technology and
Policy Branch, said that the decision on whether and how to follow up
on a specific project was typically made jointly between the OIRA
analyst and the RMO program examiner who had responsibility for the
individual agency, and that follow-up on specific projects was driven
by a number of factors, only one of which was inclusion on the
Management Watch List.
These officials also said that the decision for follow-up was generally
driven by OMB's predominant mission to assist in budget preparation and
to supervise budget administration, rather than strictly by the
perceived risk of individual projects. According to these officials,
those Management Watch List projects that did receive specific follow-
up attention received feedback through the passback process, through
targeted evaluation of remediation plans designed to address
weaknesses, and through the apportioning of funds so that the use of
budgeted dollars was conditional on appropriate remediation plans being
in place.[Footnote 9] These officials also said that follow-up of some
Management Watch List projects was done through quarterly e-Gov
Scorecards.[Footnote 10]
OMB officials also stated that those Management Watch List projects
that did receive follow-up attention were not tracked centrally, but
only by the individual OMB analysts with responsibility for the
specific agencies. For example, if an agency corrected a deficiency or
weakness in a specific area of the exhibit 300 for a Management Watch
List project, that change was not recorded centrally. Accordingly, OMB
could not readily tell us which of the 621 watch list projects for
fiscal year 2005 were followed up on, nor could it use the list to
describe the relationship between its follow-up activities and the
changes in the numbers of projects on the watch list between fiscal
year 2005 (621 projects) and fiscal year 2006 (342). Further, because
OMB did not track follow-up centrally, senior management could not
report which projects received follow-up attention and which did not.
OMB does not have specific criteria for prioritizing follow-up on
Management Watch List projects. Without specific criteria, OMB staff
may have agreed to commit resources to follow up on projects that did
not represent OMB's top priorities from a governmentwide perspective.
For example, inconsistent attention to OMB priorities, such as earned
value management, could undermine the objectives that OMB set in these
areas. In addition, major projects with significant management
deficiencies may have continued to absorb critical agency resources.
In order for OMB management to have assurance that IT program
deficiencies are addressed, it is critical that corrective actions
associated with Management Watch List projects be monitored. Follow-up
activities are instrumental in ensuring that agencies address and
resolve weaknesses found in exhibit 300s, which may indicate underlying
weaknesses in project planning or management. Tracking these follow-up
activities is essential to enabling OMB to determine progress on both
specific projects and governmentwide trends. In addition, tracking is
necessary for OMB to fully execute its responsibilities under the
Clinger-Cohen Act, which requires OMB to establish processes to
analyze, track, and evaluate the risks and results of major capital
investments made by executive agencies for information systems. Without
tracking specific follow-up activities, OMB could not know whether the
risks that it identified through its Management Watch List were being
managed effectively; if they were not, funds were potentially being
spent on poorly planned and managed projects.
Conclusions:
By scoring agency IT budget submissions and identifying weaknesses that
may indicate investments at risk, OMB is identifying opportunities to
strengthen investments. This scoring addresses many critical IT
management areas and promotes the improvement of IT investments.
However, OMB has not developed a single, aggregate list identifying the
projects and their weaknesses, nor has it developed a structured,
consistent process for deciding how to follow up on corrective actions.
Aggregating the results at a governmentwide level would help OMB take
full advantage of the effort that it puts into reviewing business cases
for hundreds of IT projects. A governmentwide perspective could enable
OMB to use its scoring process more effectively to identify management
issues that transcend individual agencies, to prioritize follow-up
actions, and to ensure that high-priority deficiencies are addressed.
OMB's follow-up on poorly planned and managed IT projects has been
largely driven by its focus on the imperatives of the overall budget
process. Although this approach is consistent with OMB's predominant
mission, it does not fully exploit the insights developed through the
scoring process, and it may leave unattended weak projects consuming
significant budget dollars. The Management Watch List described in the
President's Budget for Fiscal Year 2005 contained projects representing
over $20 billion in budgetary resources that could have remained at
risk because of inadequate planning and project management. Because of
the absence of a consistent and integrated approach to follow-up and
tracking, OMB was unable to use the Management Watch List to ascertain
whether progress was made in addressing governmentwide and project-
specific weaknesses and where resources should be applied to encourage
additional progress. Thus, there is an increased risk that remedial
actions were incomplete and that billions of dollars were invested in
IT projects with planning and management deficiencies. In addition,
OMB's ability to report to the Congress on progress made in addressing
critical issues and areas needing continued attention is limited by the
absence of a consolidated list and coordinated follow-up activities.
Recommendations for Executive Action:
In order for OMB to take advantage of the potential benefits of using
the Management Watch List as a tool for analyzing and following up on
IT investments on a governmentwide basis, we are recommending that the
Director of OMB take the following four actions:
* Develop a central list of projects and their deficiencies.
* Use the list as the basis for selecting projects for follow-up and
for tracking follow-up activities; to guide follow-up, develop specific
criteria for prioritizing the IT projects included on the list, taking
into consideration such factors as the relative potential financial and
program benefits of these IT projects, as well as potential risks.
* Analyze the prioritized list to develop governmentwide and agency
assessments of the progress and risks of IT investments, identifying
opportunities for continued improvement.
* Report to the Congress on progress made in addressing risks of major
IT investments and management areas needing attention.
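The sketch below illustrates one way an aggregated, prioritized list of
the kind recommended above could be organized. It is a minimal
illustration with hypothetical names and a hypothetical prioritization
rule; it does not describe any existing OMB system or a required design.

# Illustrative sketch of an aggregated Management Watch List. All names
# and the prioritization rule are hypothetical, not drawn from OMB
# practice.
from dataclasses import dataclass

@dataclass
class WatchListEntry:
    agency: str
    project: str
    budget_request_millions: float
    deficient_categories: list        # e.g., ["security and privacy"]
    follow_up_status: str = "none"    # e.g., "passback", "remediation plan"

def prioritize(entries):
    """Order entries for follow-up, weighting the number of deficient
    categories and the dollars requested (one possible criterion)."""
    return sorted(
        entries,
        key=lambda e: (len(e.deficient_categories),
                       e.budget_request_millions),
        reverse=True,
    )

A list structured along these lines would let OMB report at any time how
many projects remained deficient, how much requested funding they
represented, and which projects had received follow-up attention.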
Agency Comments and Our Evaluation:
In written comments on a draft of this report, OMB's Administrator of
the Office of E-Government and Information Technology expressed
appreciation for our review of OMB's use of its Management Watch List.
She noted that the report was narrowly focused on the Management Watch
List and the use of exhibit 300s in that context. She added that the
report did not address the more broad budget and policy oversight
responsibilities that OMB carries out or the other strategic tools
available to OMB as it executes those responsibilities. We agree that
our review described and assessed OMB's processes for (1) placing the
621 projects representing about $22 billion on its Management Watch
List and (2) following up on corrective actions established for
projects on the list.
The Administrator commented that OMB's oversight activities include the
quarterly President's Management Agenda Scorecard assessment. We
acknowledge these activities in the report in the context of the e-Gov
scorecard, which measures the results of OMB's evaluation of the
agencies' implementation of e-government criteria in the President's
Management Agenda. We also agree with the Administrator that OMB is not
the sole audience of an exhibit 300. As we state in the report, an
exhibit 300 justification is intended to enable an agency to
demonstrate to its own management, as well as to OMB, that it has
employed the disciplines of good project management, developed a strong
business case for the investment, and met other Administration
priorities in defining the cost, schedule, and performance goals
proposed for the investment.
The Administrator disagreed with our finding that OMB did not have
specific criteria for prioritizing follow-up on exhibit 300s that have
been included on the Management Watch List. She explained that OMB
establishes priorities on a case-by-case basis within the larger
context of OMB's overall review of agency program and budget
performance. However, our review showed that OMB did not develop a
structured, consistent process or criteria for deciding how it should
follow up on corrective actions that it asked agencies to take to
address the weaknesses of the projects on the Management Watch List.
Accordingly, we continue to believe that OMB should specifically
consider those factors that it had already determined were critical
enough that they caused an investment to be included in the Management
Watch List. Without consistent attention to those IT management areas
already deemed as being of the highest priority by OMB, the office
risks focusing on areas of lesser importance.
We agree with the Administrator's separate point that agencies have the
responsibility for ensuring that investments on the Management Watch
List are successfully brought up to an acceptable level. The follow-up
that we describe in our report consists of those activities that would
allow OMB to ascertain that the deficient investments have, in fact,
been successfully strengthened. We note in the report that the
quarterly President's Management Agenda Scorecard plays a role in this
activity (in the report, we refer to the e-Gov Scorecard, which
contributes to the Management Agenda Scorecard).
Finally, the Administrator disagrees with our assessment that an
aggregated governmentwide list is necessary to perform adequate
oversight and management, and that OMB does not know whether risks are
being addressed. However, our review indicated that OMB was unable to
easily determine which of the 621 investments on the Management Watch
List remained deficient or how much of the $22 billion cited in the
President's Budget remained at risk. In our assessment we observed that
OMB had expended considerable resources in the scoring of all exhibit
300s and the identification of investments requiring corrective action,
but that it never committed the additional resources that would be
required to aggregate the partial management watch lists held by each
individual analyst. Because no complete Management Watch List was
formed, OMB lost the opportunity to analyze the full set of deficient
investments as a single set of data. This undermined its ability to
assess governmentwide trends and issues. In addition, the lack of a
complete Management Watch List necessarily inhibited OMB's ability to
track progress overall and to represent the full set of investments
requiring corrective action. We continue to believe that these
activities could be facilitated by an aggregate Management Watch List.
As agreed with your offices, unless you publicly announce its contents
earlier, we plan no further distribution of this report until 30 days
from the report date. At that time, we will send copies to other
interested congressional committees and to the Director of the Office
of Management and Budget. We also will make copies available to others
upon request. In addition, the report will be available at no charge on
the GAO Web site at [Hyperlink, http://www.gao.gov.]
Should you or your offices have questions on matters discussed in this
report, please contact me at (202) 512-9286, or Lester P. Diamond,
Assistant Director, at (202) 512-7957. We can also be reached by e-mail
at [Hyperlink, pownerd@gao.gov] or [Hyperlink, diamondl@gao.gov],
respectively. Key contributors to this report were William G. Barrick,
Barbara Collier, Sandra Kerr, and Mary Beth McClanahan.
Signed by:
David A. Powner:
Director, IT Management Issues:
[End of section]
Appendixes:
Appendix I: Comments from the Office of Management and Budget:
Executive Office Of The President:
Office Of Management And Budget:
Washington, D.C. 20503:
April 5, 2005:
Mr. David A. Powner:
Director:
IT Management Issues:
Government Accountability Office:
441 G Street, SW:
Washington, DC 20548:
Dear Mr. Powner:
Thank you for the opportunity to comment on the draft GAO report on
OMB's information technology investment review process, "INFORMATION
TECHNOLOGY: OMB Can Make More Effective Use of Its Investment Reviews" (GAO-05-
276).
We appreciate GAO's careful review of OMB's management and oversight
activities with respect to OMB's budget Exhibit 300 (Capital Asset Plan
and Business Case), and the related management watch list, which we
believe will provide insight into how Exhibit 300s and the management
watch list are used.
For proper context of the report, we note the report was narrowly
focused on these two tools, and not on the many strategic tools
available to OMB in fulfilling its budget and policy oversight
responsibilities. Additionally, we note the report focused on OMB's
management and oversight activities, and does not address the agencies'
equally important actions and responsibilities.
In that vein, we would like to note some important points which may add
to the breadth and context of your report. Namely, while an Exhibit 300
does include information on whether the agency has considered
performance as part of the project planning, such information is only
an annual snapshot and is neither designed nor used for measuring
ongoing project execution and performance. Managing and measuring
actual project performance is conducted by the agency. OMB oversees
project performance through, among other means, the quarterly
President's Management Agenda scorecard assessment of agency's earned
value management activity.
It is also important to recognize that OMB is not the sole or even
primary audience of an Exhibit 300 justification. Agency officials and
investment review boards must use them to effectively manage their own
IT portfolios and submit to OMB only those investment requests meeting
criteria specified in OMB policies.
Accordingly, we disagree with your assessment that OMB does not have
specific criteria for prioritizing follow-up on Exhibit 300s or the
management watch list and therefore OMB oversight is inconsistent.
Responsibility for ensuring corrective actions for the management watch
list investments rests with each individual agency. They report their
progress at least quarterly to OMB through the President's Management
Agenda scorecard. The process and criteria are consistent, and OMB
analyzes whether a systematic problem exists affecting the overall
agency's program and ability to perform. OMB establishes priorities on
a case-by-case, not a one-size-fits-all, basis. Moreover, as stated
previously, these priorities exist within the larger context of OMB's
overall review of agency program and budget performance.
In addition, we disagree with your assessment that an aggregated
government-wide list is necessary for OMB to perform adequate oversight
and management, and that OMB does not know whether risks were being
managed effectively. As noted above, OMB uses Exhibit 300s and the
management watch list in the larger context of OMB's budget and program
oversight processes. OMB has more than adequate knowledge of agency
project planning and uses other means to assess project performance.
For example as your report also states, OMB has successfully used
Exhibit 300s and the management watch list to help identify agency and
government-wide program-level weaknesses in areas such as earned value
and project management, enterprise architecture, and security.
Thank you for the opportunity to review and comment on your draft
report on this important issue. While we appreciate your careful review
and discussion of OMB's budget Exhibit 300 and the related management
watch list, we caution readers to view the report in the context of the
many oversight responsibilities of both OMB and the agencies.
Sincerely,
Signed by:
Karen S. Evans:
Administrator:
Office of E-Government and Information Technology:
(310472):
[End of Section]
FOOTNOTES
[1] On March 3, 2004, OMB's Deputy Director for Management and its
Administrator for Electronic Government and Information Technology
testified at a hearing conducted by the Subcommittee on Technology,
Information Policy, Intergovernmental Relations and the Census,
Committee on Government Reform, House of Representatives. The hearing
topic was "Federal Information Technology Investment Management,
Strategic Planning, and Performance Measurement: $60 Billion Reasons
Why."
[2] 44 U.S.C. § 3504(a)(1)(B)(vi) (OMB); 44 U.S.C. § 3506(h)(5)
(agencies).
[3] These requirements are specifically described in the Clinger-Cohen
Act, 40 U.S.C. § 11302(c).
[4] OMB Circular A-11 defines a major IT investment as an investment
that requires special management attention because of its importance to
an agency's mission or because it is an integral part of the agency's
enterprise architecture, has significant program or policy
implications, has high executive visibility, or is defined as major by
the agency's capital planning and investment control process.
[5] 31 U.S.C. § 1512.
[6] 40 U.S.C. § 11303(b)(5)(B).
[7] These scoring criteria are presented in Office of Management and
Budget Circular A-11, Part 7, Planning, Budgeting, Acquisition, and
Management of Capital Assets (July 2004).
[8] Earned value management is a project management tool that
integrates the investment scope of work with schedule and cost elements
for investment planning and control. This method compares the value of
work accomplished during a given period with that of the work expected
in the period. Differences in expectations are measured in both cost
and schedule variances.
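For illustration, the comparison that this footnote describes can be
written as the two standard earned value variances; the figures in the
example below are hypothetical.

# Standard earned value variance formulas, shown to illustrate the
# comparison described in footnote 8. The example figures are
# hypothetical.

def schedule_variance(earned_value, planned_value):
    # Positive: more work was completed in the period than planned.
    return earned_value - planned_value

def cost_variance(earned_value, actual_cost):
    # Positive: the completed work cost less than its budgeted value.
    return earned_value - actual_cost

# Example: $8 million of work completed against $10 million planned,
# at an actual cost of $9 million, gives SV = -2.0 and CV = -1.0
# (both unfavorable).
print(schedule_variance(8.0, 10.0), cost_variance(8.0, 9.0))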
[9] The authority for apportioning funds is specifically described in
the Clinger-Cohen Act, 40 U.S.C. § 11303(b)(5)(B)(ii).
[10] The quarterly e-Gov Scorecards are reports that use a red/yellow/
green scoring system to illustrate the results of OMB's evaluation of
the agencies' implementation of e-government criteria in the
President's Management Agenda. The scores are determined in quarterly
reviews, where OMB evaluates agency progress toward agreed-upon goals
along several dimensions, and they provide input to the quarterly
reporting on the President's Management Agenda.
GAO's Mission:
The Government Accountability Office, the investigative arm of
Congress, exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability
of the federal government for the American people. GAO examines the use
of public funds; evaluates federal programs and policies; and provides
analyses, recommendations, and other assistance to help Congress make
informed oversight, policy, and funding decisions. GAO's commitment to
good government is reflected in its core values of accountability,
integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through the Internet. GAO's Web site ( www.gao.gov ) contains
abstracts and full-text files of current reports and testimony and an
expanding archive of older products. The Web site features a search
engine to help you locate documents using key words and phrases. You
can print these documents in their entirety, including charts and other
graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document
files. To have GAO e-mail this list to you every afternoon, go to
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order
GAO Products" heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. Government Accountability Office
441 G Street NW, Room LM
Washington, D.C. 20548:
To order by Phone:
Voice: (202) 512-6000:
TDD: (202) 512-2537:
Fax: (202) 512-6061:
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470:
Public Affairs:
Jeff Nelligan, managing director,
NelliganJ@gao.gov
(202) 512-4800
U.S. Government Accountability Office,
441 G Street NW, Room 7149
Washington, D.C. 20548: