This is the accessible text file for GAO report number GAO-11-115
entitled 'Information Technology: Veterans Affairs Can Further Improve
Its Development Process for Its New Education Benefits System' which
was released on December 1, 2010.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as
part of a longer term project to improve GAO products' accessibility.
Every attempt has been made to maintain the structural and data
integrity of the original printed product. Accessibility features,
such as text descriptions of tables, consecutively numbered footnotes
placed at the end of the file, and the text of agency comment letters,
are provided but may not exactly duplicate the presentation or format
of the printed version. The portable document format (PDF) file is an
exact electronic replica of the printed version. We welcome your
feedback. Please E-mail your comments regarding the contents or
accessibility features of this document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
United States Government Accountability Office:
GAO:
Report to the Committee on Veterans' Affairs, House of Representatives:
December 2010:
Information Technology:
Veterans Affairs Can Further Improve Its Development Process for Its
New Education Benefits System:
GAO-11-115:
GAO Highlights:
Highlights of GAO-11-115, a report to the Committee on Veterans'
Affairs, House of Representatives.
Why GAO Did This Study:
The Post-9/11 GI Bill was signed into law in June 2008 and provides
educational assistance for veterans and members of the armed forces
who served on or after September 11, 2001. The Department of Veterans
Affairs (VA) is responsible for processing claims for these new
education benefits. VA concluded that its legacy systems and manual
processes were insufficient to support the new benefits and,
therefore, began an initiative to modernize its benefits processing
capabilities. The long-term solution was to provide a fully automated
end-to-end information technology (IT) system to support the delivery
of benefits by December 2010. VA chose an incremental development
approach, called Agile software development, which is intended to
deliver functionality in short increments before the system is fully
deployed.
GAO was asked to (1) determine the status of VA's development and
implementation of its IT system to support the implementation of
education benefits identified in the Post-9/11 GI Bill and (2)
evaluate the department's effectiveness in managing its IT project for
this initiative.
What GAO Found:
VA has made important progress in delivering key automated
capabilities to process the new education benefits. Specifically, it
deployed the first two of four releases of its long-term system
solution by its planned dates, thereby providing regional processing
offices with key automated capabilities to prepare original and
amended benefit claims. In addition, the Agile process allowed the
department the flexibility to accommodate legislative changes and
provide functionality according to business priorities. While progress
has been made, VA did not ensure that certain critical tasks were
completed that were initially expected to be included in the second
release by June 30, 2010. For example, the conversion of data from
systems in the interim solution to systems developed for the long-term
solution was not completed until August 23, 2010. Because of the
delay, VA planned to reprioritize the functionality that was to be
included in the third release. Further, while VA plans to include full
self-service capabilities to veterans, it will not do so in the fourth
release as scheduled; instead it intends to provide this capability
after the release or in a separate initiative. VA reported obligations
and expenditures for these releases, through July 2010, to be
approximately $84.6 million, with additional planned obligations of
$122.5 million through fiscal year 2011.
VA has taken important steps by demonstrating a key Agile practice
essential to effectively managing its system development--establishing
a cross-functional team that involves senior management, governance
boards, key stakeholders, and distinct Agile roles. In addition, VA
made progress toward demonstrating three other Agile practices--
focusing on business priorities, delivering functionality in short
increments, and inspecting and adapting the project as appropriate.
Specifically, to ensure business priorities are a focus, VA
established a vision that captures the project purpose and goals and
established a plan to maintain requirements traceability. To aid in
delivering functionality, the department established an incremental
testing approach. It also used an oversight tool, which was intended
to allow the project to be inspected and adapted by management.
However, VA could make further improvements to these practices. In
this regard, it did not (1) establish metrics for the goals or
prioritize project constraints; (2) always maintain traceability
between legislation, policy, business rules, and test cases; (3)
establish criteria for work that was considered "done" at all levels
of the project; (4) provide for quality unit and functional testing
during the second release, as GAO found that 10 of the 20 segments of
system functionality were inadequate; and (5) implement an oversight
tool that depicted the rate of the work completed and the changes to
project scope over time. Until VA improves these areas, management
will lack the visibility it needs to clearly communicate progress, and
unresolved issues in its development processes may not allow VA to
maximize the benefits of the system.
What GAO Recommends:
To help guide the full development and implementation of the long-term
solution, GAO is recommending that VA take five actions to improve its
development process for its new education benefits system. VA
concurred with three of GAO's five recommendations and provided
details on planned actions, but did not concur with the remaining two.
View [hyperlink, http://www.gao.gov/products/GAO-11-115] or key
components. For more information, contact Valerie C. Melvin at (202)
512-6304 or melvinv@gao.gov.
[End of section]
Contents:
Letter:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Briefing for Staff Members of the Subcommittee on Economic
Opportunity, Committee on Veterans' Affairs, House of Representatives:
Appendix II: Comments from the Department of Veterans Affairs:
Appendix III: Comments from the Veterans Affairs' Assistant Secretary
for Information and Technology:
Appendix IV: GAO Contact and Staff Acknowledgments:
Abbreviations:
IT: information technology:
OI&T: Office of Information and Technology:
PMAS: Project Management Accountability System:
SPAWAR: Space and Naval Warfare Systems Center--Atlantic:
VA: Department of Veterans Affairs:
VBA: Veterans Benefits Administration:
[End of section]
United States Government Accountability Office:
Washington, DC 20548:
December 1, 2010:
The Honorable Bob Filner:
Chairman:
The Honorable Steve Buyer:
Ranking Member:
Committee on Veterans' Affairs:
House of Representatives:
In June 2008, Congress passed the Post-9/11 Veterans Educational
Assistance Act of 2008[Footnote 1] (commonly referred to as the Post-
9/11 GI Bill or Chapter 33). This act amended Title 38 United States
Code to include Chapter 33, which provides educational assistance for
veterans and members of the armed forces who served on or after
September 11, 2001. The Department of Veterans Affairs (VA) is
responsible for processing claims for the Chapter 33 education
benefit, which is a three-part benefit--tuition and fee payments,
housing allowance, and book stipend.
In considering its implementation of the legislation, the department
concluded that it did not have a system capable of calculating the new
benefit. As a result, the department undertook an initiative to
modernize its education benefits processing capabilities. This
initiative involved an interim solution that augmented existing
processes by providing temporary applications to manually collect data
and a long-term solution to deliver automated processing capabilities.
The department intended to complete enough of the system functionality
in the long-term solution to replace the interim solution by June
2010, and to include additional capabilities, such as interfaces to
legacy systems,[Footnote 2] to provide a fully automated end-to-end
system to support the delivery of education benefits by December 2010.
To develop the system for its long-term solution, VA is relying on
contractor assistance and is using an incremental development
approach, called Agile software development,[Footnote 3] which is to
deliver software functionality in short increments before the system
is fully deployed. Agile software development stresses the use of key
practices such as working as one team to define business priorities
and, based on those priorities, delivering work in short increments.
Each increment of work is inspected by the team and the project's
plans and priorities are adapted accordingly.
Given the importance of delivering education benefits to veterans and
their families, we were asked to review the long-term solution to:
* determine the status of VA's development and implementation of its
information technology (IT) system to support the implementation of
education benefits identified in the Post-9/11 GI Bill, and:
* evaluate the department's effectiveness in managing its IT project
for this initiative.
As agreed with your offices, on September 13, 2010, we provided
briefing slides that outlined the results of our study to staff of
your Subcommittee on Economic Opportunity. The purpose of this report
is to provide the published briefing slides to you and to officially
transmit our recommendations to the Secretary of Veterans Affairs. The
slides, which discuss our scope and methodology, are included in
appendix I.
We conducted our work in support of this performance audit at the
Department of Veterans Affairs headquarters in Washington, D.C., and
at a contractor's facility in Chantilly, Virginia, from November 2009
to December 2010 in accordance with generally accepted government
auditing standards. Those standards require that we plan and perform
the audit to obtain sufficient, appropriate evidence to provide a
reasonable basis for our findings and conclusions based on our audit
objectives. We believe that the evidence obtained provides a
reasonable basis for our findings and conclusions based on our audit
objectives.
In summary, our study highlighted the following:
* VA has made important progress in developing and implementing the
first two of four releases of software planned for its new education
benefits processing system as scheduled, with these deployments
occurring on March 31, 2010, and June 30, 2010. In doing so, the
department provided its regional processing offices with key automated
capabilities to prepare original and amended benefit claims. It also
responded to legislative changes and provided for housing rate
adjustments. While VA did not previously estimate cost for these
releases and, as such, could not track estimated to actual costs, it
reported that about $84.6 million had been obligated through July
2010. The department noted that its Agile process allowed the
flexibility to adapt to legislative changes and provide functionality
according to business priorities.
However, VA did not ensure that certain critical tasks were completed
that were initially expected to be included in the second release by
June 30, 2010. Specifically, the department did not complete the
conversion of data from systems in the interim solution to the systems
developed for the long-term solution because it was found to be more
complex than the department had anticipated. The department also did
not complete the development of interfaces between the new system and
legacy systems. Program officials stated that data conversion was
included along with housing rate adjustments in a sub-release that was
later deployed on August 23, 2010. Because of these delays, VA planned
to reprioritize what functionality would be developed in its third
release by September 30, 2010. For its fourth release, it intends to
reduce its planned functionality of providing full self-service
capabilities to veterans by December 31, 2010. The department intends
to provide this capability after its fourth release or under a
separate initiative. As such, VA estimates that the total system
development actual and planned obligations through 2011 will be about
$207.1 million.[Footnote 4]
* VA has demonstrated key Agile practices that are essential to
effectively managing its system development, but certain practices can
be improved. Specifically, the department ensured that teams
represented key stakeholders and that distinct Agile roles were
fulfilled. For example, the teams consisted of subject matter experts,
programmers, testers, analysts, engineers, architects, and designers.
The department also made progress toward demonstrating the three other
Agile practices--focusing on business priorities, delivering
functionality in short increments, and inspecting and adapting the
project as appropriate. However, VA can improve its effectiveness in
these areas.
* Business priorities. To ensure business priorities are a focus, a
project starts with a vision that contains, among other things, a
purpose, goals, metrics, and constraints. In addition, it should be
traceable to requirements. VA established a vision that captured the
project purpose and goals; however, it had not established metrics for
the project's goals or prioritized project constraints. Department
officials stated that project documentation is evolving and they
intend to improve their processes based on lessons learned; however,
until it identifies metrics and constraints, the department will not
have the means to compare projected performance with actual results
against its goals. VA had also established a plan that identified how to
maintain requirements traceability within an Agile environment;
however, the traceability between legislation, policy, business rules,
and test cases was not always maintained. VA stated that its
requirements tool did not previously have the capability to perform
this function but now provides this traceability to test cases.
Nonetheless, if the department does not ensure that requirements are
traceable to legislation, policies, and business rules, it has limited
assurance that the requirements will be fully met.
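The bidirectional traceability the report recommends can be sketched as
a simple mapping between artifacts. The following is an illustrative
sketch only, not VA's actual tooling; the requirement and test-case
identifiers are hypothetical, and Python is used purely for
illustration. Each requirement links back to its source (legislation,
policy, or business rule) and forward to its test cases, so a gap in
either direction is mechanically detectable:

```python
# Hypothetical traceability records: each requirement points back to its
# source authority and forward to the test cases that verify it.
requirements = {
    "REQ-001": {"source": "Ch.33 housing allowance",
                "tests": ["TC-101", "TC-102"]},
    "REQ-002": {"source": "Ch.33 book stipend",
                "tests": []},  # gap: no test cases yet
}

def untraced(requirements):
    """Return requirement IDs missing a source or any test case."""
    return [rid for rid, req in requirements.items()
            if not req["source"] or not req["tests"]]

print(untraced(requirements))  # ['REQ-002']
```

Running a check like this each iteration would surface exactly the kind
of traceability break GAO found between legislation, business rules,
and test cases.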
* Deliver functionality in short increments. To aid in delivering
functionality in short increments, defining what constitutes completed
work (work that is "done") and testing functionality is critical.
[Footnote 5] However, VA had not established criteria for work that
was considered "done" at all levels of the project. Program officials
stated that each development team had its own definition of "done" and
agreed that they needed to provide a standard definition across all
teams. If VA does not mutually agree upon a definition of "done" at
each level, confusion about what constitutes completed work can lead
to inconsistent quality and it may not be able to clearly communicate
how much work remains. In addition, while the department had
established an incremental testing approach, the quality of unit and
functional testing performed during Release 2 was inadequate in 10 of
the 20 segments of system functionality we reviewed. Program officials
stated that they placed higher priority on user acceptance testing at
the end of a release and relied on users to identify defects that were
not detected during unit and functional testing. Until the department
improves testing quality, it risks deploying future releases that
contain defects that may require rework.
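A shared definition of "done" can be made mechanical rather than left
to each development team's judgment. The checklist below is a minimal
sketch with hypothetical criteria (not VA's actual standard; Python is
used only for illustration): a user story counts as done only when
every project-wide criterion holds.

```python
# Hypothetical project-wide "definition of done": a user story is done
# only if every criterion in the shared checklist is satisfied.
DONE_CRITERIA = ("coded", "unit_tested", "functionally_tested", "accepted")

def is_done(story):
    """Apply the shared checklist; missing criteria count as not met."""
    return all(story.get(criterion, False) for criterion in DONE_CRITERIA)

story = {"coded": True, "unit_tested": True,
         "functionally_tested": False, "accepted": False}
print(is_done(story))  # False: functional testing not yet complete
```

Agreeing on one such checklist at every level of the project is what
prevents the inconsistent quality and unclear remaining-work estimates
the report describes.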
* Inspect and adapt. In order for projects to be effectively inspected
and adapted, management must have tools to provide effective
oversight. For Agile development, progress and the amount of work
remaining can be reflected in a burn-down chart, which depicts how
factors such as the rate at which work is completed (velocity) and
changes in overall product scope affect the project over time. While
VA had an oversight tool that showed the percentage of work completed
to reflect project status at the end of each iteration, it did not
depict the velocity of the work completed and the changes to scope
over time. Program officials stated that their current reporting did
not show the changes in project scope because their focus was on
metrics that are forward looking rather than showing past statistics
for historical comparison. However, without this level of visibility
in its reporting, management may not have all the information it needs
to fully understand project status.
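The burn-down reporting described above follows directly from two
inputs per iteration: points completed and any change in scope. The
sketch below uses hypothetical story-point figures (Python chosen only
for illustration) to show how remaining work and overall velocity are
derived, including the scope changes GAO notes VA's oversight tool did
not depict:

```python
# Minimal burn-down sketch with hypothetical story-point figures.
# Scope added mid-project increases the remaining work, which is the
# visibility the report says was missing from VA's oversight tool.

def burn_down(initial_scope, iterations):
    """iterations: list of (points_completed, scope_change) pairs.
    Returns remaining work after each iteration and average velocity."""
    remaining = initial_scope
    history = []
    for completed, scope_change in iterations:
        remaining = remaining - completed + scope_change
        history.append(remaining)
    velocity = sum(completed for completed, _ in iterations) / len(iterations)
    return history, velocity

history, velocity = burn_down(
    initial_scope=200,
    iterations=[(30, 0), (25, 10), (35, 0)],  # iteration 2 adds scope
)
print(history)   # remaining points after each iteration: [170, 155, 120]
print(velocity)  # average points completed per iteration: 30.0
```

Dividing remaining work by velocity (120 / 30 = 4 iterations here)
gives the forecast of effort and schedule that the report argues
project-level velocity makes possible.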
Conclusions:
VA deployed the first two of four releases of its long-term system
solution by its planned dates, thereby providing improved claims-
processing functionality to all regional processing offices, such as
the ability to calculate original and amended benefit claims.
addition, the Agile process allowed the department the flexibility to
accommodate legislative changes and provide functionality according to
business priorities, such as housing rate adjustments. However, key
features of the solution were not completed as intended in the second
release because the department found some functionality to be more
complex than anticipated. Specifically, interfaces to legacy systems
and the conversion of data from systems in the interim solution were
not completed as intended in the second release. Due to these delays,
VA planned to reprioritize what functionality would be included in its
third release. Also, for its fourth release, the department had
reduced a significant planned functionality--veteran self-service
capability. While VA intends to provide this capability after the
fourth release within the long-term system solution or under a
separate initiative, it is unclear what functionality will be
delivered in the remaining two releases when it deploys the system in
December 2010.
In using an Agile approach for this initiative, VA is applying lessons
learned and has taken important first steps to effectively manage the
IT project by establishing a cross-functional team that involves
senior management, governance boards, and key stakeholders. However,
the department had not ensured that several key Agile practices were
performed. Measurable goals were not developed and the project
progressed without bidirectional traceability in its requirements.
Additionally, in developing the system, VA did not establish a common
standard and consistent definition for work to be considered "done" or
develop oversight tools to clearly communicate velocity and the
changes to project scope over time. Testing deficiencies further
hindered VA's assurances that all critical system defects would be
identified. Until VA improves these areas, management does not have
the visibility it needs to clearly communicate progress to
stakeholders and estimate when future system capabilities will be
delivered. Additionally, reduced visibility and unresolved issues in
its development processes may result in the department continuing to
remove functionality that was expected in future releases, thus
delivering a system that does not fully and effectively support the
implementation of education benefits as identified in the Post-9/11 GI
Bill.
Recommendations for Executive Action:
To help guide the full development and implementation of the Chapter
33 long-term solution, we recommend that the Secretary of Veterans
Affairs direct the Under Secretary for Benefits to take the following
five actions:
* establish performance measures for goals and identify constraints to
provide better clarity in the vision and expectations of the project;
* establish bidirectional traceability between requirements and
legislation, policies, and business rules to provide assurance that
the system will be developed as expected;
* define the conditions that must be present to consider work "done"
in adherence with agency policy and guidance;
* implement an oversight tool to clearly communicate velocity and the
changes to project scope over time; and:
* improve the adequacy of the unit and functional testing processes to
reduce the amount of system rework.
Agency Comments and Our Evaluation:
We received written comments on a draft of this report from the
Secretary of Veterans Affairs and VA's Assistant Secretary for
Information and Technology. In the Secretary's comments, reproduced in
appendix II, VA concurred with three of our recommendations and did
not concur with two recommendations. Specifically, the department
concurred with our recommendation to establish performance measures
for goals and identify constraints to provide better clarity in the
vision and expectations of the project. VA noted that it plans to
develop performance measures consistent with automating the Post-9/11
GI Bill by March 2011. While this is a positive step, as we noted, it
is also important that the department identify any project or business
constraints to better clarify the vision and expectations of the
system. VA also concurred with our recommendation that it establish
bidirectional traceability between requirements and legislation,
policies, and business rules to provide assurance that the system will
be developed as expected. The department stated that it plans to
establish traceability between its business rules for the long-term
solution and the legislation by June 2011. Additionally, VA concurred
with our recommendation to define the conditions that must be present
to consider work "done" in adherence with department policy and
guidance. VA noted that the initiative's fiscal year 2011 operating
plan outlines these conditions at the project level and that it
intends to clarify the definition at the working group level by
December 2010.
VA did not concur with our recommendation that it implement an
oversight tool to clearly communicate velocity and the changes to
project scope over time. The department indicated that development
metrics and models had already been established and implemented to
forecast and measure development velocity. In this regard, as our
briefing noted, department officials stated that they previously
reported project-level metrics during the first release, and based on
lessons learned, decided to shift to reporting metrics at the
development team level. While it is important that VA established the
capability to track team-level metrics, it is also important to track
and clearly report how changes to the system development at the team
level affect the overall project-level scope over time. Specifically,
without the overall velocity--a key mechanism under the Agile
methodology--VA does not have the information to understand the
expected effort to complete the total scope of work and the associated
length of time to do so. The overall velocity provides an
understanding of the complexity and difficulty in accomplishing tasks
and provides VA management with information to better understand
project risks. This visibility is a key concept of the Agile
methodology that VA has chosen to implement for this project. Without
this level of visibility in its reporting, management and the
development teams may not have all the information they need to fully
understand project status and generate the discussion and feedback
necessary for continuous process improvement. Therefore, we continue
to believe that our recommendation for VA to clearly communicate
velocity and project scope changes can only strengthen the
department's development process to be more in line with Agile system
development practices.
VA also did not concur with our recommendation to improve the adequacy
of the unit and functional testing processes to reduce the amount of
system rework. While the department noted that its testing approach is
compatible with Agile development, it also acknowledged in its other
technical comments that there were instances of inconsistency in user
stories for capabilities being marked "done," and the user stories we
reviewed during the second release showed significant weaknesses in
the quality of testing performed.
While we agree that VA's testing approach is consistent with Agile
methodology, these weaknesses we identified demonstrate ineffective
testing and the need for a consistent and agreed-upon definition of
"done." Further, the program officials noted that their approach
focused on users identifying defects at the end of the release, which,
as we have noted, can be problematic because it is difficult for users
to remember all the items and parameters needed for
functionality.[Footnote 6] Without increased focus on the quality of
testing early in the development process, VA risks delaying
functionality and/or deploying functionality with unknown defects that
could require future rework that may be costly and ultimately impede
the claims examiners' ability to process claims efficiently.
Therefore, we continue to believe that our recommendation to improve
the adequacy of unit and functional testing is needed to strengthen
the effectiveness of VA's testing process, as called for in the Agile
methodology.
This would provide stakeholders greater assurance that functionality
developed during each iteration is of releasable quality before it is
presented to users as completed work in accordance with Agile system
development practices.
In addition, VA provided other technical comments on a draft of this
report. In the comments, the department provided additional
clarification on why there were delays to functionality and how they
affected the release schedule. Specifically, the department stated
that the governance structure it established for the initiative
provided the necessary management, support, and prioritization of
development efforts to balance desired functionality within the
development challenges and constraints. The department noted that,
among other things, delays in the first two releases were a result of
additional functionality prioritized and developed, such as housing
rate adjustments and the ability to automatically generate letters for
veterans as well as unanticipated challenges, such as the complexity
of data conversion tasks. Further, it noted that decisions and
prioritizations were made primarily to minimize the impact on field
offices and to support fall enrollment, and that these decisions
reduced the development capacity available for the capabilities that
could be developed in the third release. VA also offered other
technical comments, which we incorporated as appropriate.
Beyond the department's comments on our recommendations, the Assistant
Secretary for Information and Technology provided additional written
comments, reproduced in appendix III, which noted concerns with our
draft report. In these comments, the Assistant Secretary stated that
the department believes we fell short of meeting the objectives for
this report by omitting key facts and presenting an unnecessarily
negative view of VA's status and effectiveness to Congress. In
particular, the Assistant Secretary stated that VA had successfully
converted all processing of new Post-9/11 GI Bill claims to the long-
term solution prior to the commencement of the fall 2010 enrollment
process and that processing with the new system has been nearly
flawless. He added that Veterans Benefits Administration claims
processors like the new system and find it easy and effective to use.
We are encouraged to hear that the department is experiencing positive
results from the system development efforts that have been
accomplished. However, as noted in our briefing, system functionality
that was envisioned to (1) provide self-service capabilities to
veterans and (2) support end-to-end processing of benefits by December
2010 was deferred. Further, as the vision for its new education
benefits system evolves, it is essential that the department document
these changes to ensure that its expectations and goals for the system
are measurable and aligned at all levels.
In addition, the Assistant Secretary stated that limited exposure to
the Agile methodology possibly caused us to present incorrect
assumptions as facts, such as our evaluation of the department's
testing. Our audit team, which included a trained ScrumMaster,
examined the department's use of Agile Scrum practices against
documented and accepted methodologies and consulted with an expert in
the field who is not only a ScrumMaster but also an Agile Scrum
trainer with extensive experience in evaluating Agile system
development projects. At the initiation of our study, we discussed our
evaluation approach with program officials and throughout the study,
held meetings with them to ensure our full understanding of the
department's requirements and testing processes. We did not take issue
with the Agile testing approach used by VA. However, we found
deficiencies in testing. Further, we presented the results of our
observations to program officials in June 2010, at which time they did
not express any such concerns, or otherwise comment on our evaluation
of the testing. Further, given the evolving nature of Agile system
development, it is important to ensure that work that is presented as
"done" and demonstrated to the users at the end of an iteration has
undergone adequate testing to prevent inaccurate information from
being provided. In addition to weaknesses we identified in the testing
of select user stories, the department identified a number of defects
during the development of the second release. In our view, VA has an
opportunity to improve the adequacy of its unit and functional
testing, which occurs during each iteration, to help identify and
resolve any defects before any related functionality is presented to
users as
completed work at the end of the iteration. As we noted, the
department agreed that it needed to clarify its definition of "done"
and ensure that it is applied consistently. Such a definition often
includes fully tested functionality that has no defects. During our
review, we observed on multiple occasions work being presented as
"done" without having completed all testing. Improved testing up front
can reduce the number of defects found later in user acceptance
testing and production that would require more costly rework.
Further, the Assistant Secretary stated that the department believes
that we missed a substantial opportunity to positively influence real
change by not focusing on the fact that VA had adopted the Agile
methodology after many failings with other IT systems development
efforts that used waterfall development methodologies. We agree that
VA has taken an important step toward improving its development
capability and that it has developed significant segments of its new
education benefits system with its new methodology. However, as we
noted in our briefing, the department had not fully established
metrics for its goals, which are essential to fully gauge its progress
beyond past initiatives.
While we believe that VA has made substantial progress in implementing
a new process to develop its system, we stand by our position that
there is still an opportunity for the department to improve its new
development process in accordance with our recommendations. Doing so
would further increase the likelihood that VA fully develops and
delivers the end-to-end benefits processing capabilities envisioned to
support the Post-9/11 GI Bill and the needs of veterans.
We are sending copies of this report to appropriate congressional
committees, the Secretary of Veterans Affairs, and other interested
parties. In addition, the report will be available at no charge on the
GAO Web site at [hyperlink, http://www.gao.gov].
If you or your staffs have questions about this report, please contact
me at (202) 512-6304 or melvinv@gao.gov. Contact points for our
Offices of Congressional Relations and Public Affairs may be found on
the last page of this report. Key contributors to this report are
listed in appendix IV.
Signed by:
Valerie C. Melvin:
Director, Information Management and Human Capital Issues:
[End of section]
Appendix I: Briefing for Staff Members of the Subcommittee on Economic
Opportunity, Committee on Veterans' Affairs, House of Representatives:
Information Technology: Veterans Affairs Can Improve Its Development
Process for Its New Education Benefits System:
Briefing for Staff Members of the Subcommittee on Economic Opportunity
Committee on Veterans' Affairs:
House of Representatives:
September 13, 2010:
Table of Contents:
Introduction;
Objectives;
Scope and Methodology;
Results in Brief;
Background;
Status of Development Efforts;
VA's Effectiveness in Managing Its New System;
Conclusions;
Recommendations for Executive Action;
Agency Comments and Our Evaluation;
Attachment I;
Attachment II;
Attachment III.
Introduction:
In June 2008, Congress passed the Post-9/11 Veterans Educational
Assistance Act of 2008[Footnote 7] (commonly referred to as the Post-
9/11 GI Bill or Chapter 33). This act amended Title 38 United States
Code to include Chapter 33, which provides educational assistance for
veterans and members of the armed forces who served on or after
September 11, 2001.
The Department of Veterans Affairs (VA) is responsible for processing
claims for the Chapter 33 education benefit, which is a three-part
benefit--tuition and fee payments, housing allowance, and book stipend.
The benefit is determined based upon an individual's aggregate
qualifying active duty service.
A key milestone in the Chapter 33 legislation was the requirement that
VA provide the new educational assistance benefits to service members
and veterans beginning on August 1, 2009. In considering its
implementation of the legislation, the department concluded that it
did not have a system capable of calculating the new benefit. As a
result, the department undertook an initiative to modernize its
education benefits processing capabilities.
VA's initiative to modernize its education benefits processing
involved interim and long-term solutions to deliver new processing
capabilities. According to the department, the interim solution was
intended to augment existing processes by providing temporary
applications and tools, such as a spreadsheet that aided claims
examiners in manually collecting data from VA legacy systems to
calculate the new benefits.[Footnote 8]
At the same time that it began the interim solution, the department
also initiated a long-term solution that was intended to fully automate
the manual processes for calculating education benefits for service
members and veterans. Specifically, the long-term solution was
intended to provide a system to replace the interim solution as well
as provide automated interfaces with existing legacy systems. The
department intended to complete enough of the functionality in the
long-term solution to replace the interim solution by June 2010, and
to include additional capabilities for full deployment of the new
education benefits system by December 2010.
To develop the system for its long-term solution, VA is relying on
contractor assistance and is using an incremental development
approach, called Agile software development,[Footnote 9] which is to
deliver software functionality in short increments before the system
is fully deployed. Agile software development has key practices such
as working as one team. This one team is to define business priorities
and, based on those priorities, deliver work in short increments. Each
increment of work is inspected by the team and the project's plans and
priorities are adapted accordingly.
Historically, VA has experienced significant IT development and
delivery difficulties. In the spring of 2009, the department reviewed
its inventory of IT projects and identified ones that exhibited
serious problems with schedule slippages and cost overruns. The
department noted that an incremental approach, such as Agile software
development, was considered to be an effective way to support the long-
term system solution development.
Given the importance of delivering education benefits to veterans and
their families, we were asked to review the long-term solution to:
* determine the status of VA's development and implementation of its
information technology (IT) system to support the implementation of
education benefits identified in the Post-9/11 GI Bill and;
* evaluate the agency's effectiveness in managing its IT project for
this initiative.
[End of section]
Scope and Methodology:
To determine the status of VA's development and implementation of its
IT system to support the implementation of education benefits
identified in the Post-9/11 GI Bill, we:
* reviewed VA and contractor plans for system development, observed
project status meetings, and compared the actual status of development
and implementation to the planned status, and;
* discussed the department's plans and implementation with VA and
contractor officials to determine the functionality completed and
demonstrated.
To evaluate VA's effectiveness in managing its IT project for this
initiative, we:
* reviewed current Agile literature and interviewed experts in the
field to identify key Agile practices applicable to the department's
project;
* evaluated and compared the department's IT project management
practices to industry standards, best practices, and disciplined
processes, such as Agile software development, and applicable federal
laws,[Footnote 10] policies, and guidance;[Footnote 11]
* analyzed requirements and testing artifacts for 20 segments of
system features developed to determine the traceability of
requirements and testing coverage;
* observed key agency and contractor development meetings such as
daily discussions, bi-weekly software reviews and planning meetings,
where management decisions were made and Agile practices were
demonstrated; and;
* interviewed department and contractor officials about the management
and oversight of requirements, testing, and transition plans.
The information on cost estimates and costs incurred for long-term
solution development was provided by VA officials. We did
not audit the reported costs and thus cannot attest to their accuracy
or completeness.
We conducted this performance audit at the Department of Veterans
Affairs headquarters in Washington, D.C., and at a contractor facility
in Chantilly, Virginia, from November 2009 to September 2010 in
accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe
that the evidence obtained provides a reasonable basis for our
findings and conclusions based on our audit objectives.
[End of section]
Results in Brief:
VA has developed and implemented the first two of four releases of
software planned for its new education benefits processing system as
scheduled, with these deployments occurring on March 31, 2010, and
June 30, 2010. In doing so, VA provided its regional processing
offices with key automated capabilities to prepare original and
amended benefit claims. In addition, VA responded to legislative
changes and provided for housing rate adjustments. While VA did not
previously estimate costs for these releases and, as such, could not
track estimated to actual costs, it has reported that about $84.6
million has been obligated through July 2010. The department noted
that its Agile process allowed the flexibility to adapt to legislative
changes and provide functionality according to business priorities.
However, VA did not ensure that certain critical tasks were performed
that were expected to be part of the second release. Specifically, it
did not complete the conversion of data from systems in the interim
solution to the systems developed for the long-term solution and did
not complete the development of interfaces between the new system and
legacy systems.[Footnote 12]
Further, because of these delays, VA is in the process of determining
and prioritizing what functionality will be developed in its third
release by September 30, 2010. For its fourth release, it intends to
reduce its planned functionality of providing full self-service
capabilities to veterans by December 31, 2010. However, VA intends to
provide this capability after its fourth release or under a separate
initiative. Overall, VA estimates that the total system development
actual and planned obligations through 2011 will be about $207.1
million.[Footnote 13]
VA has demonstrated key Agile practices that are essential to
effectively managing its system development, but certain practices can
be improved. Specifically, the department has ensured that teams
represent key stakeholders and that specific Agile roles were
fulfilled. For example, the teams consist of subject matter experts,
programmers, testers, analysts, engineers, architects, and designers.
The department has also made progress toward demonstrating the three
other Agile practices--focusing on business priorities, delivering
functionality in short increments, and inspecting and adapting the
project as appropriate. However, VA can improve its effectiveness in
these areas. Specifically:
* To ensure business priorities are a focus, a project starts with a
vision that contains, among other things, purpose, goals, metrics, and
constraints. In addition, the vision should be traceable to
requirements. VA
has established a vision that captures the project purpose and goals;
however, it has not established metrics for the project's goals or
prioritized project constraints. VA officials stated that project
documentation is evolving and they intend to improve their processes
based on lessons learned; however, until it identifies metrics and
constraints, the department will not have the means to compare
projected performance with actual results for its goals. VA has
also established a plan that identifies how to maintain requirements
traceability within an Agile environment; however, the traceability
between legislation, policy, business rules, and test cases was not
always maintained. VA stated that its requirements tool previously
lacked the capability to perform this function and that it now
provides this traceability to test cases. Nonetheless, if VA does not
ensure that
requirements are traceable to legislation, policies, and business
rules, it has limited assurance that the requirements will be fully
met.
* To aid in delivering functionality in short increments, defining
what constitutes completed work (work that is "done") and testing
functionality is critical.[Footnote 14] However, VA has not yet
established criteria for work that is considered "done" at all levels
of the project. Program officials stated that each development team
has its own definition of "done" and agreed that they need to provide
a standard definition across all teams. If VA's teams do not mutually
agree upon a definition of "done" at each level, confusion about what
constitutes completed work can lead to inconsistent quality, and the
department may not be able to clearly communicate how much work
remains. In addition,
while the department has established an incremental testing approach,
the quality of unit and functional testing performed during Release 2
was inadequate in 10 of the 20 segments of system functionality we
reviewed. Program officials stated that they placed higher priority on
user acceptance testing at the end of a release and relied on users to
identify defects that were not detected during unit and functional
testing. Until the department improves testing quality, it risks
deploying future releases that contain defects which may require
rework.
* In order for projects to be effectively inspected and adapted,
management must have tools to provide effective oversight. For Agile
development, progress and the amount of work remaining can be
reflected in a burn-down chart, which depicts how factors such as the
rate at which work is completed (velocity) and changes in overall
product scope affect the project over time. While VA has an oversight
tool that shows the percentage of work completed to reflect project
status at the end of each iteration, it does not depict the velocity
of the work completed and the changes to scope over time. Program
officials stated that their current reporting does not show
the changes in project scope because their focus is on metrics that
are forward looking rather than showing past statistics for historical
comparison. However, without this level of visibility in its
reporting, management may not have all the information it needs to
fully understand project status.
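A burn-down chart of the kind described above can be sketched with a
few lines of code; the iteration figures below are hypothetical, not
actual Chapter 33 project data.

```python
# Illustrative burn-down calculation (hypothetical data, not VA figures).
# For each iteration it records the velocity (points completed), any
# change to overall product scope, and the work remaining afterward.

def burn_down(initial_scope, iterations):
    """iterations: list of (points_completed, scope_change) tuples."""
    remaining = initial_scope
    history = []
    for number, (completed, scope_change) in enumerate(iterations, start=1):
        remaining = remaining - completed + scope_change
        history.append({"iteration": number,
                        "velocity": completed,
                        "scope_change": scope_change,
                        "remaining": remaining})
    return history

# 100 points of initial scope; scope grows in iteration 2, shrinks in 3.
report = burn_down(100, [(12, 0), (10, 8), (14, -4)])
```

Plotting the remaining values against iteration number yields the
burn-down line, while the velocity and scope-change columns supply the
historical trend that, as noted above, VA's reporting did not depict.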
To help ensure successful implementation of the Chapter 33 initiative,
we are recommending that VA establish performance measures for goals
and identify constraints; establish traceability between requirements
and legislation, policies, and business rules; define the conditions
that must be present to consider work "done;" review and improve the
unit and functional testing processes; and implement an oversight tool
to clearly communicate velocity and the changes to project scope over
time.
We received oral comments on a draft of this briefing from VA
officials, including the Deputy Assistant Secretary for Congressional
and Legislative Affairs and the Assistant Secretary for Information
and Technology. In their comments, the officials stated that the
department was not in a position to concur or not concur with our
recommendations, but planned to provide formal comments on our final
report. The officials also provided technical comments, which we have
incorporated in the briefing as appropriate.
[End of section]
Background:
In recognition of their service to our country, VA provides medical
care, benefits, social support, and lasting memorials to veterans,
service members, and their families. VA is the second largest federal
department with more than 270,000 employees. In fiscal year 2009, the
department reported incurring more than $100 billion in obligations
for its overall operations.
The Veterans Benefits Administration (VBA), one of VA's three line
administrations,[Footnote 15] provides assistance and benefits, such
as educational assistance, through four veterans' regional processing
offices.[Footnote 16] In 2009, the department reported that it
provided more than $3.5 billion in educational assistance benefits to
approximately 560,000 individuals. In 2011, it expects the number of
all education claims to grow by 32 percent over 2009, increasing from
1.7 million to 2.25 million.
Prior to the passage of the Post-9/11 GI Bill, VA delivered education
benefits by relying on a combination of manual processes and legacy IT
systems. However, the department concluded that the educational
assistance required by the statute would be complex to calculate and
would involve a multitude of factors, such as the beneficiary's length
of service, the type of education being pursued, and the geographic
location of the academic institution. Accordingly, the department
determined its legacy systems were insufficient to support the demands
for processing the new benefit.
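The multi-factor calculation described above can be illustrated with a
simplified sketch. The tiers and rates below are hypothetical
placeholders for illustration only, not the actual Chapter 33
statutory schedule.

```python
# Simplified illustration of a multi-factor education benefit calculation.
# All tiers and rates are hypothetical, not the actual statutory schedule.

def benefit_percentage(months_of_service):
    # Hypothetical tiers: longer aggregate qualifying service yields a
    # higher percentage of the maximum benefit.
    tiers = [(36, 1.00), (24, 0.80), (12, 0.60), (6, 0.50)]
    for months, rate in tiers:
        if months_of_service >= months:
            return rate
    return 0.0

def monthly_housing(base_local_rate, percentage):
    # The housing allowance depends on a geographic base rate (set by
    # school location) scaled by the service-based percentage.
    return round(base_local_rate * percentage, 2)

rate = benefit_percentage(30)            # falls in the hypothetical 0.80 tier
payment = monthly_housing(1500.00, rate)
```

Even this toy version shows why manual spreadsheet processing was
error-prone: each claim combines service history, school location, and
benefit type in a single calculation.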
In October 2008, VA established its Chapter 33 initiative to develop
the capability to process the new education benefit. The initiative
involved both an interim and long-term solution:
* The interim solution, deployed in November 2009, provided
applications and tools, such as a spreadsheet that aided claims
examiners in manually collecting data from VA legacy systems to
calculate the new education benefit.
* The long-term solution was expected to be complete enough to replace
the interim solution by June 2010 and to include additional
capabilities to provide a fully automated end-to-end system to support
the delivery of education benefits by December 2010.
Among other features, by December 2010, the new education benefits
system was to:
* modernize processing of new Chapter 33 awards and amended existing
Chapter 33 claims, to include automated calculations of benefits, such
as tuition and fee payments, housing allowance, and book stipends;
* increase claims processing efficiency to all regional offices, such
as providing capability to automatically access veteran demographic
and service data;
* interface with VA's existing legacy systems that contain information
required to calculate benefits, such as a financial payment system;
and;
* create veteran self-service capabilities such as the capability to
estimate and apply for benefits online.
To oversee the development and implementation of the new education
benefits system, VA has formed a governance structure that includes
executive level management from VBA and the department's Office of
Information and Technology (OI&T). The VBA Under Secretary of Benefits
has primary responsibility for coordinating the Chapter 33 initiative.
For example, the Under Secretary ensures collaboration for the
effective management and coordination of VA resources in support of
the Chapter 33 implementation.
To develop and implement the long-term solution, VA's OI&T entered
into an interagency agreement with the Department of Defense's Space
and Naval Warfare Systems Center--Atlantic (SPAWAR). SPAWAR is
managing multiple contractors[Footnote 17] to
develop the system and is providing technical, information assurance,
and program management services. SPAWAR is also providing operational
services and engineering, planning, and analysis to support
application development. VA and SPAWAR work together to manage and
develop the system. Specifically, VBA subject matter experts and OI&T
technical representatives are part of the system development teams.
Chapter 33 Implementation Approach:
To develop and implement the new system, VA also is following its
Project Management Accountability System (PMAS)[Footnote 18] framework
which was established in June 2009. PMAS requires that the
department's IT projects adopt an iterative release schedule in which
system features are delivered within firm, 6-month (or less) cycles.
Consistent with the framework, the department established four release
dates for the Chapter 33 long-term solution. Each release was to
deploy incremental capabilities that would expand upon previously
developed functionality.
The following table shows VA's planned schedule for deploying the four
releases and the associated functionality.
Release: 1;
Planned deployment date: March 31, 2010;
Planned functionality: Provide improved claims-processing
functionality, such as the ability to calculate new original awards,
amend awards, and convert beneficiary data from systems supporting the
interim solution to the new system. To be deployed to a limited number
of claims examiners in the regional processing offices.
Release: 2;
Planned deployment date: June 30, 2010;
Planned functionality: Increase automation and efficiency to all
regional processing offices, as well as develop interfaces to legacy
systems (excluding the financial system).
Release: 3;
Planned deployment date: September 30, 2010;
Planned functionality: Develop an interface between the new system and
the department's legacy financial system.
Release: 4;
Planned deployment date: December 31, 2010;
Planned functionality: Provide other end user features to further
improve processing efficiencies, such as self-service functionality
aimed at improving the veteran's experience.
Source: VA.
[End of table]
While VA did not originally estimate the total cost to implement the
long-term solution or its cost by release, as of July 2010, program
officials reported actual and planned obligations of approximately
$207.1 million through fiscal year 2011.[Footnote 19]
To develop and implement the long-term solution according to the
planned release schedule, VA is using an Agile software development
approach, which places an emphasis on collaboration between developers
and stakeholders and produces a system in an iterative and incremental
fashion. Agile software development is recognized as having
fundamental differences from traditional methods.[Footnote 20] For
example, it is an iterative process in which each output (which can
range between 1 and 8 weeks in duration) provides a segment of system
functionality that is developed, tested, and demonstrated to users so
that early feedback can be considered. A segment of functionality
could be a computer screen to display the amount a beneficiary would
be entitled to. However, with a traditional approach, the complete
product is often delivered at the end of the development phase of the
system lifecycle, and work is not performed in short iterative
segments.
Although iterative and incremental development approaches have been
used for many years,[Footnote 21] the Agile movement officially began
in February 2001 with the creation of the Agile Manifesto.[Footnote
22] According to current Agile literature,[Footnote 23] an Agile
approach emphasizes the following four key practices:
Work as one team. In Agile development, project participants view
themselves as one team aimed at a common goal. However, while the
participants should work together as one whole team, there are
specific roles on the team.
* Product owner. The product owner's primary duties include making
sure that all team members are pursuing a common vision for the
project, establishing priorities so the highest-valued functionality
is always being worked on, and making decisions that lead to a good
return on investment.
* Team member. Team members often include programmers, testers,
analysts, database engineers, usability experts, technical writers,
architects, and designers. The team members are responsible for
developing high-quality functionality as prioritized by the product
owner.
* Project manager. The project manager focuses more on leadership than
on management and is a facilitator for the team working together. In
addition, he or she is responsible for removing project obstacles that
may impede the team's performance.
Additionally, best practices state that it is essential for a systems
development team to have involvement from other stakeholders, such as
executive level management and senior management.[Footnote 24] Such
involvement helps to minimize project risk by ensuring that key
requirements are identified and developed, problems or issues are
resolved, and decisions and commitments are made in a timely manner.
Focus on business priorities. Agile teams demonstrate customer
collaboration and commitment to business priorities in two ways.
First, the product owner prioritizes and determines the order in which
features will be developed. A release plan is then created that states
how much work the team can accomplish by a certain date. Second, Agile
teams focus on completing and delivering user-valued features, usually
in the form of user stories, which are a way of expressing software
requirements. A user story is a brief description of functionality as
viewed by a user or customer of the system. User stories are gathered
and documented throughout the development process.
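A user story of the kind described above is commonly captured in a
small structured record; the example below is an illustrative sketch
using a hypothetical Chapter 33 story, not an artifact from VA's
backlog.

```python
# Illustrative user-story record (hypothetical example, not an item
# from VA's actual product backlog).
from dataclasses import dataclass, field

@dataclass
class UserStory:
    role: str                 # who wants the feature
    goal: str                 # what they want to do
    benefit: str              # why it is valuable
    priority: int             # set by the product owner
    acceptance_criteria: list = field(default_factory=list)

    def summary(self):
        return f"As a {self.role}, I want to {self.goal} so that {self.benefit}."

story = UserStory(
    role="claims examiner",
    goal="amend an existing award",
    benefit="beneficiaries receive corrected payments",
    priority=1,
    acceptance_criteria=["recalculated amount matches business rules",
                         "an audit trail records the change"],
)
```

The acceptance criteria double as the test conditions that later
determine whether the story is "done."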
Work to deliver functionality in short iterations. Agile practices
focus on delivering functionality in short increments as opposed to
delivering at the end of the development phase of the system
lifecycle; however, the functionality is based on a product vision,
which is important for motivating a team and creating a long-term
connection between those developing the product and those using it.
Most Agile teams work in iterations of 2 to 4 weeks, during which
time a team transforms one or more user stories into coded and tested
software. All work (for example, analysis, design, coding, and
testing) is performed concurrently within each iteration.
During the iteration, each piece of functionality or feature worked on
should be determined to be of releasable quality, before it is
presented as completed work. The criteria for making this
determination constitute the definition of "done." Only work that is
completed should be presented during a review meeting that takes place
at the end of an iteration. Because a single iteration does not
usually provide sufficient time to complete enough new functionality
to satisfy user or customer desires, a release, which typically lasts
2 to 6 months and comprises one or more iterations, identifies a
desired set of
functionality and the projected time frame it will be ready for users
and customers.
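One way to make such a definition of "done" explicit and consistent
across teams is a shared checklist that every work item must satisfy
before it is presented at the review meeting; the criteria below are
illustrative, not VA's actual definition.

```python
# Illustrative shared definition of "done" (hypothetical criteria, not
# VA's actual definition). A work item may be presented at the
# iteration review only if every criterion is satisfied.

DEFINITION_OF_DONE = (
    "code reviewed",
    "unit tests passing",
    "functional tests passing",
    "no known defects",
)

def is_done(work_item):
    """work_item maps each criterion name to True or False."""
    return all(work_item.get(criterion, False) for criterion in DEFINITION_OF_DONE)

# Coding is complete, but an open defect remains, so the item is not
# of releasable quality and should not be demonstrated as finished.
item = {"code reviewed": True, "unit tests passing": True,
        "functional tests passing": True, "no known defects": False}
```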
Inspect and adapt. Agile teams demonstrate the value of responding to
change by incorporating knowledge gained in the preceding iteration
and adapting plans accordingly at the start of the next iteration.
This is intended to facilitate continuous process improvement. For
example, the team may discover that it has overestimated or
underestimated its rate of progress--for instance, software
development may prove more time consuming than planned--and therefore
the release plan should be revisited and updated regularly.
Further, it may be the case that based on seeing the software from an
earlier iteration, the product owner learns that users would like to
see more of one type of feature and that they do not value another
feature as much as originally planned. The value of the plan could be
increased in this case by moving more of the desired features into the
release and postponing some of the lesser-valued features. This
recognizes that planning is an ongoing process that takes place during
the entire project in order to deliver a valuable solution to the
users.
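The re-planning described above, in which higher-valued features move
into the release as understanding improves, can be sketched as a
simple reprioritization step; the feature names, values, and capacity
figures are hypothetical.

```python
# Illustrative release re-planning (hypothetical features and values).
# After each iteration the backlog is re-sorted by the product owner's
# updated value estimates, and the release takes as many features as
# the team's observed velocity leaves room for.

def replan_release(features, capacity):
    """features: list of (name, value, cost); capacity: points remaining."""
    selected = []
    for name, value, cost in sorted(features, key=lambda f: f[1], reverse=True):
        if cost <= capacity:
            selected.append(name)
            capacity -= cost
    return selected

backlog = [("letter generation", 8, 5),
           ("self-service portal", 5, 8),
           ("housing rate update", 9, 3)]
plan = replan_release(backlog, capacity=9)
```

Lower-valued features that no longer fit, such as the hypothetical
self-service portal here, are postponed to a later release, which
mirrors the deferrals discussed elsewhere in this report.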
These practices are important to any Agile framework, including the
one VA has chosen to implement called Scrum.[Footnote 25] Scrum
emphasizes developing software in increments and producing segments of
functionality that are tested and demonstrated to users. In addition,
Scrum teams are interactive and cross-functional in developing these
segments throughout each iteration. See attachment I for a discussion
of the specific practices and predefined roles within the Scrum
framework for managing software development.
[End of section]
Objective 1: Status of Development Efforts:
VA Has Delivered Key Automated Capabilities in Its Long-Term Solution,
but Has Reduced Its Planned Functionality to Accommodate Recent
Development Delays:
While VA deployed Releases 1 and 2 as scheduled and plans to meet the
Release 3 and 4 deployment dates of September 30, 2010, and December
31, 2010, it reduced and delayed system functionality to meet those
dates. VA reported obligations and expenditures for these releases,
through July 2010, to be approximately $84.6 million: $59.8 million
for SPAWAR and contractor support and $24.8 million for VA program
operations. (For a breakout of SPAWAR and VA obligations and
expenditures by release, see attachment II.)
VA deployed Release 1 of the long-term solution on March 31, 2010, as
scheduled, providing a limited set of claims examiners at the four
regional processing offices the ability to calculate tuition, housing
allowance, books, stipends, and fees for processing original awards of
education benefits. However, the release did not provide planned
functionality to process claims for amended awards or to convert and
transfer beneficiary data from systems that were part of the interim
solution to systems for the long-term solution. VA officials stated
that the processing of amended awards and the data conversion task
were more complex than originally anticipated and that the
functionality was therefore deferred for completion in Release 2.
The department subsequently deployed Release 2 on June 30, 2010, as
scheduled. This release extended the basic award capability to all
claims examiners at each of the four regional processing offices.
Department officials noted that the Agile process allowed them the
flexibility to reprioritize the functionality that would be included
in the release. As such, the release provided key automated
capabilities, including the ability to generate three different types
of letters to veterans, process amended awards, and process benefits
reflecting legislative changes, such as Fry Scholarships.[Footnote
26] However, the planned development of an
interface to the legacy systems was not fully completed. For example,
VA did not fully develop the interface that was intended to automate
the verification of student enrollment data.
In addition, despite having delayed the conversion of data from the
interim solution to the long-term solution until Release 2, this task
was not completed. As a result, VA created a sub-release, Release 2.1,
with the intent of completing data conversion and adding selected
functionality, such as the 2010 housing rate adjustments, by July 26,
2010; however, program officials stated that Release 2.1 was actually
deployed on August 23, 2010. Even with this release, program officials
stated that approximately 30,000 of 550,000 records were not
converted. The officials added that they intend to make a decision in
mid-September on when and how the remaining records will be converted.
Further, program officials stated that the department has not yet
decided how the delay of Release 2.1 will affect the interfaces that
were to be developed in Release 3, which is still planned to be
deployed by September 30, 2010. As such, program officials stated that
they would decide in September how much of the Release 3 functionality
could be completed by the scheduled date. In addition, they stated
that they have reduced the system's planned functionality: the self-
service capability will not be included as planned in Release 4 when
it is deployed on December 31, 2010. However, the department plans to
provide this self-service capability after Release 4 within the long-
term system solution or under a separate initiative. The department is
in the process of defining what the self-service capability will
include.
[End of section]
Objective 2: VA's Effectiveness in Managing Its New System:
VA Has Established a Team to Support System Development, but
Management Can Improve Other Key Agile Practices:
To provide effective management for an Agile project, such as the
development of the Chapter 33 long-term solution, a key component for
success is demonstrating effective use of the Agile practices: working
as one team, focusing on business priorities, delivering functionality
in short increments, and inspecting and adapting the project as
appropriate. While VA has taken an important step to effectively
manage its development of the system for processing Chapter 33
educational benefits by establishing a cross-functional team, it has
not yet fully ensured business priorities are a focus, demonstrated
that it is delivering quality functionality in short increments, or
provided mechanisms to enable inspection and adaptation of the
project. As a result, VA does not have the visibility it needs to
clearly communicate progress to stakeholders and may not be able to
generate feedback necessary for effectively establishing project
priorities and continuous process improvements.
VA Has Established a Cross-Functional Team:
As discussed earlier, Agile practices emphasize the importance of
organizations developing a cross-functional team that includes all key
stakeholders. Specific Agile roles, such as product owner, team
member, and project manager, should be included in the development
effort. In
addition, there needs to be involvement from executive level
management, senior management, and users. Such involvement helps to
minimize project risk by ensuring that key requirements are identified
and developed.[Footnote 27]
VA has established a team of executive level management that fulfills
the role of the product owner. For example, the team consists
primarily of executives and senior managers from VBA and the
department's OI&T, who are members of two decision-making bodies for
the initiative: the Joint Executive Board and Executive Steering
Committee. They meet weekly to discuss the vision and make decisions
on functionality, schedule, and cost issues. VA has also established
additional workgroups that provide daily leadership, oversight, and
operations management for the systems development effort and serve as
extensions of the product owner to identify and prioritize
requirements. (For detailed information about the responsibilities and
leadership of the decision-making bodies in the governance structure,
see attachment III.)
The department has also established multiple, cross-functional teams
to develop the system. These teams consist of VA subject matter
experts as well as contractors that are programmers, testers,
analysts, database engineers, architects, and designers. These teams
hold daily Scrum meetings to discuss work that has been planned and
accomplished, and any impediments to completing the work. At the
completion of each iteration, which in VA's case is every 2 weeks, a
review meeting occurs between the cross-functional teams and VA
stakeholders to review and demonstrate completed system functionality.
Following this meeting, planning sessions are held to discuss the work
to be accomplished in the next iteration based on the next highest-
prioritized requirements contained in user stories.
In addition, VA has identified project managers from both VA and
SPAWAR that focus on leadership of the initiative. These project
managers monitor and facilitate meetings and provide clarification to
contractors, subject matter experts, and other developers. They are
also responsible for addressing impediments discussed at the review
meetings.
With this involvement from key stakeholders, VA has established a team
structure that fulfills the key roles within an Agile team and has
better positioned itself to effectively manage the initiative.
Although VA Has a Vision for Its Business Priorities, Key Elements Are
Missing:
Under an Agile methodology, to ensure business priorities are a focus,
a project starts with a vision of the system that is communicated to
the team by the product owner. This vision should clearly state the
purpose and goals of the project; the goals should be measurable; and
constraints should be identified and prioritized to establish project
parameters related to scope, cost, and schedule. In addition, well-
defined and managed requirements are a cornerstone of effective system
development. According to recognized guidance, disciplined processes
for developing and managing requirements can help reduce the risks of
developing a system that does not meet user and operational needs.
[Footnote 28] Such processes include establishing policies and plans
for managing changes to requirements and maintaining bidirectional
requirements traceability.[Footnote 29] As such, the project vision
should be traceable to the requirements and functionality developed,
which are captured in user stories.
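One minimal way to represent the bidirectional traceability described above is a pair of link maps, traversable in either direction; the artifact identifiers below are invented for illustration and do not correspond to VA's actual requirements:

```python
from collections import defaultdict

# Each link records a derivation: legislation/policy -> requirement ->
# user story -> test case. (All IDs are hypothetical.)
links = [
    ("PL-110-252 Sec.3313", "REQ-001"),
    ("REQ-001", "STORY-014"),
    ("STORY-014", "TEST-077"),
]

forward = defaultdict(set)   # trace forward: what was derived from this artifact?
backward = defaultdict(set)  # trace backward: where did this artifact come from?
for src, dst in links:
    forward[src].add(dst)
    backward[dst].add(src)

def trace(matrix, start):
    """Return every artifact reachable from start in the given direction."""
    frontier, seen = {start}, set()
    while frontier:
        node = frontier.pop()
        seen.add(node)
        frontier |= matrix[node] - seen
    return seen - {start}

# Forward: the user story and test case that cover REQ-001.
print(trace(forward, "REQ-001"))
# Backward: TEST-077 traces to its story, requirement, and statutory source.
print(trace(backward, "TEST-077"))
```

With both maps maintained, a reviewer can verify in either direction that every test case traces to a requirement and every requirement traces to its legislative or policy source.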
VA has established a vision document that captures the project purpose
and goals. Specifically, the stated purpose of the department's long-
term solution is to develop a system that ensures timely and accurate
benefit payments to beneficiaries and achieves the following goals:
maximizes the user experience, provides a flexible architecture to
support benefit changes, provides an efficient workflow, and provides
a model and framework that supports code reuse across future VA
projects.
However, the department has not established metrics for the project's
goals, prioritized project constraints, or ensured that requirements
were fully traceable to legislation, policies, and business rules.
Specifically, the goals that VA has established do not have metrics
for determining the progress towards achieving the goals. For example,
for VA's goal to maximize the user experience, the department has not
established a quantifiable, numerical target or other measurable value
to facilitate future assessments of whether the goal was achieved. As
a result, the department does not have the means to compare the
projected performance and actual results of this goal.
Further, the department has not clearly identified and prioritized
constraints for the project that would impact how decisions affecting
the scope, cost, and schedule for the system should be made. Although
its vision document states that VA will identify constraints, it has
not yet documented them. Without having clearly identified and
prioritized constraints, stakeholders may not agree or understand what
factors should drive the decisions and adjustments made in system
development to achieve the project's goals.
VA also did not always ensure that requirements for Release 2 were
traceable. While the department has established a plan that identifies
the process that the team is to follow to transform requirements into
user stories and the tools it is to utilize to maintain traceability,
our review of selected user stories in Release 2 found that
traceability between legislation, policy, business rules, and test
cases was not always maintained and, therefore, could not be verified.
For example, requirements in the 20 user stories we reviewed in
Release 2 were not traceable to legislation, nor could we verify that
each requirement was traceable to the full set of test cases covering
it.
With regard to these deficiencies, VA officials stated that project
documentation is evolving and they intend to improve their processes
based on lessons learned. However, until the department fully
establishes goals that are measurable and identifies and prioritizes
constraints, it may not have the ability to clearly communicate
progress to stakeholders.
Further, officials acknowledged that the department's requirement tool
did not have the capability to fully establish software traceability
for Release 2, but that VA has since upgraded its tool.[Footnote 30]
The officials stated the department will be able to provide this level
of traceability to test cases in future releases. While program
officials acknowledged the importance of traceability and the need to
improve their process, they have not identified how the department
will effectively establish bidirectional traceability between system
requirements and legislation, policy, and business rules. Until the
department can effectively ensure that requirements are fully
traceable to legislation, policies, business rules, and test cases it
will continue to have a limited ability to reasonably assure that the
Chapter 33 requirements will be completely met.
VA Delivers Functionality in Short Iterations, but Needs to Ensure
Standards Are Defined and Met:
To ensure that the product is potentially shippable at the end of
every increment, work should adhere to an agreed-upon definition of
"done." If the definition is not agreed upon, the quality of work may
vary and teams may inappropriately consider work as completed, thus
unreliably reporting progress. Stakeholders should agree to a
definition of completed work that conforms to an organization's
standards, conventions, and guidelines. These standards often include
fully tested functionality that has no defects. Furthermore, we have
highlighted in our prior work that effective testing is an essential
component of any system development effort.[Footnote 31]
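A team-wide definition of "done" of the kind described above can be sketched as a shared checklist that every user story must satisfy; the criteria below are illustrative, not VA's actual standards:

```python
# Hypothetical shared definition of "done" applied uniformly across teams.
DEFINITION_OF_DONE = (
    "code_reviewed",
    "unit_tests_pass",
    "functional_tests_pass",
    "no_open_defects",
    "demoed_to_product_owner",
)

def is_done(story_status: dict) -> bool:
    """A story counts as complete only if every shared criterion is satisfied."""
    return all(story_status.get(criterion, False) for criterion in DEFINITION_OF_DONE)

# A story presented as "done" without completed testing fails the shared check.
story = {"code_reviewed": True, "unit_tests_pass": True,
         "functional_tests_pass": False}
print(is_done(story))  # False
```

Because the checklist is explicit and common to all teams, "done" means the same thing in every iteration review, and progress reports built on it are comparable across teams.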
While the department has defined some criteria for work that is
considered "done" at the release level, VA has not defined what it
means at the user story, iteration, or project level. We observed
multiple cases during Release 2 development in which user stories were
presented as "done," but had varying amounts of work completed. For
example, at three iteration review meetings, we observed at least one
development team that presented user stories as "done" without having
completed all testing.
Program officials stated that each development team has its own
definition of "done" and agreed that they need to provide a standard
definition across all teams. If VA does not mutually agree upon and
document this definition at each level and ensure it conforms to the
department's standards, conventions, and guidelines, confusion about
what constitutes completed work could lead to inconsistent quality and
unreliable performance and progress reporting. Further, in the absence
of an agreed-upon definition, VA is not able to clearly communicate
how much work remains for completing the system.
With regard to testing, VA has established an incremental testing
approach that calls for automated unit and functional testing to be
conducted on work completed during iterations.[Footnote 32] In
addition, it has also established user acceptance testing that is
performed before a release is delivered. Nonetheless, we found that
the unit and functional testing performed during Release 2 was
inadequate. Specifically, in reviewing the testing conducted for 20
user stories, we identified the testing to be inadequate for 10 of
them. For these 10 user stories, we identified a total of 19
deficiencies covering a range of issues.
For example, 7 user stories were not fully tested for expected values
or boundary conditions specified in their associated requirements
documents. These testing deficiencies may hinder VA's ability to
identify critical defects. VA and contractor system development and
testing teams subsequently identified a number of defects during
Release 2. Specifically, program officials stated that 218 of the 423
defects that were to be corrected in Release 2 were classified as high
priority.[Footnote 33] For example, user acceptance testing found that
an award letter included the incorrect date for a student's enrollment
period. Program officials stated that all of the high-priority defects
were corrected or closed as invalid and that they are working toward
correcting the remaining defects in future iterations.
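The kind of boundary-condition testing the review found lacking can be illustrated with a small example; the benefit tiers and percentages here are invented for illustration and do not reflect the statute's actual rates:

```python
# Hypothetical tiered benefit rule: tests must exercise the edges of each
# tier, not just typical mid-range values, to catch off-by-one defects.
def tuition_percentage(months_of_service: int) -> int:
    if months_of_service >= 36:
        return 100
    if months_of_service >= 30:
        return 90
    return 0

# Boundary-value tests: one case at each edge and one just outside it.
assert tuition_percentage(36) == 100  # lower edge of the top tier
assert tuition_percentage(35) == 90   # just below that boundary
assert tuition_percentage(30) == 90   # lower edge of the middle tier
assert tuition_percentage(29) == 0    # just below it
```

A suite that checks only typical values (say, 40 and 20 months) would pass even if a developer mistakenly wrote `> 36` instead of `>= 36`; the boundary cases above would catch that defect.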
Program officials also stated that they placed higher priority on user
acceptance testing at the end of a release and relied on users to
identify defects that were not detected during unit and functional
testing. However, as we have noted, relying on subject matter experts
to perform user acceptance testing is not a realistic solution because
it is difficult for them to recall every item needed to verify
functionality.[Footnote 34] Further, while program officials stated
that many of the defects were closed before Release 2 was fully
deployed, due to the inadequate testing the potential exists for a
significant number of additional defects to be found after deployment,
thus requiring system rework, which can increase costs and affect the
schedule.[Footnote 35] Until the department improves testing quality,
it risks deploying future releases that contain defects which may
require rework and extend the completion date for the project.
Ultimately, this could increase the risk of delayed functionality that
would impede the ability for claims examiners to process claims
efficiently.
VA Has Not Fully Implemented Tools to Inspect and Adapt the Project:
In order for projects to be effectively inspected and adapted,
management must have tools to provide visibility to communicate
progress to stakeholders. Under the Scrum framework, project
visibility is achieved through the use of specific tools. For example,
progress and the amount of work remaining across the release are shown
by a burn-down chart. Specifically, a burn-down chart can
depict how factors such as the rate at which work is completed
(velocity) and changes in overall product scope affect the project
over time. This information can be forecasted to estimate how long a
release will take to complete. Further, when compared to the project
rate of work completion, the chart can provide visibility into the
actual project status and can be used for continuous process
improvement such as increasing the accuracy of estimating story points
for future user stories.
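The burn-down mechanics described above can be sketched with hypothetical figures; tracking total scope and remaining work per iteration exposes both velocity and scope change, which a single percent-complete figure cannot:

```python
import math

# Hypothetical release data, recorded at the end of each iteration.
scope = [300, 300, 330, 330]      # total story points; scope grew at iteration 2
remaining = [300, 270, 280, 250]  # story points left at the end of each iteration

completed = [s - r for s, r in zip(scope, remaining)]         # [0, 30, 50, 80]
velocity = [b - a for a, b in zip(completed, completed[1:])]  # points finished per iteration

print(velocity)  # [30, 20, 30]

# Forecast: iterations still needed at the latest observed velocity.
print(math.ceil(remaining[-1] / velocity[-1]))  # 9
```

Note that between iterations 1 and 2 the remaining work rose from 270 to 280 even though the team finished 20 points, because 30 points of scope were added; a percent-complete metric alone would obscure that distinction.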
VA's burn-down chart did not include elements that aided in
communicating progress. While the department used a burn-down chart in
Release 2 that showed the percentage of work completed to reflect
project status at the end of each iteration, this chart did not depict
the velocity of the work completed and the changes to scope over time.
[Footnote 36] Program officials stated that their current reporting
did not show the changes in project scope because their focus was on
metrics that are forward looking rather than showing past statistics
for historical comparison. However, such a chart is essential to team
members' understanding of progress made and provides a continuous
feedback loop. In addition, it can also provide management visibility
into the project and changes over time.
Since the department's burn-down chart did not report velocity and
scope changes over time, management and stakeholders cannot clearly
discern the actual amount of work completed relative to the amount of
work that was expected to be completed. Without this level of
visibility in its reporting, management and the development teams may
not have all the information they need to fully understand project
status and generate the discussion and feedback necessary for
continuous process improvement.
[End of section]
Conclusions:
VA deployed the first two releases of its long-term system solution by
its planned dates, thereby providing improved claims-processing
functionality, such as the ability to calculate new original awards in
Release 1. Additionally, Release 2 extended automation to all regional
processing offices, including the capability to process amended
awards. Critical long-term system solution features were not
completed because VA reprioritized its work to accommodate for
legislative changes and because the department found some major
functions more complex than anticipated. As such, interfaces to legacy
systems and the conversion of data from systems in the interim
solution were not completed in Release 2. VA added an additional sub-
release to address this incomplete functionality, but it has not yet
concluded how these delays will affect the functionality that will be
developed in Release 3. Also, for Release 4, VA has removed a
significant piece of planned functionality: the veteran self-service
capability.
While VA intends to provide this capability after Release 4 within the
long-term system solution or under a separate initiative, it is
unclear what functionality will be delivered in the two remaining
releases when it deploys the system in December 2010.
In using an Agile approach for this initiative, VA is applying lessons
learned and has taken important first steps to effectively manage the
IT project by establishing a cross-functional team that involves
senior management, governance boards, and key stakeholders. However,
the department has not yet ensured that several key Agile practices
were performed. Measurable goals were not developed and the project
progressed without bidirectional traceability in its requirements.
Additionally, in developing the system, VA did not establish a common
standard and consistent definition for work to be considered "done" or
develop oversight tools to clearly communicate velocity and the
changes to project scope over time. Testing deficiencies further
hinder VA's assurances that all critical system defects will be
identified. Until VA improves these areas, management does not have
the visibility it needs to clearly communicate progress to
stakeholders and estimate when future system capabilities will be
delivered. Additionally, reduced visibility and unresolved issues in
its development processes may result in the department continuing to
remove functionality that was expected in future releases, thus
delivering a system that does not fully and effectively support the
implementation of education benefits as identified in the Post-9/11 GI
Bill.
[End of section]
Recommendations for Executive Action:
To help guide the development and implementation of the Chapter 33
long-term solution, we recommend that the Secretary of Veterans
Affairs direct the Under Secretary for Benefits to take the following
five actions:
* establish performance measures for goals and identify constraints to
provide better clarity in the vision and expectations of the project;
* establish bidirectional traceability between requirements and
legislation, policies, and business rules to provide assurance that
the system will be developed as expected;
* define the conditions that must be present to consider work "done"
in adherence with agency policy and guidance;
* improve the adequacy of the unit and functional testing processes to
reduce the amount of system rework; and
* implement an oversight tool to clearly communicate velocity and the
changes to project scope over time.
[End of section]
Agency Comments and Our Evaluation:
We received oral comments on a draft of this briefing from VA
officials, including the Deputy Assistant Secretary for Congressional
and Legislative Affairs and the Assistant Secretary for Information
and Technology. In the comments, the Deputy Assistant Secretary stated
that the department was not in a position to concur or not concur with
our recommendations but planned to provide formal comments on our
final report. The officials provided additional clarification on why
the department experienced delays in data conversion. Specifically,
they noted that, consistent with Agile practices, the department
reprioritized work and adapted the system to add selected
functionality, such as the 2010 housing rate adjustments. They added
that the Joint Executive Board had made this decision to ensure that
claims examiners would have the most recent rate to process benefits
for the fall 2010 enrollment season. Additionally, the department
recognized lessons learned with the Agile approach, and it intends to
incorporate them in future development work. The officials provided
other technical comments, which we have incorporated as appropriate.
In further comments, the Assistant Secretary for Information and
Technology emphasized that using Agile system development for this
initiative allowed the department to deliver significant system
functionality incrementally, far exceeding what its past IT
initiatives had achieved. Specifically, he noted that the project had
delivered
working software close to schedule and had been more successful than
past system development efforts.
[End of section]
Attachment I: Description of the Scrum Framework Being Used by VA:
The Scrum framework is an Agile method that contains sets of practices
and predefined roles. The following describes the key terminology used
within the framework being utilized by VA for the Chapter 33 long-term
solution system development:
Scrum teams. These teams are cross-functional groups of about seven
individuals, including developers, subject matter experts, and
managers, who perform analysis, design, implementation, and testing of
specific pieces of functionality. The product owner acts as an
interface between stakeholders and Scrum teams and is responsible for
translating requirements (i.e., user stories) into a prioritized work
list, called a product backlog, and for maintaining that backlog.
Sprint. Each team works in iterations that typically last 2 to 4
weeks; these blocks of time are known as sprints. During a sprint,
each Scrum team creates a potentially shippable product (for example,
working and tested software). These products are developed based on
the user stories in the product backlog that are prioritized by the
product owner and team. Each user story is assigned a level of effort,
called story points. Story points are used as a relative unit of
measure to communicate complexity and progress between the business
and development sides of the project. Each sprint builds on the
previous sprint to generate a working system. After a predetermined
number of sprints, a release of the system goes into production.
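The backlog-driven sprint planning described here can be sketched minimally; the story IDs, priorities, and point values below are invented for illustration:

```python
import heapq

# Product backlog as (priority, story id, story points); a lower priority
# number means higher business value. All entries are hypothetical.
backlog = [
    (1, "STORY-001", 5),  # e.g., calculate tuition for an original award
    (3, "STORY-007", 8),  # e.g., view an award letter online
    (2, "STORY-004", 3),  # e.g., amend an existing award
]
heapq.heapify(backlog)  # min-heap: highest-priority story is always first

# The team pulls the highest-priority stories into the sprint until its
# committed capacity (in story points) is exhausted.
capacity, sprint = 8, []
while backlog and capacity >= backlog[0][2]:
    priority, story, points = heapq.heappop(backlog)
    sprint.append(story)
    capacity -= points

print(sprint)  # ['STORY-001', 'STORY-004']
```

Because story points are relative effort rather than hours, the team's capacity figure comes from its observed velocity in prior sprints, keeping the commitment grounded in demonstrated progress.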
Sprint planning meeting. Held prior to each sprint, this meeting is
where user stories are communicated to the team and the team then
commits to an amount of work it will complete for the next sprint.
After the product owner agrees, user stories are then finalized for
that sprint.
Daily Scrum meeting. During each sprint, Scrum teams meet every day
and hold a daily Scrum meeting. This short, focused meeting ensures
that team members understand the work that has been completed since
the last stand-up meeting, what work is planned for the current day,
and any problems that would prevent the
team from achieving that work. Each team has a ScrumMaster, who is
responsible for facilitating the meetings by maintaining the process
and promoting resolution of problems identified by the team.
Sprint review. After each sprint, teams demonstrate completed work and
discuss work that was not finished with stakeholders. They also
identify any problems that were encountered in completing the work.
Feedback and priorities are solicited from stakeholders so that they
can be incorporated into future sprints.
[End of section]
Attachment II: VA and SPAWAR Costs for Chapter 33 System Development:
Table: VA and SPAWAR Chapter 33 Costs by Release as of July 31, 2010:
Type of Cost[A]: SPAWAR expenditures;
Release 1[B]: $39.8 million;
Release 2: $14.4 million;
Release 2.1: $3.3 million;
Release 3: $2.3 million;
Release 4: $0.0;
Post-Release 4: [Empty];
Total: $59.8 million.
Type of Cost[A]: VA program obligations;
Release 1[B]: $20.7 million;
Release 2: $3.8 million;
Release 2.1: $0.0;
Release 3: $0.3 million;
Release 4: $0.0;
Post-Release 4: [Empty];
Total: $24.8 million.
Type of Cost[A]: Total;
Release 1[B]: $60.5 million;
Release 2: $18.2 million;
Release 2.1: $3.3 million;
Release 3: $2.6 million;
Release 4: $0.0;
Post-Release 4: [Empty];
Total: $84.6 million.
Type of Cost[A]: Funds obligated and transferred to SPAWAR but not yet
expended;
Release 3/Release 4: $45.5 million [C].
Type of Cost[A]: Planned and obligated costs to complete Release 3 (VA
and SPAWAR program costs);
Release 3: $24.0 million.
Type of Cost[A]: Planned but not obligated FY2011 cost to complete
Release 4 (VA and SPAWAR program costs);
Release 4: $1.3 million.
Type of Cost[A]: Planned but not obligated FY2011 cost for post-
Release 4 systems development activities (VA and SPAWAR program costs);
Post-Release 4: $51.7 million[D].
Type of Cost[A]: Total funds expended, obligated, and planned
obligations for Chapter 33 interim and long-term solution development;
Total: $207.1 million.
Source: VA.
[A] These costs represent actual expenditures, obligated funds, and
planned obligated funds. VA could not provide estimated project costs
to compare to these costs.
[B] Release 1 costs include both interim and long-term solution costs.
VA did not provide the cost accounting to account for these separately.
[C] As of September 4, 2010, VA could not provide a breakout between
Release 3 and 4.
[D] While VA has estimated this funding, it could not describe what
these costs will represent.
[End of table]
[End of section]
Attachment III: VA's Governance and Oversight for the Chapter 33
Initiative:
VA established a governance structure for the Chapter 33 initiative in
October 2008. The table below shows the decision-making bodies and
their responsibilities for the initiative.
Title: Joint Executive Board;
Description: Co-chaired by the Under Secretary for Benefits and the
Assistant Secretary for Information and Technology, this senior
governing body provides executive-level oversight and strategic
guidance for implementation of the initiative. It is responsible for
ensuring that communications, strategies, planning, and deliverables
enable the initiative to meet its mission, goals, and objectives.
Title: Executive Steering Committee;
Description: Co-chaired by the Director of Education Service and the
Program Manager, the Steering Committee advises the Joint Executive
Board on requirements, policies, and standards. It is responsible for
the oversight of program planning and execution to ensure that the
strategic vision is incorporated into the business operations.
Title: Working Group;
Description: Co-chaired by the Leader of the Veterans Benefits
Administration (VBA) Education Service Program Executive Office and
the Dependency Lead, Office of Information and Technology, Chapter 33
Program Management Office, the Working Group provides oversight and
governance to workgroups leading programmatic and technical interests
of the initiative. It defines and prioritizes business requirements,
identifies and escalates issues and risks, and makes recommendations
to the Executive Steering Committee on which requests to approve and
resource.
Title: Workgroups;
Description: Eight workgroups, led by Education Service and Office of
Information and Technology staff, provide daily operations management
and ensure that requirements areas are identified and defined for each
of the following areas: Benefits Delivery Network/Financial Accounting
System, Business Requirements, Certification and
Accreditation/Security, Infrastructure, Interfaces, Strategic
Planning, Training, and the Security Review Board.
Source: VA.
[End of table]
[End of section]
[End of briefing slides]
Appendix II: Comments from the Department of Veterans Affairs:
The Secretary Of Veterans Affairs:
Washington:
November 12, 2010:
The Honorable Gene Dodaro:
Acting Comptroller General of the United States:
Washington, DC 20548:
Dear Mr. Dodaro:
The Department of Veterans Affairs (VA) has reviewed the Government
Accountability Office's (GAO) draft report, "Information Technology:
Veterans Affairs Can Improve its Development Process for its New
Education Benefits System" (GAO-11-115).
Since this GI Bill's implementation in 2009, over 360,000 Veterans and
family members have enrolled in college under its enhanced educational
benefits, which cover more educational expenses, provide a living
allowance and money for books, and allow unused educational benefits to
be transferred to spouses or children of those with active duty service
on, or after, September 11, 2001. When you include all other college
education programs, that number exceeds 600,000. This new GI Bill is
important, not only for the numbers who are accepted into schools, but
for the numbers who will be graduating from them in the years ahead.
That's the measure of success.
Implementation of this program continues to be a high priority to the
Department and we have made great progress in fulfilling its
objectives. The enclosure contains comments on GAO's draft report and
responds to your recommendations.
Sincerely,
Signed by:
Eric K. Shinseki:
Enclosure:
[End of letter]
Enclosure:
Department of Veterans Affairs (VA) Comments to Government
Accountability Office (GAO) Draft Report: Information Technology:
Veterans Affairs Can Improve its Development Process for its New
Education Benefits System (GAO-11-115).
GAO Recommendation: To help guide the development and implementation
of the Chapter 33 long-term solution, we recommend that the Secretary
of Veterans Affairs direct the Under Secretary for Benefits to take
the following five actions:
Recommendation 1: Establish performance measures for goals and
identify constraints to provide better clarity in the vision and
expectations of the project.
VA Response: Concur. The Office of Information and Technology (OIT)
will develop performance measures consistent with automating the Post-
9/11 GI Bill in order to evaluate SPAWAR and hold them accountable for
achieving these goals. Target Completion Date: March 1, 2011.
Recommendation 2: Establish bidirectional traceability between
requirements and legislation, policies, and business rules to provide
assurance that the system will be developed as expected.
VA Response: Concur. VA believes that GAO's statement incorrectly
implies that requirements traceability was not occurring when it in
fact was being carefully documented, missing only the legislation
element, which will require the assistance of legal experts to be
correctly accomplished. VBA and OIT will work together to trace the
rules on the long-term solution back to the legislation. Target
Completion Date: June 30, 2011.
Recommendation 3: Define the conditions that must be present to
consider work "done" in adherence with agency policy and guidance.
VA Response: Concur. The fiscal year 2011 Automate GI Bill Operating
Plan outlines the conditions that will allow the project to be
declared a success. At the Chapter 33 (CH33) working group level (VBA
and OIT), VA will clarify the definition of "done" and ensure that it
is being applied consistently. Target Completion Date: December 1,
2010.
Recommendation 4: Implement an oversight tool to clearly communicate
velocity and the changes to project scope over time.
VA Response: Non-Concur. Development metrics and models were
established and implemented to forecast and measure development
velocity. Based on lessons learned, these models were expanded to
forecast and measure velocity at each scrum team.
Recommendation 5: Improve the adequacy of the unit and functional
testing processes to reduce the amount of system rework.
VA Response: Non-Concur. VA's testing approach is compatible with
Agile development, where unit, functional, and end-user testing are
collaboratively accomplished and all significant errors are identified
and resolved prior to deployment.
[End of section]
Appendix III: Comments from the Veterans Affairs' Assistant Secretary
for Information and Technology:
Department Of Veterans Affairs:
Assistant Secretary For Information And Technology:
Washington, DC 20420:
November 12, 2010:
Ms. Valerie Melvin:
Director, Information Management And Human Capital Issues:
U.S. Government Accountability Office:
441 G Street, NW:
Washington, DC 20548:
Dear Ms. Melvin:
The Department of Veterans Affairs (VA) has reviewed the Government
Accountability Office's (GAO) draft report, "Information Technology:
Veterans Affairs Can Improve its Development Process for its New
Education Benefits System" (GAO-11-115) and has concurred on three
recommendations and non-concurred on two recommendations. As we
expressed during our phone call on September 11, 2010, VA has some
concerns regarding GAO's draft report.
Chairman Filner's request to GAO was to "determine the status of VA's
development and implementation" and to "evaluate the agency's
effectiveness in managing its information technology for this
project". VA believes GAO fell short of meeting this charge by
omitting key facts and presenting an unnecessarily negative view of
our status and effectiveness to Congress.
Despite unanimous predictions to the contrary, VA successfully
converted all processing of new Post-9/11 GI Bill claims to the Long
Term Solution (LTS) prior to the commencement of the Fall 2010
enrollment process. Since installation, processing with the new system
has been nearly flawless, with no significant "bugs" encountered. The
Veterans Benefits Administration claims processors like the new system
and find it easy and efficient to use. By dramatically changing its
development processes, adopting the Agile methodology for this
project, VA also dramatically changed its system development results.
Prior to GAO's initial presentation to Congress, VA officials provided
GAO with this information and we believe that GAO should have included
all of these facts in its report to accurately present "the status of
VA's development and implementation" of the new GI Bill LTS.
Additionally, because Agile methodologies are not broadly used in the
federal sector, this may have been the first exposure the GAO team
performing this audit had to this methodology. Limited exposure to
Agile methodology possibly caused GAO to present incorrect assumptions
as facts. A specific example is the statement that "...the quality of
unit and functional testing performed during Release 2 was
inadequate..." VA utilized a testing approach compatible with Agile
development, where unit, functional, and end-user testing were
collaboratively accomplished, which ensured that all significant
errors were identified and resolved prior to deployment. While
different from traditional "Waterfall" development techniques used on
large systems development programs throughout government, the results
speak for themselves. All three releases deployed during the GAO audit
were installed with no significant (defined as severity 1 or 2) errors.
Finally, we believe that GAO missed a substantial opportunity to
positively influence real change in the results of information
technology (IT) systems development across the federal government. As
GAO noted as recently as May 2010, use of waterfall development
methodologies within VA had caused continual, large-scale systems
development failures. An objective evaluation of VA's effectiveness in
managing its IT for this project would focus on the fact that VA
recognized its failings and adopted Agile methodologies, with the
result being a stunning and unpredicted success. If VA, with one of
the worst track records in systems development (as amply documented
over many years by VA's Inspector General and GAO), has been able to
achieve such positive results, what are the implications for failing
IT programs across government?
The enclosure provides specific comments to the draft report and
discusses each of the recommendations. VA appreciates the opportunity
to comment on your draft report.
Sincerely,
Signed by:
Roger W. Baker:
Enclosure:
[End of section]
Appendix IV: GAO Contact and Staff Acknowledgments:
GAO Contact:
Valerie C. Melvin, (202) 512-6304 or melvinv@gao.gov:
Staff Acknowledgments:
In addition to the contact named above, key contributions to this
report were made by Christie M. Motley, Assistant Director; Rebecca E.
Eyler; David A. Hong; Ashley D. Houston; John C. Martin; and Charles
E. Youman.
[End of section]
Footnotes:
[1] Pub. L. No. 110-252, Secs. 5001-5003, June 30, 2008.
[2] VA's legacy systems, among others, include a financial payment
system, an education information system, and a veteran demographic and
service data system. These legacy systems contain essential
information required for calculating the benefit, such as prior
benefit payments, academic institution rates, and veterans' service
dates. VA planned to complete interfaces to all legacy systems except
for its financial payment system, which is planned for the third
release.
[3] Agile software development is not a set of tools or a single
methodology, but a philosophy based on selected values: for example,
that the highest priority is to satisfy customers through early and
continuous delivery of valuable software; that working software should
be delivered frequently, from a couple of weeks to a couple of months;
and that working software is the primary measure of progress. For more
information on Agile development, see [hyperlink,
http://www.agilealliance.org].
[4] This number represents actual expenditures, obligated funds, and
planned obligated funds through fiscal year 2011.
[5] One of the key Agile principles is that the delivery of completed
software be defined, commonly referred to as the definition of "done."
This is critical to the development process to help ensure that, among
other things, testing has been adequately performed and the required
documentation has been developed.
[6] GAO, Business Modernization: Improvements Needed in Management of
NASA's Integrated Financial Management Program, [hyperlink,
http://www.gao.gov/products/GAO-03-507] (Washington, D.C.: April 30,
2003).
[7] Pub. L. No. 110-252, Secs. 5001-5003, June 30, 2008.
[8] VA's legacy systems, among others, include a financial payment
system, an education information system, and a veteran demographic and
service data system. These legacy systems contain essential
information required for calculating the benefit, such as prior
benefit payments, academic institution rates, and veterans' service
dates.
[9] Agile software development is not a set of tools or a single
methodology, but a philosophy based on selected values: for example,
that the highest priority is to satisfy customers through early and
continuous delivery of valuable software; that working software should
be delivered frequently, from a couple of weeks to a couple of months;
and that working software is the primary measure of progress. For more
information on Agile development, see [hyperlink,
http://www.agilealliance.org].
[10] Pub. L. No. 110-252, Secs. 5001-5003 and Pub. L. No. 111-32, Sec.
1002.
[11] VA Project Management Accountability System (PMAS) Guide 1.0,
March 2010.
[12] VA planned to complete interfaces to all legacy systems except
for its financial payment system, which is planned for the third
release.
[13] This number represents actual expenditures, obligated funds, and
planned obligated funds through fiscal year 2011.
[14] One of the key Agile principles is that the delivery of completed
software be defined, commonly referred to as the definition of "done."
This is critical to the development process to help ensure that, among
other things, testing has been adequately performed and the required
documentation has been developed.
[15] VA's two other line administrations are the Veterans Health
Administration and the National Cemetery Administration.
[16] The regional processing offices are located in Atlanta, Georgia;
Buffalo, New York; Muskogee, Oklahoma; and St. Louis, Missouri.
[17] Among others, contractors such as Agilex Technologies, Inc., Booz
Allen Hamilton, GeoLogics, and Lockheed Martin, support the Chapter 33
system development.
[18] VA Project Management Accountability System (PMAS) Guide 1.0,
March 2010.
[19] This estimate does not include maintenance costs past the end of
fiscal year 2011 because program officials stated this will be
budgeted under a different VBA initiative.
[20] Carnegie Mellon Software Engineering Institute, Mary Ann Lapham,
et al., Considerations for Using Agile in DOD Acquisition (Pittsburgh,
Penn., April 2010).
[21] For a brief history on iterative and incremental development and
the origins of Agile methods, see Carnegie Mellon Software Engineering
Institute, Hillel Glazer, et al., CMMI® or Agile: Why Not Embrace
Both! (Pittsburgh, Penn., November 2008).
[22] The Agile Manifesto was written and signed by a group of
methodologists, who called themselves the Agile Alliance. Basic
principles are set forth in this document and include, for example,
that business people and developers must work together daily and
throughout the project. For more information on the creation of the
Agile Manifesto, see [hyperlink,
http://agilemanifesto.org/history.html].
[23] Mike Cohn, Succeeding with Agile: Software Development Using
Scrum (Boston, Mass.: Pearson Education, Inc., 2010); Agile Estimating
and Planning (Upper Saddle River, N.J.: Pearson Education, Inc.,
2006); User Stories Applied (Boston, Mass.: Pearson Education, Inc.,
2004); and Ken Schwaber, Agile Project Management with Scrum (Redmond,
Wash.: Microsoft Press, 2004).
[24] Institute of Electrical and Electronics Engineers (IEEE), Systems
and software engineering--Software life cycle processes, IEEE Std.
12207-2008 (Piscataway, N.J., January 2008) and Carnegie Mellon
Software Engineering Institute, CMMI for Development, Version 1.2,
CMU/SEI-2006-TR-008 (Pittsburgh, Penn., August 2006).
[25] One of the widely used methodologies of implementing Agile values
is Scrum. For more information on the Scrum approach see [hyperlink,
http://www.scrumalliance.org/].
[26] Pub. L. No. 111-32, Sec. 1002, June 24, 2009, amended the Post-
9/11 Educational Assistance Act of 2008 by adding the Marine Gunnery
Sergeant John David Fry Scholarship (see 38 U.S.C. § 3311), which
includes in the act benefits for the children of service members who
died in the line of duty on or after Sept. 11, 2001. Eligible children
attending school may receive up to the highest public, in-state
undergraduate tuition and fees, plus a monthly living stipend and book
allowance under the program.
[27] Institute of Electrical and Electronics Engineers (IEEE), Systems
and software engineering--Software life cycle processes, IEEE Std.
12207-2008 (Piscataway, N.J., January 2008) and Carnegie Mellon
Software Engineering Institute, CMMI for Development, Version 1.2,
CMU/SEI-2006-TR-008 (Pittsburgh, Penn., August 2006).
[28] Carnegie Mellon Software Engineering Institute, Capability
Maturity Model® Integration for Development, Version 1.2 (Pittsburgh,
Penn., August 2006), and Software Acquisition Capability Maturity
Model® (SA-CMM®) version 1.03, CMU/SEI-2002-TR-010 (Penn., March
2002); and the Institute of Electrical and Electronic Engineers
(IEEE), 1362-1998, IEEE Guide for Information Technology--System
Definition--Concept of Operations Document (New York, N.Y., 1998).
[29] Maintaining bidirectional requirement traceability means that
system-level requirements can be traced both backward to higher-level
business or operational requirements, and forward to system design
specifications and test plans.
[30] Department officials noted that prior to this upgrade, they were
able to establish traceability to test cases manually.
[31] GAO, Year 2000 Computing Crisis: A Testing Guide, [hyperlink,
http://www.gao.gov/products/GAO/AIMD-10.1.21] (Washington, D.C.:
November 1998); Information Technology: Customs Automated Commercial
Environment Progressing, but Need for Management Improvements
Continues, [hyperlink, http://www.gao.gov/products/GAO-05-267]
(Washington, D.C.: Mar. 14, 2005); and Homeland Security: Visitor and
Immigrant Status Program Operating, but Management Improvements Are
Still Needed, [hyperlink, http://www.gao.gov/products/GAO-06-318T]
(Washington, D.C.: Jan. 25, 2006).
[32] For further information on unit and functional testing, see GAO,
Indian Trust Funds: Challenges Facing Interior's Implementation of New
Trust Asset and Accounting Management System, [hyperlink,
http://www.gao.gov/products/GAO/T-AIMD-99-238] (Washington, D.C.: Jul.
14, 1999) and GAO, Financial Management Systems: Additional Efforts
Needed to Address Key Causes of Modernization Failures, [hyperlink,
http://www.gao.gov/products/GAO-06-184] (Washington, D.C.: March 27,
2006).
[33] Defect numbers were reported as of June 29, 2010. Program
officials described high-priority defects as defects that could
"break" the system and must be fixed.
[34] GAO, Business Modernization: Improvements Needed in Management of
NASA's Integrated Financial Management Program, [hyperlink,
http://www.gao.gov/products/GAO-03-507] (Washington, D.C.: April 2003).
[35] For more information on how defects result in unplanned rework
and increased costs, see [hyperlink,
http://www.gao.gov/products/GAO-06-184].
[36] Program officials stated that they had previously used a burn-
down chart that showed velocity for all teams in Release 1. However,
in Release 2, they decided that they would provide burn-down charts at
the team level, but not at the overall project level.
[End of section]
GAO's Mission:
The Government Accountability Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the performance
and accountability of the federal government for the American people.
GAO examines the use of public funds; evaluates federal programs and
policies; and provides analyses, recommendations, and other assistance
to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its core
values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each
weekday, GAO posts newly released reports, testimony, and
correspondence on its Web site. To have GAO e-mail you a list of newly
posted products every afternoon, go to [hyperlink, http://www.gao.gov]
and select "E-mail Updates."
Order by Phone:
The price of each GAO publication reflects GAO's actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO's Web site,
[hyperlink, http://www.gao.gov/ordering.htm].
Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional
information.
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Congressional Relations:
Ralph Dawn, Managing Director, dawnr@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, D.C. 20548:
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548: