This is the accessible text file for GAO report number GAO-04-83
entitled 'Human Capital: Implementing Pay for Performance at Selected
Personnel Demonstration Projects' which was released on February 25,
2004.
This text file was formatted by the U.S. General Accounting Office
(GAO) to be accessible to users with visual impairments, as part of a
longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to Congressional Requesters:
January 2004:
Human Capital:
Implementing Pay for Performance at Selected Personnel Demonstration
Projects:
GAO-04-83:
GAO Highlights:
Highlights of GAO-04-83, a report to congressional requesters:
Why GAO Did This Study:
There is a growing understanding that the federal government needs to
fundamentally rethink its current approach to pay and to better link
pay to individual and organizational performance. Federal agencies
have been experimenting with pay for performance through the Office of
Personnel Management's (OPM) personnel demonstration projects.
GAO identified the approaches that selected personnel demonstration
projects have taken to implement their pay for performance systems.
These projects include: the Navy Demonstration Project at China Lake
(China Lake), the National Institute of Standards and Technology
(NIST), the Department of Commerce (DOC), the Naval Research
Laboratory (NRL), the Naval Sea Systems Command Warfare Centers
(NAVSEA) at Dahlgren and Newport, and the Civilian Acquisition
Workforce Personnel Demonstration Project (AcqDemo). We selected these
demonstration projects based on factors such as status of the project
and makeup of employee groups covered.
We provided drafts of this report to officials in the Department of
Defense (DOD) and DOC for their review and comment. DOD provided
written comments concurring with our report. DOC provided minor
technical clarifications and updated information. We provided a draft
of the report to the Director of OPM for her information.
What GAO Found:
The demonstration projects took a variety of approaches to designing
and implementing their pay for performance systems to meet the unique
needs of their cultures and organizational structures, as shown in the
table below.
Demonstration Project Approaches to Implementing Pay for Performance:
Using competencies to evaluate employee performance:
High-performing organizations use validated core competencies as a key
part of evaluating individual contributions to organizational results.
To this end, AcqDemo and NRL use core competencies for all positions.
Other demonstration projects, such as NIST, DOC, and China Lake, use
competencies based on the individual employee's position.
Translating employee performance ratings into pay increases and
awards:
Some projects, such as China Lake and NAVSEA's Newport division,
established predetermined pay increases, awards, or both depending on
a given performance rating, while others, such as DOC and NIST,
delegated the flexibility to individual pay pools to determine how
ratings would translate into performance pay increases, awards, or
both. The demonstration projects made some distinctions among
employees' performance.
Considering current salary in making performance-based pay decisions:
Several of the demonstration projects, such as AcqDemo and NRL,
consider an employee's current salary when making performance pay
increases and award decisions to make a better match between an
employee's compensation and contribution to the organization.
Managing costs of the pay for performance system:
According to officials, salaries, training, and automation and data
systems were the major cost drivers of implementing their pay for
performance systems. The demonstration projects used a number of
approaches to manage the costs.
Providing information to employees about the results of performance
appraisal and pay decisions:
To ensure fairness and safeguard against abuse, performance-based pay
programs should have adequate safeguards, including reasonable
transparency in connection with the results of the performance
management process. To this end, several of the demonstration projects
publish information, such as the average performance rating,
performance pay increase, and award.
Source: GAO.
[End of table]
GAO strongly supports the need to expand pay for performance in the
federal government. How it is done, when it is done, and the basis on
which it is done can make all the difference in whether such efforts
are successful. High-performing organizations continuously review and
revise their performance management systems. These demonstration
projects show an understanding that how to better link pay to
performance is very much a work in progress at the federal level.
Additional work is needed to ensure that performance management systems
become tools that help these organizations manage on a day-to-day
basis. In particular, there are opportunities to use organizationwide
competencies that reinforce behaviors and actions supporting the
organization's mission; to translate employee performance ratings so
that managers can make meaningful distinctions between top and poor
performers using objective and fact-based information; and to provide
information to employees about the results of performance appraisal and
pay decisions so that reasonable transparency and appropriate
accountability mechanisms are in place.
www.gao.gov/cgi-bin/getrpt?GAO-04-83.
To view the full product, including the scope and methodology, click
on the link above. For more information, contact J. Christopher Mihm
at (202) 512-6806 or mihmj@gao.gov.
[End of section]
Contents:
Letter:
Results in Brief:
Background:
Selected Demonstration Projects Took Various Approaches to Implement
Their Pay for Performance Systems:
Concluding Observations:
Agency Comments:
Appendixes:
Appendix I: Objective, Scope, and Methodology:
Appendix II: Demonstration Project Profiles:
Navy Demonstration Project at China Lake (China Lake):
National Institute of Standards and Technology (NIST):
Department of Commerce (DOC):
Naval Research Laboratory (NRL):
Naval Sea Systems Command Warfare Centers (NAVSEA):
Civilian Acquisition Workforce Personnel Demonstration Project
(AcqDemo):
Appendix III: Comments from the Department of Defense:
Appendix IV: GAO Contacts and Staff Acknowledgments:
GAO Contacts:
Acknowledgments:
Tables:
Table 1: Selected GS Funding Sources Available for Employee Salary
Increases:
Table 2: China Lake's Pay Increase Distribution (2002):
Table 3: NAVSEA Newport Division's Pay Increase and Award Distribution
(2002):
Table 4: DOC's Pay Increase and Award Distribution (2002):
Table 5: Cumulative Percentage Increase in Average Salaries for
Demonstration Project and Comparison Group Employees by Year of the
Project, as Reported by the Demonstration Projects:
Table 6: Direct Inflation-Adjusted Cost of Training in the First 5 Years
of the Demonstration Projects (in 2002 Dollars), as Reported by the
Demonstration Projects:
Table 7: Inflation-Adjusted Cost of Automation and Data Systems for
Selected Demonstration Projects (in 2002 Dollars), as Reported by the
Demonstration Projects:
Figures:
Figure 1: China Lake's Rating and Pay Distribution Structure:
Figure 2: China Lake's Rating Distribution by Numerical Rating (2002):
Figure 3: NAVSEA Newport Division's Rating and Performance Pay
Distribution Structure:
Figure 4: NAVSEA Newport Division's Rating Distribution (2002):
Figure 5: DOC's Rating Distribution (2002):
Figure 6: AcqDemo's Consideration of Current Salary in Making
Performance Pay Decisions:
Figure 7: Funding Sources Linked to Pay Decisions in Selected Personnel
Demonstration Projects as of Fiscal Year 2003:
Figure 8: Pay Bands, Intervals, and Corresponding Permanent Pay
Increases for NIST's Scientific and Engineering Career Path:
Figure 9: Sample of NAVSEA Newport Division's Rating Category
Distribution Data Provided to Employees:
Figure 10: Sample of NIST's Distribution of Average Performance Rating
Scores Provided to Employees:
Figure 11: Selected Employee Attitude Data for China Lake:
Figure 12: Selected Employee Attitude Data for NIST:
Figure 13: Selected Employee Attitude Data for DOC:
Figure 14: Selected Employee Attitude Data for NRL:
Figure 15: Selected Employee Attitude Data for NAVSEA:
Figure 16: Selected Employee Attitude Data for AcqDemo:
Abbreviations:
AcqDemo: Civilian Acquisition Workforce Personnel Demonstration
Project:
CPDF: Central Personnel Data File:
DOC: Department of Commerce:
DOD: Department of Defense:
FEPCA: Federal Employees Pay Comparability Act of 1990:
GPI: general pay increase:
GS: General Schedule:
NAVSEA: Naval Sea Systems Command Warfare Centers:
NIST: National Institute of Standards and Technology:
NRL: Naval Research Laboratory:
OPM: Office of Personnel Management:
QSI: quality step increase:
WGI: within-grade increase:
Letter January 23, 2004:
The Honorable George V. Voinovich:
Chairman:
Subcommittee on Oversight of Government Management, the Federal
Workforce, and the District of Columbia:
Committee on Governmental Affairs:
United States Senate:
The Honorable Jo Ann Davis:
Chairwoman:
Subcommittee on Civil Service and Agency Organization:
Committee on Government Reform:
House of Representatives:
To successfully transform themselves, high-performing organizations
have found that they must fundamentally change their cultures so that
they are more results-oriented, customer-focused, and collaborative in
nature, and have recognized that an effective performance management
system can help them drive internal change and achieve desired results.
Our prior work, done at your request, has identified nine key practices
for effective performance management based on experiences in public
sector organizations both in the United States and abroad.[Footnote 1]
The key practices are as follows:
1. Align individual performance expectations with organizational goals.
2. Connect performance expectations to crosscutting goals.
3. Provide and routinely use performance information to make program
improvements.
4. Require follow-up actions to address organizational priorities.
5. Use competencies to provide a fuller assessment of performance.
6. Link pay to individual and organizational performance.
7. Make meaningful distinctions in performance.
8. Involve employees and stakeholders to gain ownership of performance
management systems.
9. Maintain continuity during transitions.
Among these practices, there is a growing understanding that the
federal government needs to fundamentally rethink its current approach
to pay and better link pay to individual and organizational
performance. To this end, Congress has taken important steps to
implement results-oriented pay reform and modern performance management
systems across government. Most recently, Congress provided the
Department of Defense (DOD) flexibility to revise its performance
management system to better link pay to performance and required DOD to
incorporate employee involvement, provide ongoing performance
feedback, and include effective safeguards to ensure fairness and
equity, among other things, in DOD's revised system.
Congress also established a Human Capital Performance Fund to reward
agencies' highest performing and most valuable employees. To be
eligible, agencies are to submit plans for approval by the Office of
Personnel Management (OPM) that incorporate a link between pay for
performance and the agency's strategic plan, employee involvement,
ongoing performance feedback, and effective safeguards to ensure fair
management of the system, among other things. In the first year of
implementation, up to 10 percent of the amount appropriated is to be
available to train those involved on making meaningful distinctions in
performance. In addition, Congress created a wider, more open pay range
for senior executive compensation, thus allowing for pay to be more
directly tied to individual performance, contribution to the agency's
performance, or both, as determined under a rigorous performance
management system that, as designed and applied, makes meaningful
distinctions based on relative performance.
Further, in November 2002, Congress established the Department of
Homeland Security and provided it human capital flexibilities to design
a performance management system and specifically to consider different
approaches to pay. We reported that the department's effort to design
its system could be particularly instructive in light of future
requests for human capital flexibilities.[Footnote 2] Legislation that
you sponsored and introduced is currently pending that would provide
GAO additional authority to more fully link employees' annual salary
increases to performance.
Federal agencies have been experimenting with pay for performance
through OPM's personnel demonstration projects. Over the past 25 years,
OPM has approved 17 projects, 12 of which have implemented pay for
performance systems. At your request, this report identifies the
approaches that 6 of these personnel demonstration projects have taken
to implement their pay for performance systems. These projects are:
* the Navy Demonstration Project at China Lake (China Lake),
* the National Institute of Standards and Technology (NIST),
* the Department of Commerce (DOC),
* the Naval Research Laboratory (NRL),
* the Naval Sea Systems Command Warfare Centers (NAVSEA) at Dahlgren
and Newport, and:
* the Civilian Acquisition Workforce Personnel Demonstration Project
(AcqDemo).
To address the objective of this report, we focused on OPM's personnel
demonstration projects because they are required to prepare designs,
gather employee feedback, and complete evaluations of their results,
among other things. We selected these demonstration projects based on
factors such as status of the project and makeup of employee groups
covered. We analyzed Federal Register notices outlining the major
features of each demonstration project, operating manuals, annual and
summative evaluations, employee attitude survey results, project
briefings, training materials, rating and payout data, and cost data as
reported by the agencies without verification by GAO, as well as other
relevant documentation. We also interviewed cognizant officials from
OPM; demonstration project managers, human resource officials, and
participating supervisors and employees; and union and other employee
representatives. We did not independently evaluate the effectiveness of
the demonstration projects. We assessed the reliability of cost,
salary, rating, and performance pay distribution data provided by the
demonstration projects and determined that the data were sufficiently
reliable for the purposes of this report, with the exception of the DOC
salary data, which we do not present.
We performed our work in the Washington, D.C., metropolitan area from
December 2002 through August 2003 in accordance with generally accepted
government auditing standards. Appendix I provides additional
information on our objective, scope, and methodology. Appendix II
presents profiles of the demonstration projects, including selected
elements of their performance management systems, employee attitude
data, and reported effects.
Results in Brief:
We found that the demonstration projects took a variety of approaches
to designing and implementing their pay for performance systems to meet
the unique needs of their cultures and organizational structures.
Specifically, the demonstration projects took different approaches to:
* using competencies to evaluate employee performance,
* translating employee performance ratings into pay increases and
awards,
* considering current salary in making performance-based pay decisions,
* managing costs of the pay for performance system, and:
* providing information to employees about the results of performance
appraisal and pay decisions.
Using competencies to evaluate employee performance. High-performing
organizations use validated core competencies as a key part of
evaluating individual contributions to organizational results. Core
competencies applied organizationwide can help reinforce employee
behaviors and actions that support the organization's mission, goals,
and values and can provide a consistent message to employees about how
they are expected to achieve results. AcqDemo and NRL use core
competencies for all positions across the organization to evaluate
performance. Other demonstration projects, such as NIST, DOC, and China
Lake, use competencies based primarily on the individual position. (See
p. 9.)
Translating employee performance ratings into pay increases and awards.
High-performing organizations seek to create pay, incentive, and reward
systems that clearly link employee knowledge, skills, and contributions
to organizational results. These organizations make meaningful
distinctions between acceptable and outstanding performance of
individuals and appropriately reward those who perform at the highest
level. To this end, the demonstration projects took different
approaches in translating individual employee performance ratings into
permanent pay increases, one-time awards, or both in their pay for
performance systems. Some projects, such as China Lake and NAVSEA's
Newport division, established predetermined pay increases, awards, or
both depending on a given performance rating, while others, such as DOC
and NIST, delegated the flexibility to individual pay pools to
determine how ratings would translate into pay increases, awards, or
both. While the demonstration projects made some distinctions among
employees' performance, the data and experience show that making such
meaningful distinctions remains a work in progress. (See p. 12.)
Considering current salary in making performance-based pay decisions.
Several of the demonstration projects consider an employee's current
salary when making pay increase and award decisions. By considering
salary in such decisions, the projects intend to make a better match
between an employee's compensation and his or her contribution to the
organization. Thus, two employees with comparable contributions could
receive different performance pay increases and awards depending on
their current salaries. For example, AcqDemo determines if employees
are "appropriately compensated," "under-compensated," or "over-
compensated" when it compares employee contribution scores to salary.
(See p. 23.)
Managing costs of the pay for performance system. According to OPM, the
increased costs of implementing alternative personnel systems should be
acknowledged and budgeted for up front. Based on data the demonstration
projects provided, direct costs associated with salaries, training, and
automation and data systems were the major cost drivers of implementing
their pay for performance systems. The demonstration projects used a
number of approaches to manage the direct costs of implementing and
maintaining pay for performance systems. In making their pay decisions,
some of the demonstration projects use funding sources such as the
annual general pay increase and locality pay adjustment. Several
demonstration projects managed salary costs by considering fiscal
conditions and the labor market when determining how much to budget for
pay increases, managing movement through the pay band, and providing a
mix of one-time awards and permanent pay increases. (See p. 25.)
Providing information to employees about the results of performance
appraisal and pay decisions. We have observed that a more performance-
based pay system should have adequate safeguards to ensure fairness and
guard against abuse. One such safeguard is to ensure reasonable
transparency and appropriate accountability mechanisms in connection
with the results of the performance management process. To this end,
several of the demonstration projects publish information for employees
on internal Web sites about the results of performance appraisal and
pay decisions, such as the average performance rating, the average pay
increase, and the average award for the organization and for each
individual department, while other demonstration projects publish no
information on the results of the performance cycle. (See p. 36.)
We provided drafts of this report to the Secretaries of Defense and
Commerce for their review and comment. DOD's Principal Deputy, Under
Secretary of Defense for Personnel and Readiness, provided written
comments, which are presented in appendix III. DOD concurred with our
report and stated that it is a useful summary of the various approaches
that the demonstration projects undertook to implement their pay for
performance systems and that their experiences provide valuable insight
into federal pay for performance models. DOD also noted that the NAVSEA
demonstration project training and automation cost data are estimated
rather than actual costs. We made the appropriate notation. While DOC
did not submit written comments, DOC's Classification, Pay, and HR
Demonstration Program Manager provided minor technical clarifications
and updated information. We made those changes where appropriate. We
provided a draft of the report to the Director of OPM for her
information.
Background:
Congress granted OPM the authority to conduct personnel demonstration
projects under the Civil Service Reform Act of 1978 to test new
personnel
and pay systems.[Footnote 3] A federal agency is to obtain the
authority from OPM to waive existing laws and regulations in Title 5 to
propose, develop, test, and evaluate alternative approaches to managing
its human capital. Under the demonstration project authority, no
waivers of law are to be permitted in areas of employee leave, employee
benefits, equal employment opportunity, political activity, merit
system principles, or prohibited personnel practices. The law also
contains certain limitations and requirements, including:
* a 5-year time limit on project duration,
* a 5,000-employee cap on participation,
* a limit of 10 concurrent demonstration projects governmentwide,
* union and employee consultation,
* publication of a formal project plan in the Federal Register,
* notification of Congress and employees of the demonstration project,
and:
* project evaluations.
OPM guidance requires that agencies conduct at least three evaluations--
after implementation, after at least 3 and a half years, and after the
original scheduled end of the project--that are to address the
following questions:
* Did the project accomplish the intended purpose and goals? If not,
why not?
* Was the project implemented and operated appropriately and
accurately?
* What were the costs, relative to the benefits of the project?
* What was the impact on veterans and other equal employment
opportunity groups?
* Were merit system principles adhered to and prohibited personnel
practices avoided?
* Can the project or portions thereof be generalized to other agencies
or governmentwide?
The demonstration projects can link some or all of the funding sources
for pay increases available under the current federal compensation
system, the General Schedule (GS), to an employee's level of
performance.[Footnote 4] Table 1 defines selected funding sources.
Table 1: Selected GS Funding Sources Available for Employee Salary
Increases:
Funding source: General pay increase (GPI); Description: Established
under the Federal Employees Pay Comparability Act of 1990 (FEPCA), the
GPI is to be determined annually and delivered automatically and
uniformly to GS employees. The GPI is to be based on the Employment
Cost Index, which is a statistical measure maintained by the Bureau of
Labor Statistics that considers changes in private sector labor
costs.
Funding source: Locality pay adjustment; Description: Established
under FEPCA, locality pay is to address any gap between federal and
nonfederal salaries and is to be determined annually and delivered
automatically and uniformly to most GS employees within a given
locality. Locality pay is to supplement the rate of basic pay in the
48 contiguous states where nonfederal pay exceeds federal pay by more
than 5 percent. The President's Pay Agent, composed of the Secretary
of Labor and the Directors of the Office of Management and Budget and
OPM, is to recommend, and the President is to approve, what the
percentage increase, if any, should be.
Funding source: Within-grade increase (WGI); Description: The WGI,
also known as a "step increase," is a periodic increase in a GS
employee's rate of basic pay to the next higher pay level or "step" of
that grade. To receive a WGI, an employee must wait a prescribed
amount of time and be performing at an acceptable level of competence.
OPM reports that the WGI is designed to reward experience and loyalty
and is based on a judgment that the employee's work is of an
"acceptable level of competence" but does not distinguish between very
good and moderately good performance[A].
Funding source: Quality step increase (QSI); Description: A QSI is to
recognize high-quality performance. Similar to a WGI, a QSI advances
the employee to the next higher step but ahead of the required waiting
period. To receive a QSI, an employee must demonstrate sustained high-
quality performance.
Funding source: Career ladder promotion; Description: Federal
employees may be appointed to positions with "career ladders," a
series of developmental positions of increasing difficulty, through
which an employee may be promoted to higher grade levels without
competition.
Source: OPM.
[A] U.S. Office of Personnel Management, A Fresh Start for Federal Pay:
The Case for Modernization (Washington, D.C.: April 2002).
[End of table]
Selected Demonstration Projects Took Various Approaches to Implement
Their Pay for Performance Systems:
High-performing organizations seek to create pay, incentive, and reward
systems based on valid, reliable, and transparent performance
management systems with adequate safeguards and link employee
knowledge, skills, and contributions to organizational results. To that
end, we found that the demonstration projects took a variety of
approaches to designing and implementing their pay for performance
systems to meet the unique needs of their cultures and organizational
structures. Specifically, the demonstration projects took different
approaches to:
* using competencies to evaluate employee performance,
* translating employee performance ratings into pay increases and
awards,
* considering current salary in making performance-based pay decisions,
* managing costs of the pay for performance system, and:
* providing information to employees about the results of performance
appraisal and pay decisions.
Using Competencies to Evaluate Employee Performance:
High-performing organizations use validated core competencies as a key
part of evaluating individual contributions to organizational results.
Competencies define the skills and supporting behaviors that
individuals are expected to demonstrate and can provide a fuller
picture of an individual's performance. To this end, we found that the
demonstration projects took different approaches to evaluating employee
performance. AcqDemo and NRL use core competencies for all positions
across the organization. Other demonstration projects, such as NIST,
DOC, and China Lake, use competencies based primarily on the individual
employee's position.
Applying competencies organizationwide. Core competencies applied
organizationwide can help reinforce employee behaviors and actions that
support the organization's mission, goals, and values and can provide a
consistent message to employees about how they are expected to achieve
results. AcqDemo evaluates employee performance against one set of
"factors," which are applied to all employees. "Discriminators" and
"descriptors" further define the factors by career path and pay band.
According to AcqDemo, taken together, the factors, discriminators, and
descriptors are relevant to the success of a DOD acquisition
organization.[Footnote 5]
AcqDemo's six factors are (1) problem solving, (2) teamwork/
cooperation, (3) customer relations, (4) leadership/supervision, (5)
communication, and (6) resource management. Discriminators further
define each factor. For example, discriminators for problem solving
include scope of responsibility, creativity, complexity, and
independence. Descriptors identify contributions by pay band. For
example, a descriptor for problem solving at one pay band level is
"resolves routine problems within established guidelines," and at a
higher level, a descriptor is "anticipates problems, develops sound
solutions and action plans to ensure program/mission accomplishment."
All factors must be used and cannot be supplemented. While the pay pool
manager may weight the factors, according to an official, no
organization within AcqDemo has weighted the factors to date. Managers
are authorized to use weights sparingly because contributions in all
six factors are important to ensuring AcqDemo's overall success as well
as to developing the skills of the acquisition workforce. If weights
are used, they are to be applied uniformly across all positions within
the pay pool. The six factors are initially weighted equally and no
factor can be weighted less than one-half of its initial weight.
Employees are to be advised of the weights at the beginning of the
rating period.
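To illustrate the weighting rules described above, the following Python
sketch checks a hypothetical set of pay pool weights against the
constraints AcqDemo describes: all six factors must be weighted, and no
factor may fall below one-half of its initial (equal) weight. The
assumption that weights sum to 1 is added only to keep the illustration
self-contained, and, as noted, no AcqDemo organization had actually
weighted the factors at the time of our review:

# Illustrative Python sketch of AcqDemo's weighting constraints. The
# six factors come from the project's design; the requirement that
# weights sum to 1 is an assumption added for this illustration.
FACTORS = ["problem solving", "teamwork/cooperation", "customer relations",
           "leadership/supervision", "communication", "resource management"]

EQUAL_WEIGHT = 1.0 / len(FACTORS)   # each factor's initial weight
MIN_WEIGHT = EQUAL_WEIGHT / 2       # floor: one-half of the initial weight

def validate_weights(weights: dict[str, float]) -> None:
    """Raise ValueError if proposed pay pool weights violate the rules."""
    if set(weights) != set(FACTORS):
        raise ValueError("all six factors must be weighted; none may be dropped")
    if any(w < MIN_WEIGHT for w in weights.values()):
        raise ValueError("no factor may be weighted below half its initial weight")
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("weights are assumed to sum to 1 for this illustration")

# Example: shift some weight toward problem solving, applied uniformly to
# all positions in the pay pool, keeping every factor at or above the floor.
weights = {factor: EQUAL_WEIGHT for factor in FACTORS}
weights["problem solving"] += 0.05
weights["resource management"] -= 0.05
validate_weights(weights)

[End of code example]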
While AcqDemo applies organizationwide competencies across all
employees, NRL has established "critical elements" for each career path
and allows supervisors to add individual performance expectations. The
critical elements are the key aspects of work that supervisors are to
consider in evaluating employee performance. Each critical element has
discriminators and descriptors. Specifically, for the Science and
Engineering Professionals career path, one critical element is
"scientific and technical problem solving." That element's
discriminators are (1) level of oversight, (2) creativity, (3)
technical communications, and (4) recognition. For recognition, the
descriptors include "recognized within own organization for technical
ability in assigned areas" as one level of contribution and "recognized
internally and externally by peers for technical expertise" as the next
level of contribution.
NRL's system allows supervisors to supplement the descriptors to
further describe what is expected of employees. According to an NRL
demonstration project official, this flexibility allows the supervisor
to better communicate performance expectations. Further, pay pool
panels may weight the critical elements, including a weight of zero.
Weighted elements are to be applied consistently to groups within a
career path, such as Bench Level, Supervisor, Program Manager, or
Support for the Science and Engineering Professionals career path.
According to an NRL official, panels commonly weight critical elements
but rarely weight an element to zero. Further, panels use weighting
most often for the Science and Engineering Professionals career path.
Determining individual position-based competencies. Other
demonstration projects determine competencies based primarily on the
individual position. NIST and DOC identify "critical elements" tailored
to each individual position.[Footnote 6] According to a DOC
demonstration project official, DOC tailors critical elements to
individual positions because their duties and responsibilities vary
greatly within the demonstration project. Each employee's performance
plan is to have a minimum of two and a maximum of six critical
elements, along with the major activities needed to accomplish each
element. Supervisors
are to assign a weight to each critical element on the basis of its
importance, the time required to accomplish it, or both. According to
NIST and DOC officials, weighting is done at the supervisory level and
is not tracked at the organizational level.
To evaluate the accomplishment of critical elements, DOC uses its
organizationwide Benchmark Performance Standards. They range from the
highest standard of performance, "objectives were achieved with maximum
impact, through exemplary work that demonstrated exceptional
originality, versatility, and creativity" to the lowest, "objectives
and activities were not successfully completed, because of failures in
quality, quantity, completeness, or timeliness of work." Supervisors can
develop supplemental performance standards as needed.
Similarly, each China Lake employee has a performance plan that
includes criteria tailored to individual responsibilities. The criteria
are to be consistent with the employee's work unit's goals and
objectives and can be set in two ways, depending on the nature of the
position. The "task approach" defines an individual's output. The
"function approach" defines the required skills and how well they are
to be performed. Employees and supervisors choose from a menu of
skills, such as planning, analysis, coordination, and reporting/
documentation. A China Lake official stated that some of its work units
require core competencies, such as teamwork and self-development, for
all employees. According to the official, while developing core
competencies sends a message about what is important to the
organization, tailoring individual performance plans can focus
employees' attention on changing expectations.
Translating Employee Performance Ratings into Pay Increases and Awards:
High-performing organizations seek to create pay, incentive, and reward
systems that clearly link employee knowledge, skills, and contributions
to organizational results. These organizations make meaningful
distinctions between acceptable and outstanding performance of
individuals and appropriately reward those who perform at the highest
level. Performance management systems in these leading organizations
typically seek to achieve three key objectives: (1) provide candid and
constructive feedback to help individual employees maximize their
potential in understanding and realizing the goals and objectives of
the agency, (2) provide management with the objective and fact-based
information it needs to reward top performers, and (3) provide the
necessary information and documentation to deal with poor performers.
To this end, the demonstration projects took different approaches in
translating individual employee performance ratings into permanent pay
increases, one-time awards, or both in their pay for performance
systems. Some projects, such as China Lake and NAVSEA's Newport
division, established predetermined pay increases, awards, or both
depending on a given performance rating. Others, such as DOC and NIST,
delegated the flexibility to individual pay pools to determine how
ratings translate into pay increases, awards, or both. Overall, while
the demonstration projects made some distinctions among employees'
performance, the data and experience to date show that making such
meaningful distinctions remains a work in progress.
Setting predetermined pay increases and awards. China Lake's assessment
categories translate directly to a predetermined range of permanent pay
increases, as shown in figure 1.[Footnote 7] Supervisors are to rate
employees in one of three assessment categories and recommend numerical
ratings, based on employees' performance and salaries, among other
factors. For employees receiving "highly successful" ratings, a
Performance Review Board assigns the numerical ratings. For "less than
fully successful" ratings, the first-line supervisor and a second-level
reviewer assign the numerical ratings, based on a problem-solving
team's findings and a personnel advisor's input. The numerical rating
determines how many "increments" the employee will receive. An
increment is a permanent pay increase of about 1.5 percent of an
employee's base salary.
Figure 1: China Lake's Rating and Pay Distribution Structure:
[See PDF for image]
Note: All employees receive the locality pay adjustment regardless of
assessment category or numerical rating.
[End of figure]
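To illustrate the increment arithmetic described above, the following
Python sketch shows how a numerical rating might translate into a
permanent pay increase. Only the increment size of about 1.5 percent of
base salary comes from China Lake's design; the rating-to-increment
mapping used here is hypothetical, and the project's actual ranges
appear in figure 1:

# Illustrative sketch of China Lake's increment arithmetic. An increment
# is roughly 1.5 percent of base salary; the mapping from numerical
# rating to number of increments below is hypothetical.
INCREMENT_RATE = 0.015  # about 1.5 percent of base salary per increment

INCREMENTS_BY_RATING = {1: 3, 2: 2, 3: 1, 4: 0, 5: 0}  # hypothetical mapping

def permanent_increase(base_salary: float, numerical_rating: int) -> float:
    """Dollar value of the permanent pay increase for a given rating."""
    increments = INCREMENTS_BY_RATING.get(numerical_rating, 0)
    return base_salary * INCREMENT_RATE * increments

# Example: an employee earning $60,000 who receives a rating of 2 would
# get two increments, or roughly $1,800 (about 3 percent of base pay).
print(permanent_increase(60_000, 2))

[End of code example]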
China Lake made some distinctions in performance across employees'
ratings, as shown in figure 2:[Footnote 8]
* 11.3 percent of employees received a "1," the highest numerical
rating, and:
* a total of six employees (0.2 percent) were rated "less than fully
successful" and received numerical ratings of "4" or "5."
Figure 2: China Lake's Rating Distribution by Numerical Rating (2002):
[See PDF for image]
Note: Percentages total more than 100 percent due to rounding.
[End of figure]
At China Lake, the average pay increase rose with performance, as shown
in table 2.
* The average permanent pay increase ranged from 1.8 to 5.3 percent.
* Six employees were rated as "less than fully successful" and thus
were to receive no performance pay increases and half or none of the
GPI. According to a China Lake official, employees rated as "less than
fully successful" are referred to a problem-solving team, consisting of
the supervisor, reviewer, personnel advisor, and other appropriate
officials, that determines what corrective actions are necessary.
Table 2: China Lake's Pay Increase Distribution (2002):
Assessment category: Highly successful;
Numerical rating: 1;
Number of employees receiving permanent pay increases: 191;
Increase as a percentage of base pay: Average: 5.3;
Increase as a percentage of base pay: Lowest: 1.5;
Increase as a percentage of base pay: Highest: 9.3.
Assessment category: Highly successful;
Numerical rating: 2;
Number of employees receiving permanent pay increases: 929;
Increase as a percentage of base pay: Average: 3.4;
Increase as a percentage of base pay: Lowest: 1.5;
Increase as a percentage of base pay: Highest: 5.6.
Assessment category: Fully successful;
Numerical rating: 3;
Number of employees receiving permanent pay increases: 526;
Increase as a percentage of base pay: Average: 1.8;
Increase as a percentage of base pay: Lowest: 1.3;
Increase as a percentage of base pay: Highest: 2.7.
Assessment category: Less than fully successful;
Numerical rating: 4;
Number of employees receiving permanent pay increases: 0;
Increase as a percentage of base pay: Average: N/A;
Increase as a percentage of base pay: Lowest: N/A;
Increase as a percentage of base pay: Highest: N/A.
Assessment category: Less than fully successful;
Numerical rating: 5;
Number of employees receiving permanent pay increases: 0;
Increase as a percentage of base pay: Average: N/A;
Increase as a percentage of base pay: Lowest: N/A;
Increase as a percentage of base pay: Highest: N/A.
Total;
Number of employees receiving permanent pay increases: 1,646;
Source: DOD.
Legend: N/A= data are not applicable.
Notes: Data do not include the GPI or the locality pay adjustment.
Employees whose salaries are at the top of the pay band cannot receive
permanent pay increases; therefore, the number of employees receiving
pay increases differs from those receiving ratings.
[End of table]
Similar to China Lake, at NAVSEA's Newport division, a performance
rating category translates directly to a predetermined range of
permanent pay increases, one-time awards, or both, as shown in figure
3. Newport translates ratings into pay increases and awards in three
steps. First, supervisors are to rate employees as "acceptable" or
"unacceptable." Employees rated as unacceptable are not eligible for
pay increases or awards. Employees rated as acceptable are to be
further assessed on their performance relative to their salaries.
Supervisors assign employees rated as acceptable to one of three rating
categories: contributor, major contributor, or exceptional contributor.
Supervisors also make recommendations for the number of pay points to
be awarded, from 0 to 4, depending on the rating category and the
employees' salaries. Pay pool managers review and department heads
finalize supervisor recommendations. A pay point equals 1.5 percent of
the midpoint salary of the pay band. Pay points may be permanent pay
increases or one-time awards.
Figure 3: NAVSEA Newport Division's Rating and Performance Pay
Distribution Structure:
[See PDF for image]
Note: All employees receive the full GPI and locality pay adjustment
regardless of rating category.
[End of figure]
Newport allows for some flexibility in deciding whether employees
receive permanent pay increases, one-time awards, or both. Newport's
guidelines state that those who make greater contributions should
receive permanent increases to base pay, while employees whose
contributions are commensurate with their salaries receive one-time
awards. In addition, employees whose salaries fall below the midpoint
of the pay band are more likely to receive permanent pay increases,
while employees above the midpoint of the pay band are more likely to
receive one-time awards.
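The following Python sketch illustrates the pay point arithmetic
described above. The value of a pay point, 1.5 percent of the pay
band's midpoint salary, comes from Newport's design; the pay band
boundaries and the split between a permanent increase and a one-time
award in the example are hypothetical:

# Illustrative sketch of NAVSEA Newport's pay point arithmetic. A pay
# point equals 1.5 percent of the midpoint salary of the pay band, and
# from 0 to 4 points may be granted as permanent pay increases, one-time
# awards, or a mix of the two.
POINT_RATE = 0.015

def pay_point_value(band_minimum: float, band_maximum: float) -> float:
    """Dollar value of one pay point: 1.5 percent of the band midpoint."""
    midpoint = (band_minimum + band_maximum) / 2
    return midpoint * POINT_RATE

def payout(points: int, band_minimum: float, band_maximum: float,
           points_as_increase: int) -> tuple[float, float]:
    """Split awarded points into (permanent increase, one-time award) dollars."""
    value = pay_point_value(band_minimum, band_maximum)
    return points_as_increase * value, (points - points_as_increase) * value

# Example: 3 points in a hypothetical band of $50,000 to $90,000
# (midpoint $70,000, so one point is worth $1,050), taken as 2 points of
# permanent increase ($2,100) and 1 point as a one-time award ($1,050).
print(payout(3, 50_000, 90_000, points_as_increase=2))

[End of code example]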
NAVSEA's Newport division made some distinctions in performance across
employees' ratings.[Footnote 9] As shown in figure 4,
* about 80 percent of employees were rated in the top two categories
(exceptional contributor and major contributor) and:
* no employees were rated unacceptable.
Figure 4: NAVSEA Newport Division's Rating Distribution (2002):
[See PDF for image]
Note: Percentages total less than 100 percent due to rounding.
[End of figure]
In addition, at NAVSEA's Newport division, the average pay increase and
award amount rose with performance, as shown in table 3.
* The average permanent pay increase ranged from 1.6 to 2.9 percent.
* The average performance award ranged from $1,089 to $2,216.
Table 3: NAVSEA Newport Division's Pay Increase and Award Distribution
(2002):
Rating: Exceptional contributor;
Permanent pay increase:
Number of employees receiving permanent pay increases: 686;
Increase as a percentage of base pay:
Average: 2.9;
Lowest: 0.1;
Highest: 7.0;
Performance awards:
Number of employees receiving performance awards: 615;
Performance award amount: Average: $2,216;
Performance award amount: Lowest: $561;
Performance award amount: Highest: $5,680.
Rating: Major contributor;
Permanent pay increase:
Number of employees receiving permanent pay increases: 602;
Increase as a percentage of base pay:
Average: 2.0;
Lowest: 0.9;
Highest: 5.3;
Performance awards:
Number of employees receiving performance awards: 613;
Performance award amount: Average: 1,592;
Performance award amount: Lowest: 561;
Performance award amount: Highest: 4,260.
Rating: Contributor;
Permanent pay increase:
Number of employees receiving permanent pay increases: 124;
Increase as a percentage of base pay:
Average: 1.6;
Lowest: 1.2;
Highest: 1.8;
Performance awards:
Number of employees receiving performance awards: 143;
Performance award amount: Average: 1,089;
Performance award amount: Lowest: 519;
Performance award amount: Highest: 2,212.
Rating: Unacceptable;
Permanent pay increase:
Number of employees receiving permanent pay increases: 0;
Increase as a percentage of base pay:
Average: N/A;
Lowest: N/A;
Highest: N/A;
Performance awards:
Number of employees receiving performance awards: 0;
Performance award amount: Average: N/A;
Performance award amount: Lowest: N/A;
Performance award amount: Highest: N/A.
Total;
Permanent pay increase:
Number of employees receiving permanent pay increases: 1,412;
Performance awards:
Number of employees receiving performance awards: 1,371.
Source: DOD.
Legend: N/A= data are not applicable.
Notes: Data do not include the GPI or locality pay adjustment.
Employees can receive their pay as permanent increases or one-time
awards; therefore, the number of employees receiving pay increases and
awards differs from those receiving ratings.
[End of table]
Delegating pay decisions to pay pools. Some demonstration projects,
such as NIST and DOC, delegate the flexibility to individual pay pools
to determine how ratings translate into permanent pay increases and
one-time awards. At DOC, for example, supervisors are to evaluate
employees on a range of performance elements on a scale of 0 to 100.
Employees with scores below 40 are to be rated as "unsatisfactory" and
are not eligible to receive performance pay increases, awards, the GPI,
or the locality pay adjustment. Employees with scores of 40 or above
are to be rated as "eligible"; receive the full GPI and locality pay
adjustment; and be eligible for a performance pay increase, award, or
both.
Pay pool managers have the flexibility to determine the amount of the
pay increase, award, or both for each performance score, depending on
where they fall within the pay band. Employees lower in the pay band
are eligible for larger pay increases as a percentage of base pay than
employees higher in the pay band, and employees whose salaries are at
the top of the pay band and who therefore can no longer receive
permanent salary increases may receive awards.
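The following Python sketch summarizes the rating-to-eligibility logic
described above for DOC and NIST. The 40-point threshold comes from the
projects' designs; the sketch does not model how a particular pay pool
sizes an increase or award:

from dataclasses import dataclass

# Simplified sketch of the DOC and NIST eligibility rules: scores below
# 40 are "unsatisfactory" and receive no GPI, locality pay adjustment,
# or performance pay; scores of 40 or above are "eligible," receive the
# full GPI and locality adjustment, and may receive a pay-pool-determined
# permanent increase, one-time award, or both.

@dataclass
class PayDecision:
    rating_label: str
    gets_gpi_and_locality: bool
    eligible_for_performance_pay: bool

def classify(score: int) -> PayDecision:
    if score < 40:
        return PayDecision("unsatisfactory", False, False)
    return PayDecision("eligible", True, True)

print(classify(35))  # unsatisfactory: no GPI, locality pay, or performance pay
print(classify(85))  # eligible: pay pool decides the increase, award, or both

[End of code example]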
According to our analysis, in its 2002 rating cycle, DOC made few
distinctions in performance in its distribution of ratings.[Footnote
10] As shown in figure 5,
* 100 percent of employees scored 40 or above and over 86 percent of
employees scored 80 or above and:
* no employees were rated as unsatisfactory.
Figure 5: DOC's Rating Distribution (2002):
[See PDF for image]
[End of figure]
According to a DOC official, a goal of the demonstration project is to
address poor performance early. An official also noted that poor
performers may choose to leave the organization before they receive
ratings of unsatisfactory or are placed on a performance improvement
plan. Employees who are placed on a performance improvement plan and
improve their performance within the specified time frame (typically
less than 90 days) are determined to be eligible for the GPI and
locality pay adjustment for the remainder of the year.
Our analysis also shows that DOC made few distinctions in performance
in its distribution of awards. As shown in table 4, 10 employees who
scored from 60 to 69 received an average performance award of $925,
while employees who scored from 70 to 79 received an average of $742.
Our analysis suggests that DOC's policy of delegating flexibility to
individual pay pools to determine performance awards could explain why,
without an independent reasonableness review, some employees with lower
scores receive larger awards than employees with higher scores.
According to DOC, it reviews pay pool decisions within but not across
organizational units.
Table 4: DOC's Pay Increase and Award Distribution (2002):
Rating; Eligible: 90-100;
Permanent pay increase:
Number of employees receiving permanent pay increases: 1,014;
Increase as a percentage of base pay: Average: 3.9;
Increase as a percentage of base pay: Lowest: 0.7;
Increase as a percentage of base pay: Highest: 15.0;
Performance award:
Number of employees receiving performance awards: 1,079;
Performance award amount: Average: $1,781;
Performance award amount: Lowest: $250;
Performance award amount: Highest: $7,500.
Rating; Eligible: 80-89;
Permanent pay increase:
Number of employees receiving permanent pay increases: 1,121;
Increase as a percentage of base pay: Average: 3.1;
Increase as a percentage of base pay: Lowest: 0.02;
Increase as a percentage of base pay: Highest: 11.0;
Performance award:
Number of employees receiving performance awards: 1,099;
Performance award amount: Average: 1,117;
Performance award amount: Lowest: 100;
Performance award amount: Highest: 6,000.
Rating; Eligible: 70-79;
Permanent pay increase:
Number of employees receiving permanent pay increases: 250;
Increase as a percentage of base pay: Average: 2.4;
Increase as a percentage of base pay: Lowest: 0.2;
Increase as a percentage of base pay: Highest: 9.0;
Performance award:
Number of employees receiving performance awards: 181;
Performance award amount: Average: 742;
Performance award amount: Lowest: 50;
Performance award amount: Highest: 2,000.
Rating; Eligible: 60-69;
Permanent pay increase:
Number of employees receiving permanent pay increases: 18;
Increase as a percentage of base pay: Average: 0.9;
Increase as a percentage of base pay: Lowest: 0.2;
Increase as a percentage of base pay: Highest: 3.2;
Performance award:
Number of employees receiving performance awards: 10;
Performance award amount: Average: 925;
Performance award amount: Lowest: 300;
Performance award amount: Highest: 2,500.
Rating; Eligible: 50-59;
Permanent pay increase:
Number of employees receiving permanent pay increases: 1;
Increase as a percentage of base pay: Average: 1.2;
Increase as a percentage of base pay: Lowest: 1.2;
Increase as a percentage of base pay: Highest: 1.2;
Performance award:
Number of employees receiving performance awards: 1;
Performance award amount: Average: 300;
Performance award amount: Lowest: 300;
Performance award amount: Highest: 300.
Rating; Eligible: 40-49;
Permanent pay increase:
Number of employees receiving permanent pay increases: 0;
Increase as a percentage of base pay: Average: N/A;
Increase as a percentage of base pay: Lowest: N/A;
Increase as a percentage of base pay: Highest: N/A;
Performance award:
Number of employees receiving performance awards: 1;
Performance award amount: Average: 200;
Performance award amount: Lowest: 200;
Performance award amount: Highest: 200.
Rating; Unsatisfactory:
Permanent pay increase:
Number of employees receiving permanent pay increases: 0;
Increase as a percentage of base pay: Average: N/A;
Increase as a percentage of base pay: Lowest: N/A;
Increase as a percentage of base pay: Highest: N/A;
Performance award:
Number of employees receiving performance awards: 0;
Performance award amount: Average: N/A;
Performance award amount: Lowest: N/A;
Performance award amount: Highest: N/A.
Total:
Permanent pay increase:
Number of employees receiving permanent pay increases: 2,404;
Performance award:
Number of employees receiving performance awards: 2,371.
Source: GAO analysis of DOC data.
Legend: N/A= data are not applicable.
Notes: Data do not include the GPI or the locality pay adjustment.
Not all employees who receive ratings receive pay increases or awards;
therefore, the number of employees receiving pay increases or awards
differs from those receiving ratings.
[End of table]
NIST also delegates pay decisions to individual pay pools. The NIST
100-point rating system is similar to DOC's system. Employees with
scores under 40 are rated as "unsatisfactory" and do not receive the
GPI, locality pay adjustment, or performance pay increases or awards.
Employees with scores of 40 or above receive the full GPI and locality pay
adjustment and are eligible to receive performance pay increases,
awards, or both. Similar to DOC, in its 2002 rating cycle, NIST made
few distinctions in performance in its distribution of ratings.
Specifically,
* 99.9 percent of employees scored 40 or above, and nearly 78 percent
of employees scored 80 or above, and:
* 0.1 percent, or 3 employees, were rated as unsatisfactory.
Considering Current Salary in Making Performance-Based Pay Decisions:
Several of the demonstration projects consider an employee's current
salary when making decisions on permanent pay increases and one-time
awards. By considering salary in such decisions, the projects intend to
make a better match between an employee's compensation and his or her
contribution to the organization. Thus, two employees with comparable
contributions could receive different pay increases and awards
depending on their current salaries.
At AcqDemo, supervisors recommend and pay pool managers approve
employees' "contribution scores." Pay pools then plot contribution
scores against the employees' current salaries and a "standard pay
line" to determine if employees are "appropriately compensated,"
"under-compensated," or "over-compensated," given their
contributions.[Footnote 11] Figure 6 shows how AcqDemo makes its
performance pay decisions for employees who receive the same
contribution scores but earn different salaries.
Figure 6: AcqDemo's Consideration of Current Salary in Making
Performance Pay Decisions:
[See PDF for image]
[End of figure]
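The following Python sketch illustrates the type of comparison AcqDemo
makes between contribution scores and salaries. The standard pay line
formula and tolerance band used here are hypothetical and are shown
only to make the comparison concrete; AcqDemo's actual pay line is not
reproduced here:

# Illustrative sketch of comparing a contribution score to current
# salary. The standard pay line and the 8 percent tolerance band below
# are hypothetical values chosen only for this example.
def standard_pay_line(contribution_score: float) -> float:
    """Hypothetical expected salary for a given contribution score."""
    return 30_000 + 900 * contribution_score

def compensation_category(contribution_score: float, salary: float,
                          tolerance: float = 0.08) -> str:
    expected = standard_pay_line(contribution_score)
    if salary < expected * (1 - tolerance):
        return "under-compensated"        # candidate for a larger permanent increase
    if salary > expected * (1 + tolerance):
        return "over-compensated"         # may receive little or no increase
    return "appropriately compensated"

# Two employees with the same contribution score but different salaries
# can fall into different categories and thus receive different pay actions.
print(compensation_category(60, 72_000))  # under-compensated (expected about $84,000)
print(compensation_category(60, 85_000))  # appropriately compensated

[End of code example]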
AcqDemo has reported that it has made progress in matching employees'
compensation to their contributions to the organization. From 1999 to
2002, appropriately compensated employees increased from about 63
percent to about 72 percent, under-compensated employees decreased from
about 30 percent to about 27 percent, and over-compensated employees
decreased from nearly 7 percent to less than 2 percent.
NRL implemented a similar system intended to better match employee
contributions with salary. Data from NRL show that it has made progress
in matching employees' compensation to their contributions to the
organization. From 1999 to 2002, "normally compensated" employees, or
employees whose contributions match their compensation, increased from
about 68 percent to about 81 percent; under-compensated employees
decreased from about 25 percent to about 16 percent; and over-
compensated employees decreased from about 7 percent to about 3
percent.
Similar to AcqDemo's and NRL's approach, NAVSEA's Dahlgren division
recently redesigned its pay for performance system to better match
compensation and contribution. Because Dahlgren implemented its new
system in 2002, performance data were not available. Less
systematically, China Lake and NAVSEA's Newport division consider
current salary in making pay and award decisions. For example, at
Newport, supervisors within each pay pool are to list all employees in
each pay band by salary before a rating is determined and then evaluate
each employee's contribution to the organization considering that
salary. If their contributions exceed expectations, employees are
considered for permanent pay increases. If contributions meet
expectations, employees are considered for one-time awards.
Managing Costs of the Pay for Performance System:
OPM reports that the increased costs of implementing alternative
personnel systems should be acknowledged and budgeted for up
front.[Footnote 12] Based on the data the demonstration projects
provided us, direct costs associated with salaries, training, and
automation and data systems were the major cost drivers of implementing
their pay for performance systems. The demonstration projects reported
other direct costs, such as evaluations and administrative expenses.
The demonstration projects used a number of approaches to manage the
direct costs of implementing and maintaining their pay for performance
systems.
Salary Costs:
Under the current GS system, federal employees annually receive the GPI
and, where appropriate, a locality pay adjustment, as well as
periodically receiving WGIs. The demonstration projects use these and
other funding sources under the GS to make their pay decisions, as
shown in figure 7.
Figure 7: Funding Sources Linked to Pay Decisions in Selected Personnel
Demonstration Projects as of Fiscal Year 2003:
[See PDF for image]
[A] According to AcqDemo officials, some AcqDemo organizational units
guaranteed the GPI for the first year to ensure employees'
understanding and fair implementation of the process, and others
guaranteed the GPI for additional, but limited, years to obtain local
union agreement to enter the demonstration project.
[End of figure]
The aggregated average salary data that some of the demonstration
projects were able to provide do not allow us to determine whether
total salary costs for the demonstration projects are higher or lower
than their GS comparison groups. However, our analysis shows that the
demonstration projects' cumulative percentage increases in average
salaries varied relative to those of their GS comparison groups. For example,
as shown in table 5, after the first year of each demonstration
project's implementation, the differences in cumulative percentage
increase in average salary between the demonstration project employees
and their GS comparison group ranged from -2.9 to 2.7 percentage
points.
Table 5: Cumulative Percentage Increase in Average Salaries for
Demonstration Project and Comparison Group Employees by Year of the
Project, as Reported by the Demonstration Projects:
China Lake;
Year 1; D%: 10.3;
Year 1; C%: 7.6;
Year 1; Difference%: 2.7;
Year 2; D%: 17.7;
Year 2; C%: 14.6;
Year 2; Difference%: 3.1;
Year 3; D%: 24.7;
Year 3; C%: 20.1;
Year 3; Difference%: 4.6;
Year 4; D%: 28.0;
Year 4; C%: 23.5;
Year 4; Difference%: 4.5;
Year 5; D%: 31.6;
Year 5; C%: 27.7;
Year 5; Difference%: 3.9.
NIST;
Year 1; D%: 4.2;
Year 1; C%: 2.7;
Year 1; Difference%: 1.5;
Year 2; D%: 10.1;
Year 2; C%: 7.1;
Year 2; Difference%: 3.0;
Year 3; D%: 17.3;
Year 3; C%: 12.1;
Year 3; Difference%: 5.2;
Year 4; D%: 24.2;
Year 4; C%: 16.6;
Year 4; Difference%: 7.6;
Year 5; D%: 31.1;
Year 5; C%: 21.9;
Year 5; Difference%: 9.2.
NRL;
Year 1; D%: 1.9;
Year 1; C%: 4.8;
Year 1; Difference%: -2.9;
Year 2; D%: 5.4;
Year 2; C%: 8.6;
Year 2; Difference%: -3.2;
Year 3; D%: 10.4;
Year 3; C%: 13.6;
Year 3; Difference%: -3.2.
NAVSEA-Dahlgren;
Year 1; D%: 3.5;
Year 1; C%: 4.9;
Year 1; Difference%: -1.4;
Year 2; D%: 7.8;
Year 2; C%: 10.0;
Year 2; Difference%: -2.2;
Year 3; D%: 10.8;
Year 3; C%: 13.8;
Year 3; Difference%: -3.0;
Year 4; D%: 13.6;
Year 4; C%: 19.1;
Year 4; Difference%: -5.5.
NAVSEA Newport;
Year 1; D%: 3.8;
Year 1; C%: 4.0;
Year 1; Difference%: -0.2;
Year 2; D%: 8.5;
Year 2; C%: 8.6;
Year 2; Difference%: -0.1;
Year 3; D%: 11.0;
Year 3; C%: 13.6;
Year 3; Difference%: -2.6.
Source: GAO analysis of OPM, DOC, and DOD data.
Legend: D = demonstration project; C = comparison group for the
demonstration project in the GS system.
Notes: We calculated the percentage increase in average salaries using
the demonstration project's or comparison group's aggregated average
salary in the year prior to the project's implementation as the
baseline.
Data are as reported by the demonstration projects without verification
by GAO.
Shaded areas indicate that the demonstration project has not yet
reached those years.
Based on our review of the DOC salary data, we determined that the data
were not adequate for use in our comparative analyses of salary growth.
Therefore, we do not present DOC's salary data.
According to a demonstration project official, AcqDemo does not collect
comparable salary data due to its constantly changing and growing
participant base. Therefore, we do not present AcqDemo's average salary
data. AcqDemo reports that demonstration project salaries increased 0.7
percent more than GS salaries in fiscal years 2000 (year 1) and 2001
(year 2) and 0.9 percent more in fiscal year 2002 (year 3).
[End of table]
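For reference, the cumulative percentage increases in table 5 follow the baseline method stated in the notes; the short Python sketch below reproduces that calculation with hypothetical average salaries.
# Sketch of the cumulative-percentage-increase calculation described in the
# table notes: each year's average salary is compared with the average
# salary in the year prior to implementation (the baseline). The salary
# figures below are hypothetical.
def cumulative_increases(baseline, yearly_averages):
    """Percentage increase of each year's average salary over the baseline."""
    return [round((avg - baseline) / baseline * 100, 1) for avg in yearly_averages]
if __name__ == "__main__":
    demo = cumulative_increases(60_000, [62_300, 65_000, 67_200])
    comparison = cumulative_increases(60_000, [61_600, 64_100, 66_300])
    differences = [round(d - c, 1) for d, c in zip(demo, comparison)]
    print("D%:", demo)                  # [3.8, 8.3, 12.0]
    print("C%:", comparison)            # [2.7, 6.8, 10.5]
    print("Difference%:", differences)  # [1.1, 1.5, 1.5]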
The demonstration projects used several approaches to manage salary
costs, including (1) choosing the method of converting employees into
the demonstration project, (2) considering fiscal conditions and the
labor market, (3) managing movement through the pay band, and (4)
providing a mix of awards and performance pay increases.
Choosing the method of converting employees into the demonstration
project. When the demonstration projects converted employees from the
GS system to the pay for performance system, they compensated each
employee for the portion of the WGI that the employee had earned either
as a permanent increase to base pay or a one-time lump sum payment.
Four of the six demonstration projects (China Lake, NRL, NAVSEA, and
AcqDemo) gave employees permanent increases to base pay, while the
remaining two demonstration projects (NIST and DOC) gave employees one-
time lump sum payments.
Both methods of compensating employees have benefits and drawbacks,
according to demonstration project officials. Giving permanent pay
increases at the point of conversion into the demonstration project
recognizes that employees had already earned a portion of the WGI, but
a drawback is that the salary increases are compounded over time, which
increases the organization's total salary costs. However, the officials
said that giving permanent pay increases garnered employees' support
for the demonstration project because employees did not feel that they
would have been better off under the GS system.
Considering fiscal conditions and the labor market. In determining how
much to budget for pay increases, demonstration projects considered the
fiscal condition of the organization as well as the labor market. For
example, China Lake, NIST, NRL, and NAVSEA receive a portion of their
funding from a working capital fund and thus must take into account
fiscal conditions when budgeting for pay increases and awards. These
organizations rely, in part, on sales revenue rather than direct
appropriations to finance their operations. The organizations establish
prices for their services that allow them to recover their costs from
their customers. If the organizations' services become too expensive
(i.e., salaries are too high), they become less competitive with the
private sector.
A demonstration project official at NAVSEA's Newport division said that
as an organization financed in part through a working capital fund, it
has an advantage over organizations that rely completely on
appropriations because it can justify adjusting pay increase and awards
budgets when necessary to remain competitive with the private sector.
Newport has had to make such adjustments. In fiscal year 2002, the
performance pay increase and award pools were funded at lower levels
(1.4 percent and 1.7 percent of total salaries for pay increases and
awards, respectively) than in 2001 (1.7 percent and 1.8 percent,
respectively) because of fiscal constraints. As agreed with one of its
unions, Newport must set aside a minimum of 1.4 percent of salaries for
its pay increases, which is equal to historical spending under GS for
similar increases.
NAVSEA's Newport division also considers the labor market and uses
regional and industry salary information compiled by the American
Association of Engineering Societies when determining how much to set
aside for pay increases and awards. In fiscal year 2001, Newport funded
pay increases and awards at a higher level (1.7 percent and 1.8 percent
of total salaries, respectively) than in fiscal year 2000 (1.4 percent
and 1.6 percent, respectively) in response to higher external engineer,
scientist, and information technology personnel salaries.
Managing movement through the pay band. Because movement through the
pay band is based on performance, demonstration project employees could
progress through the pay band more quickly than under the GS. Some
demonstration projects have developed ways intended to manage this
progression to prevent all employees from eventually migrating to the
top of the pay band and thus increasing salary costs.
NIST and DOC manage movement through the pay band by recognizing
performance with larger pay increases early in the pay band and career
path and smaller increases higher in the pay band and career path. Both
of these demonstration projects divided each pay band into five
intervals. The intervals determine the maximum percentage increase
employees could receive for permanent pay increases. The intervals,
shown in figure 8, have helped NIST manage salary costs, according to a
NIST official.
Figure 8: Pay Bands, Intervals, and Corresponding Permanent Pay
Increases for NIST's Scientific and Engineering Career Path:
[See PDF for image]
[End of figure]
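Conceptually, the interval mechanism is a lookup from an employee's position within the pay band to a cap on the permanent pay increase. The Python sketch below is hypothetical: the five-interval structure comes from the report, but the boundaries and percentage caps are invented for the example (the actual figures appear in figure 8).
# Hypothetical illustration of capping permanent pay increases by interval
# within a pay band. Five intervals per band come from the report; the
# boundaries and caps below are invented and are NOT NIST's or DOC's values.
import bisect
INTERVAL_BOUNDS = [0.2, 0.4, 0.6, 0.8, 1.0]    # fraction of the band (hypothetical)
MAX_INCREASE_PCT = [10.0, 8.0, 6.0, 4.0, 2.0]  # larger caps early in the band (hypothetical)
def max_permanent_increase(salary, band_min, band_max):
    """Return the hypothetical cap (in percent) for a salary within a band."""
    position = (salary - band_min) / (band_max - band_min)
    position = min(max(position, 0.0), 1.0)
    interval = min(bisect.bisect_left(INTERVAL_BOUNDS, position), len(MAX_INCREASE_PCT) - 1)
    return MAX_INCREASE_PCT[interval]
if __name__ == "__main__":
    for salary in (52_000, 70_000, 88_000):  # hypothetical band of $50,000-$90,000
        print(salary, max_permanent_increase(salary, 50_000, 90_000), "percent cap")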
Similarly, some of the demonstration projects, including China Lake and
NAVSEA's Dahlgren division, have checkpoints or "speed bumps" in their
pay bands intended to manage salary costs as well as ensure that
employees' performance coincides with their salaries as they progress
through the band. These projects established checkpoints designed to
ensure that only the highest performers move into the upper half of the
pay band. For example, when employees' salaries at China Lake reach the
midpoint of the pay band, they must receive ratings of highly
successful, which are equivalent to exceeding expectations, before they
can receive additional salary increases. A Performance Review Board,
made up of senior management, is to review all highly successful
ratings.
Providing a mix of awards and pay increases. Some of the demonstration
projects intended to manage costs by providing a mix of one-time awards
and permanent pay increases. Rewarding an employee's performance with
an award instead of an equivalent increase to base pay can reduce
salary costs in the long run because the agency only has to pay the
amount of the award one time, rather than annually. For example, at
NAVSEA's Newport division, as employees move higher into the pay band,
they are more likely to receive awards than permanent increases to base
pay. According to a Newport official, expectations increase along with
salaries and thus it is more likely that their contributions would
meet, rather than exceed, expectations.
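A back-of-the-envelope comparison, using hypothetical figures, illustrates why a one-time award can cost less over time than an equally sized permanent increase to base pay:
# Hypothetical comparison of a one-time award with an equal permanent raise.
# The $2,000 amount and the 5-year horizon are illustrative assumptions;
# compounding from subsequent raises is ignored for simplicity.
def cost_of_award(amount):
    """A one-time award is paid only once."""
    return amount
def cost_of_permanent_increase(amount, years):
    """A permanent increase to base pay recurs every year it remains in effect."""
    return amount * years
if __name__ == "__main__":
    amount, years = 2_000, 5
    print("One-time award:     $", cost_of_award(amount))                      # $2,000
    print("Permanent increase: $", cost_of_permanent_increase(amount, years))  # $10,000 over 5 years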
To manage costs, China Lake allows pay pools to transfer some of their
budgets for permanent pay increases to their budgets for awards. A
China Lake official said that because China Lake receives a portion of
its funding from a working capital fund, it is not only important to
give permanent salary increases to high-performing employees, but also
to give increases China Lake can afford the next year. China Lake does
not track how much funding is transferred from performance pay increase
budgets to awards budgets.
Training Costs:
We have reported that agencies will need to invest resources, including
time and money, to ensure that employees have the information, skills,
and competencies they need to work effectively in a rapidly changing
and complex environment.[Footnote 13] This includes investments in
training and developing employees as part of an agency's overall effort
to achieve cost-effective and timely results. Agency managers and
supervisors are often aware that investments in training and
development initiatives can be quite large. However, across the federal
government, evaluation efforts have often been hindered by the lack of
accurate and reliable data to document the total costs of training
efforts. Each of the demonstration projects trained employees on the
performance management system prior to implementation to make employees
aware of the new approach, as well as periodically after implementation
to refresh employee familiarity with the system. The training was
designed to help employees understand competencies and performance
standards; develop performance plans; write self-appraisals; become
familiar with how performance is evaluated and how pay increases and
awards decisions are made; and know the roles and responsibilities of
managers, supervisors, and employees in the appraisal and payout
processes.
Generally, demonstration projects told us they incurred direct and
indirect costs associated with training. Direct training costs that the
demonstration projects reported included costs for contractors,
materials, and travel related to developing and delivering training to
employees and managers. As shown in table 6, total direct costs that
the demonstration projects reported for training through the first 5
years of the projects' implementation range from an estimated $33,000
at NAVSEA's Dahlgren division to more than $1 million at China
Lake.[Footnote 14] (NIST reported no direct costs associated with
training.) Training costs, as indicated by the cost per employee, were
generally higher in the year prior to implementation, except at
AcqDemo, where they increased over time.
Table 6: Direct Inflation-Adjusted Cost of Training in the First 5
Years of the Demonstration Projects (in 2002 Dollars), as Reported by
the Demonstration Projects:
Demonstration project: China Lake;
Year prior to implementation: $203;
Cost per demonstration project employee, year 1: $21;
Cost per demonstration project employee, year 2: No data;
Cost per demonstration project employee, year 3: No data;
Cost per demonstration project employee, year 4: No data;
Cost per demonstration project employee, year 5: No data;
Prior to implementation through year 5: $1,226,000.
Demonstration project: NIST;
Year prior to implementation: 0;
Cost per demonstration project employee, year 1: 0;
Cost per demonstration project employee, year 2: 0;
Cost per demonstration project employee, year 3: 0;
Cost per demonstration project employee, year 4: 0;
Cost per demonstration project employee, year 5: 0;
Prior to implementation through year 5: 0.
Demonstration project: DOC;
Year prior to implementation: 12;
Cost per demonstration project employee, year 1: 6;
Cost per demonstration project employee, year 2: $5;
Cost per demonstration project employee, year 3: $8;
Cost per demonstration project employee, year 4: $8;
Cost per demonstration project employee, year 5: No data;
Prior to implementation through year 5: 105,000.
Demonstration project: NRL;
Year prior to implementation: 84;
Cost per demonstration project employee, year 1: 5;
Cost per demonstration project employee, year 2: 4;
Cost per demonstration project employee, year 3: 0;
Cost per demonstration project employee, year 4: [Empty];
Cost per demonstration project employee, year 5: [Empty];
Prior to implementation through year 5: 248,000.
Demonstration project: NAVSEA-Dahlgren;
Year prior to implementation: 17 (estimate);
Cost per demonstration project employee, year 1: 0 (estimate);
Cost per demonstration project employee, year 2: 0 (estimate);
Cost per demonstration project employee, year 3: 0 (estimate);
Cost per demonstration project employee, year 4: 0 (estimate);
Cost per demonstration project employee, year 5: 0 (estimate);
Prior to implementation through year 5: 33,000 (estimate).
Demonstration project: NAVSEA Newport;
Year prior to implementation: 26 (estimate);
Cost per demonstration project employee, year 1: 4 (estimate);
Cost per demonstration project employee, year 2: 1 (estimate);
Cost per demonstration project employee, year 3: 1 (estimate);
Cost per demonstration project employee, year 4: 1 (estimate);
Cost per demonstration project employee, year 5: [Empty];
Prior to implementation through year 5: 68,000 (estimate).
Demonstration project: AcqDemo;
Year prior to implementation: No data;
Cost per demonstration project employee, year 1: 8;
Cost per demonstration project employee, year 2: 10;
Cost per demonstration project employee, year 3: 9;
Cost per demonstration project employee, year 4: 20;
Cost per demonstration project employee, year 5: $19;
Prior to implementation through year 5: 458,000.
Source: GAO analysis of DOC and DOD data.
Notes: The cost per demonstration project employee is based on the
number of employees in the demonstration project at the same time each
year, not the actual number of employees trained on the demonstration
project, because the demonstration projects do not collect this
information.
Data are as reported by the demonstration projects without verification
by GAO.
Shaded squares indicate that the demonstration project has not yet
reached those years.
[End of table]
While the demonstration projects did not report indirect costs
associated with training employees on the demonstration project,
officials stated that indirect costs, such as employee time spent
developing, delivering, or attending training, could nonetheless be
significant. Likewise, the time spent on the "learning curve" until
employees are proficient with the new system could also be significant.
For example, although NIST did not capture its indirect training costs,
agency officials told us that prior to implementation, each NIST
employee was in training for 1 day. Since its implementation, NIST
offers optional one-half day training three times a year for all
employees. AcqDemo offered 8 hours of training for employees prior to
implementation and a minimum of 4 hours of training after
implementation. All potential new participants also received 8 hours
of training prior to implementation at their site. Supervisors
and human resources professionals at AcqDemo were offered an additional
8 hours of training each year after the demonstration project was
implemented. According to a DOC official, prior to conversion to the
demonstration project, DOC provided a detailed briefing to
approximately 400 employees to increase employee understanding of the
project. In addition, employees could schedule one-on-one counseling
sessions with human resources staff to discuss individual issues and
concerns.
Some of the demonstration projects, including China Lake, DOC, and
NAVSEA's Dahlgren and Newport divisions, managed training costs by
relying on current employees to train other employees on the
demonstration project. According to demonstration project officials,
while there are still costs associated with developing and delivering
in-house training, total training costs are generally reduced by using
employees rather than hiring contractors to train employees. For
example, China Lake took a "train the trainer" approach by training a
group of employees on the new flexibilities in the demonstration
project and having those employees train other employees. According to
a demonstration project official, an added benefit of using employees
to train other employees is that if the person leading the training is
respected and known, then the employees are more likely to support the
demonstration project. The official said that one drawback is that not
all employees are good teachers, so their skills should be carefully
considered.
AcqDemo used a combination of contractors and in-house training to
implement its training strategy. According to an AcqDemo official, the
relatively higher per demonstration project employee costs in years 4
and 5 are a result of AcqDemo's recognition that more in-depth and
varied training was needed for current AcqDemo employees to refresh
their proficiency in the system; for new participants to familiarize
them with appraisal and payout processes; as well as for senior
management, pay pool managers and members, and human resources
personnel to give them greater detail on the process.
Automation and Data Systems Costs:
As a part of implementing a pay for performance system, some of the
demonstration projects installed new or updated existing automated
personnel systems. Demonstration projects reported that total costs
related to designing, installing, and maintaining automation and data
systems ranged from an estimated $125,000 at NAVSEA's Dahlgren division
to an estimated $4.9 million at AcqDemo, as shown in table 7.
Table 7: Inflation-Adjusted Cost of Automation and Data Systems for
Selected Demonstration Projects (in 2002 Dollars), as Reported by the
Demonstration Projects:
Dollars in thousands:
Prior to implementation;
China Lake[A]: No data;
NIST: 0;
DOC: 0;
NRL: $1,467;
NAVSEA-Dahlgren: $125 (estimate);
NAVSEA Newport: $333 (estimate);
AcqDemo: 0.
Cumulative cost since implementation;
China Lake[A]: No data;
NIST: 0;
DOC: $2,317;
NRL: 2,166;
NAVSEA-Dahlgren: 0 (estimate);
NAVSEA Newport: 463 (estimate);
AcqDemo: $4,871 (estimate).
Total;
China Lake[A]: No data;
NIST: 0;
DOC: $2,317;
NRL: $3,633;
NAVSEA-Dahlgren: $125 (estimate);
NAVSEA Newport: $796 (estimate);
AcqDemo: $4,871 (estimate).
Source: GAO analysis of DOC and DOD data.
Notes: Data are as reported by the demonstration projects without
verification by GAO.
Costs may not sum to totals due to rounding.
[A] Automation and data systems were not widely used when the China
Lake demonstration project was implemented in 1980.
[End of table]
To manage data system costs, some demonstration projects modified
existing data systems rather than designing completely new systems to
meet their information needs. For example, NAVSEA's divisions worked
together to modify DOD's existing Defense Civilian Personnel Data
System to meet their needs for a revised performance appraisal system.
Similarly, DOC imported the performance appraisal system developed by
NIST and converted the payout system to a Web-based system. While NIST
reported that it incurred no direct costs for automation and data
systems, officials told us it used in-house employees, NIST's
Information Technology Laboratory staff, to develop a data system to
automate performance ratings, scores, increases, and awards.
NRL used a combination of in-house employees and contractors to
automate its performance management system. While reported automation
and data systems costs were higher for NRL than for most other
demonstration projects, NRL reports that its automated system has
saved an estimated 10,500 hours of work, $266,000, and 154 reams of
paper each year since the demonstration project was implemented in 1999.
Providing Information to Employees about the Results of Performance
Appraisal and Pay Decisions:
We have observed that a performance management system should have
adequate safeguards to ensure fairness and guard against abuse. One
such safeguard is to ensure reasonable transparency and appropriate
accountability mechanisms in connection with the results of the
performance management process. To this end, NIST, NAVSEA's Newport
Division, NRL, and AcqDemo publish information for employees on
internal Web sites about the results of performance appraisal and pay
decisions, such as the average performance rating, the average pay
increase, and the average award for the organization and for each
individual unit. Other demonstration projects publish no information on
the results of the performance cycle.
NAVSEA's Newport division publishes results of its annual performance
cycle. To protect confidentiality, Newport aggregates the data so that
no individual employee's rating or payout can be determined.
Employees can compare their performance rating category against others
in the same unit, other units, and the entire division, as shown in
figure 9.
Figure 9: Sample of NAVSEA Newport Division's Rating Category
Distribution Data Provided to Employees:
[See PDF for image]
[End of figure]
Until recently, NIST provided information such as the average rating,
pay increase, and award amount for an employee's pay pool only upon the
employee's request. NIST officials told us that, to be more open,
transparent, and responsive to employees, NIST began publishing the
results of the performance cycle on its internal Web site for the first
time in 2003. NIST published averages of the performance
rating scores, as shown in figure 10, as well as the average
recommended pay increase amounts and the average awards by career path,
for the entire organization, and for each organizational unit.
According to one NIST official, the first day the results were
published on the internal Web site, the Web site was visited more than
1,600 times.
Figure 10: Sample of NIST's Distribution of Average Performance Rating
Scores Provided to Employees:
[See PDF for image]
[A] Indicates that there were not enough employees in the unit to
protect confidentiality; therefore, no data are reported.
[End of figure]
Publishing the results of the performance management process can
provide employees with the information they need to better understand
the performance management system. However, according to an official,
DOC does not currently publish performance rating and payout results
even though DOC's third year evaluation found that demonstration
project participants continued to raise concerns that indicated their
lack of understanding about the performance appraisal process.
According to the evaluation, focus group and survey results indicated
the need for increased understanding on topics such as how pay pools
work, how salaries are determined, and how employees are rated.
Employees were also interested in knowing more about the results of the
performance appraisal process. One union representative told us that a
way to improve the demonstration project would be to publish this
information. In past years, according to employee representatives, some
employees and union representatives at DOC have used the Freedom of
Information Act to request and obtain the information. According to a
DOC official, DOC plans to discuss the publication of average scores by
each major unit and look for options to increase employee understanding
of the performance management system at upcoming Project Team and
Departmental Personnel Management Board meetings.
Concluding Observations:
Linking pay to performance is a key practice for effective performance
management. As Congress, the administration, and federal agencies
continue to rethink the current approach to federal pay to place
greater emphasis on performance, the experiences of personnel
demonstration projects can provide insights into how some organizations
within the federal government are implementing pay for performance. The
demonstration projects took different approaches to using competencies
to evaluate employee performance, translating performance ratings into
pay increases and awards, considering employees' current salaries in
making performance pay decisions, managing costs of the pay for
performance systems, and providing information to employees about the
results of performance appraisal and pay decisions. These different
approaches were intended to enhance the success of the pay for
performance systems because the systems were designed and implemented
to meet the demonstration projects' unique cultural and organizational
needs.
We strongly support the need to expand pay for performance in the
federal government. How it is done, when it is done, and the basis on
which it is done can make all the difference in whether such efforts
are successful. High-performing organizations continuously review and
revise their performance management systems to achieve results,
accelerate change, and facilitate two-way communication throughout the
year so that discussions about individual and organizational
performance are integrated and ongoing. To this end, these
demonstration projects show an understanding that how to better link
pay to performance is very much a work in progress at the federal
level.
Additional work is needed to strengthen efforts to ensure that
performance management systems are tools to help the demonstration
projects manage on a day-to-day basis. In particular, there are
opportunities to use organizationwide competencies to evaluate employee
performance that reinforce behaviors and actions that support the
organization's mission, translate employee performance so that managers
can make meaningful distinctions between top and poor performers with
objective and fact-based information, and provide information to
employees about the results of the performance appraisals and pay
decisions to ensure that reasonable transparency and appropriate
accountability mechanisms are in place.
Agency Comments:
We provided drafts of this report to the secretaries of Defense and
Commerce for their review and comment. DOD's Principal Deputy Under
Secretary of Defense for Personnel and Readiness provided written
comments, which are presented in appendix III. DOD concurred with our
report and stated that it is a useful summary of the various approaches
that the demonstration projects undertook to implement their pay for
performance systems and that their experiences provide valuable insight
into federal pay for performance models. DOD also noted that the NAVSEA
demonstration project training and automation cost data are estimated
rather than actual costs. We made the appropriate notation. While DOC
did not submit written comments, DOC's Classification, Pay, and HR
Demonstration Program Manager provided minor technical clarifications
and updated information. We made those changes where appropriate. We
provided a draft of the report to the Director of OPM for her
information.
As agreed with your offices, unless you publicly announce its contents
earlier, we plan no further distribution of this report until 30 days
after its date. At that time, we will provide copies of this report to
other interested congressional parties, the Secretaries of Defense and
Commerce, and the Director of OPM. We will also make this report
available to others upon request. In addition, the report will be
available at no charge on the GAO Web site at
[Hyperlink, http://www.gao.gov].
If you have any questions about this report, please contact me or Lisa
Shames at (202) 512-6806. Other contributors are acknowledged in
appendix IV.
Signed by:
J. Christopher Mihm:
Director, Strategic Issues:
[End of section]
Appendixes:
Appendix I: Objective, Scope, and Methodology:
To meet our objective to identify the approaches that selected
personnel demonstration projects have taken to implement their pay for
performance systems, we chose the following demonstration projects: the
Navy Demonstration Project at China Lake (China Lake), the National
Institute of Standards and Technology (NIST), the Department of
Commerce (DOC), the Naval Research Laboratory (NRL), the Naval Sea
Systems Command Warfare Centers (NAVSEA) at Dahlgren and Newport, and
the Civilian Acquisition Workforce Personnel Demonstration Project
(AcqDemo). We selected these demonstration projects based on our review
of the projects and in consultation with the Office of Personnel
Management (OPM). Factors we considered in selecting these
demonstration projects included the type of pay for performance system,
type of agency (defense or civilian), status of the project (ongoing,
permanent, or complete), date the project was implemented, and number
and type of employees covered (including employees covered by a union).
To identify the different approaches that the demonstration projects
took in implementing their pay for performance systems, we analyzed
Federal Register notices outlining the major features and regulations
for each demonstration project, operating manuals, annual and summative
evaluations, employee attitude survey results, project briefings,
training materials, rating and payout data, cost data, rating
distribution data from OPM's Central Personnel Data File (CPDF), and
other relevant documentation. In addition, we spoke with cognizant
officials from OPM; demonstration project managers, human resource
officials, and participating supervisors and employees; and union and
other employee representatives.
We prepared a data collection instrument to obtain actual and estimated
cost data from the six demonstration projects. We tested the instrument
with a demonstration project official to ensure that the instrument was
clear and comprehensive. After revising the instrument based on the
official's recommendations, we administered the instrument via e-mail
and followed up with officials via telephone, as necessary. Officials
from the six demonstration projects provided actual cost data where
available and estimated data when actual data were not available. Cost
data reported are actual unless otherwise indicated. We adjusted cost
data for inflation to 2002 dollars using the Consumer Price Index. We
provide average salary data as reported by the demonstration projects
and OPM, without verification by GAO. The aggregated average salary data
do not allow us to determine whether total salary costs for the
demonstration projects are higher or lower than their General Schedule
(GS) comparison groups.
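As an illustration of the inflation adjustment described above, the short Python sketch below converts a nominal cost into 2002 dollars using Consumer Price Index values; the index values shown are placeholders rather than the figures we actually used.
# Sketch of adjusting a nominal cost to 2002 dollars with the Consumer
# Price Index. The index values below are placeholders.
CPI = {1999: 166.6, 2000: 172.2, 2001: 177.1, 2002: 179.9}
def to_2002_dollars(cost, year):
    """Scale a cost incurred in `year` into 2002 dollars."""
    return cost * CPI[2002] / CPI[year]
if __name__ == "__main__":
    print(round(to_2002_dollars(100_000, 1999)))  # a 1999 cost expressed in 2002 dollars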
We did not independently evaluate the effectiveness of the
demonstration projects or independently validate the data provided by
the agencies or published in the evaluations. We assessed the
reliability of cost, salary, rating, and performance pay distribution
data provided by the demonstration projects by (1) performing manual
and electronic testing of required data elements, (2) reviewing
existing information about the data, and (3) interviewing agency
officials knowledgeable about the data. We determined that the data
were sufficiently reliable for the purposes of this report, with the
exception of the DOC salary data, which we do not present. Based on our
review of the DOC salary data, we determined that the data were not
adequate for use in our comparative analyses of salary growth. An
evaluation of the DOC demonstration project reported that data were
missing in critical fields, such as pay and performance
scores.[Footnote 15]
We did not independently verify the CPDF data for September 30, 2002.
However, in a 1998 report (OPM's Central Personnel Data File: Data
Appear Sufficiently Reliable to Meet Most Customer Needs, [Hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO/GGD-98-199] GAO/GGD-98-199,
Sept. 30, 1998), we reported that governmentwide data from the CPDF for
key variables, such as GS-grade, agency, and career status, were 97
percent or more accurate. We did not, however, verify the accuracy of
employee ratings.
We performed our work in the Washington, D.C., metropolitan area from
December 2002 through August 2003 in accordance with generally accepted
government auditing standards.
[End of section]
Appendix II: Demonstration Project Profiles:
Navy Demonstration Project at China Lake (China Lake):
[See PDF for image]
Source: GAO analysis of DOD and OPM data.
[End of figure]
Purpose:
The Navy Demonstration Project[Footnote 16] was to:
* develop an integrated approach to pay, performance appraisal, and
classification;
* allow greater managerial control over personnel functions; and:
* expand the opportunities available to employees through a more
responsive and flexible personnel system.
Selected Elements of the Performance Management System:
Competencies: Competencies are tailored to an individual's position.
The employees and their supervisors are to develop performance plans,
which identify the employees' responsibilities and expected results. In
addition, all supervisors are to include certain management
competencies from a menu of managerial factors that best define their
responsibilities, such as developing objectives, organizing work, and
selecting and developing people.
Feedback: Supervisors are to conduct two progress reviews of employees'
performance, set at 5 and 9 months in the performance cycle.
Self-assessment: Employees are strongly encouraged to list
accomplishments for their supervisors' information when determining the
performance rating.
Levels of performance rating: The levels are highly successful (rating
levels 1 or 2), fully successful (rating level 3), or less than fully
successful (rating levels 4 or 5).
Safeguards:
* Second-level review: Second-level supervisors are to review all
assessments. In addition, an overall assessment of highly successful is
to be sent to the appropriate department's Performance Review Board for
the assignment of an official rating of "1" or "2." The supervisor and
reviewer are to assign a "4" or "5" rating based on a problem-solving
team's findings and a personnel advisor's input.
* Grievance process: Generally, employees may request reconsideration
of their ratings in writing to the third-level supervisor and indicate
why a higher rating is warranted and what rating is desired. The third-
level supervisor can either grant the request or request that a
recommending official outside of the immediate organization or chain of
authority be appointed. The employee is to receive a final decision in
writing within 21 calendar days.
Selected Employee Attitude Data:
Figure 11: Selected Employee Attitude Data for China Lake:
[See PDF for image]
Source: DOD.
Legend: N/A = data are not applicable; N = number of respondents.
[End of figure]
Other Interventions:
Reduction in force. To allow for increased retention of high-performing
employees at all levels by ranking employees based on performance for
retention standings.
Salary flexibility. To set entry-level salaries to take into account
market conditions.
Selected Reported Effects:
A demonstration project evaluation reported the following
effects.[Footnote 17]
* Employees viewed performance improvements within their control and
reported increased recognition of individual performance.
* The perception of a pay-performance link was significantly
strengthened under the demonstration pay for performance system, but
not in the comparison group.
* Pay satisfaction increased slightly at the demonstration sites and
declined at the control laboratories.
* Employees and supervisors cited improved communication, a more
objective focus, and clearer performance expectations as major system
benefits.
* Employees and supervisors perceived their performance appraisal
system to be more flexible than the comparison group, to focus more on
actual work requirements, and thus to be more responsive to laboratory
needs.
* Employees at the demonstration project reported having more input
into the development of performance plans than employees in the
comparison group.
Sources for Additional Information:
[Hyperlink, http://www.nawcwpns.navy.mil/~hrd/demo.htm];
(Last accessed on Nov. 7, 2003)
[Hyperlink, http://www.opm.gov/demos/main.asp]
(Last accessed on Nov. 7, 2003):
National Institute of Standards and Technology (NIST):
[See PDF for image]
Source: GAO analysis of DOC and OPM data.
[End of figure]
Purpose:
The demonstration project at NIST, formerly known as the National Bureau
of Standards, was to:
* improve hiring and allow NIST to compete more effectively for high-
quality researchers,
* motivate and retain staff,
* strengthen the manager's role in personnel management, and:
* increase the efficiency of personnel systems.
Selected Elements of the Performance Management System:
Competencies: Competencies, called "critical elements," are based on
the individual position. Employee performance plans are to have a
minimum of two and a maximum of six critical elements, which the
supervisor weights, based on the importance of the critical element,
the time required to accomplish the critical element, or both.
Managers' and supervisors' performance plans are to include a critical
element on diversity, which must be weighted at least 15 points.
Feedback: Supervisors are to conduct midyear reviews of all employees
to discuss accomplishments or deficiencies and modify the initial
performance plans, if necessary.
Self-assessment: Employees are to submit lists of accomplishments for
their supervisors' information when determining the performance
ratings.
Levels of performance rating: The levels are "eligible" or
"unsatisfactory." On a scale of 0 to 100, employees who receive scores
over 40 are rated eligible and those with scores below 40
unsatisfactory.
Safeguards:
* Second-level review: Pay pool managers are to review recommended
scores from supervisors and select a payout for each employee. Pay pool
managers are to present the decisions to the next higher official for
review if the pay pool manager is also a supervisor. The organizational
unit director is to approve awards and review all other decisions.
* Grievance procedure: Employees may grieve their performance ratings,
scores, and pay increases by following DOC's Administrative Grievance
Procedure or appropriate negotiated grievance procedures.
Selected Employee Attitude Data:
Figure 12: Selected Employee Attitude Data for NIST:
[See PDF for image]
Sources: U.S. Office of Personnel Management, Implementation Report
National Institute of Standards and Technology Personnel Management
Demonstration Project (Washington, D.C.: Aug. 18, 1989) and Summative
Evaluation Report National Institute of Standards and Technology
Demonstration Project: 1988-1995 (Washington, D.C.: June 27, 1997).
Legend: N/A = data are not applicable; N = number of respondents.
[A] OPM reported that 47 percent of 3,200 NIST employees and 44 percent
of 2,392 comparison group employees responded to the survey.
[End of figure]
Other Interventions:
Reduction in force. To credit an employee with an overall performance
score in the top 10 percent of scores within a peer group with 10
additional years of service for retention purposes.
Supervisory differential. To establish supervisory intervals within a
pay band that allow for a maximum rate up to 6 percent higher than the
maximum rate of the nonsupervisory intervals within the pay band.
Hiring flexibility. To provide flexibility in setting initial salaries
within pay bands for new appointees, particularly for hard-to-fill
positions in the Scientific and Engineering career path.
Extended probation. To require employees in the Scientific and
Engineering career path to serve a probationary period of 1 to 3 years.
Selected Reported Effects:
A demonstration project evaluation reported the following
effects.[Footnote 18]
* Recruitment bonuses were used sparingly but successfully to attract
candidates who might not have accepted federal jobs otherwise.
* NIST has become more competitive with the private sector and
employees are less likely to leave for reasons of pay.
* NIST was able to provide significant performance-based awards, some
with merit increases as high as 20 percent. NIST succeeded in retaining
more of its high performers than the comparison group.
* Managers reported significantly increased authority over hiring and
pay decisions.
* Managers reported that they felt significantly less restricted by
personnel rules and regulations than other federal managers.
Source for Additional Information:
[Hyperlink, http://www.opm.gov/demos/main.asp];
(Last accessed on Nov. 7, 2003):
Department of Commerce (DOC):
[See PDF for image]
Source: GAO analysis of DOC and OPM data.
[End of figure]
Purpose:
The DOC demonstration project was to test whether the interventions of
the NIST demonstration project could be successful in environments with
different missions and different organizational hierarchies.
Selected Elements of the Performance Management System:
Competencies: Competencies, called "critical elements," are tailored to
each individual position. Performance plans are to have a minimum of
two and a maximum of six critical elements. The supervisor is to weight
each critical element, based on the importance of the element, the time
required to accomplish it, or both, so that the total weight of all
critical elements is 100 points. Organizationwide benchmark performance
standards are to define the range of performance, and the supervisor
may add supplemental performance standards to a performance plan.
Performance plans for managers and supervisors are to include critical
elements such as recommending or making personnel decisions; developing
and appraising subordinates; fulfilling diversity, equal opportunity,
and affirmative action responsibilities; and program and managerial
responsibilities.
Feedback: Supervisors are to conduct midyear reviews of all employees
to discuss accomplishments or deficiencies and modify the initial
performance plans, if necessary.
Self-assessment: Employees are to submit lists of accomplishments for
their supervisors' information when determining the performance
ratings.
Levels of performance rating: The levels are "eligible" or
"unsatisfactory." On a scale of 0 to 100, employees who receive scores
over 40 are rated eligible and those with scores below 40
unsatisfactory.
Safeguards:
* Second-level review: The pay pool manager is to review recommended
scores from subordinate supervisors and select a payout for each
employee. The pay pool manager is to present the decisions to the next
higher official for review if the pay pool manager is also a
supervisor.
* Grievance procedure: Employees may request reconsideration of
performance decisions, excluding awards, by the pay pool manager
through DOC's Administrative Grievance Procedure or appropriate
negotiated grievance procedures.
Selected Employee Attitude Data:
Figure 13: Selected Employee Attitude Data for DOC:
[See PDF for image]
Source: Booz Allen Hamilton, Department of Commerce Personnel
Management Demonstration Project Evaluation Operational Year Technical
Report (Washington, D.C.: Oct. 8, 2002).
Legend: N = number of respondents.
[End of figure]
Other Interventions:
Reduction in force. To credit employees with performance scores in the
top 30 percent of a career path in a pay pool with 10 additional years
of service for retention purposes. Other employees rated "eligible"
receive 5 additional years of service for retention credit.
Supervisory performance pay. To offer employees who spend at least 25
percent of their time performing supervisory duties pay up to 6 percent
higher than the regular pay band.
Probationary period. To require a 3-year probationary period for newly
hired science and engineering employees performing research and
development duties.
Selected Reported Effects:
A demonstration project evaluation reported the following
effects.[Footnote 19]
* The pay for performance system continues to exhibit a positive link
between pay and performance. For example, in year 4 of the
demonstration project, employees with higher performance scores were
more likely to receive pay increases and on average received larger pay
increases than employees with lower scores.
* Some of the recruitment and staffing interventions have been
successful. For example, supervisors are taking advantage of their
ability to offer more flexible starting salaries. Additionally, the
demonstration project has expedited the classification process. DOC's
evaluator recommended that DOC should more fully implement the
recruitment and staffing interventions.
* The 3-year probationary period for scientists and engineers continues
to be used, but assessing its utility remains difficult.
* On the other hand, some retention interventions receive little use or
have not appeared to affect retention. For example, the supervisor
performance pay intervention is not affecting supervisor retention.
Sources for Additional Information:
[Hyperlink, http://ohrm.doc.gov/employees/demo_project.htm] (Last
accessed Nov. 7, 2003):
[Hyperlink, http://www.opm.gov/demos/main.asp] (Last accessed
Nov. 7, 2003):
Naval Research Laboratory (NRL):
[See PDF for image]
Source: GAO analysis of DOD and OPM data.
[End of figure]
Purpose:
The NRL demonstration project was to:
* provide increased authority to manage human resources,
* enable NRL to hire the best qualified employees,
* compensate employees equitably at a rate that is more competitive
with the labor market, and:
* provide a direct link between levels of individual contribution and
the compensation received.
Selected Elements of the Performance Management System:
Competencies: Each career path has two to three "critical elements."
Each critical element has generic descriptors that explain the type of
work, degree of responsibility, and scope of contributions. Pay pool
managers may weight critical elements and may establish supplemental
criteria.
Feedback: Supervisors and employees are to, on an ongoing basis, hold
discussions to specify work assignments and performance expectations.
The supervisor or the employee can request a formal review during the
appraisal process.
Self-assessment: Employees are to submit yearly accomplishment reports
for the supervisors' information when determining the performance
appraisals.
Levels of performance rating: The levels are acceptable or
unacceptable. Employees who are rated acceptable are then determined to
be "over-compensated," "under-compensated," or within the "normal pay
range," based on their contribution scores and salaries.
Safeguards:
* Second-level review: The pay pool panel and pay pool manager are to
compare element scores for all of the employees in the pay pool; make
adjustments, as necessary; and determine the final contribution scores
and pay adjustments for the employees.
* Grievance procedure: Employees can grieve their appraisals through a
two-step process. Employees are to first grieve their scores in
writing, and the pay pool panel reviews the grievances and makes
recommendations to the pay pool manager, who issues decisions in
writing. If employees are not satisfied with the pay pool manager's
decisions, they can then file formal grievances according to NRL's
formal grievance procedure.
Selected Employee Attitude Data:
Figure 14: Selected Employee Attitude Data for NRL:
[See PDF for image]
Source: DOD.
Legend: N/A = data are not applicable; N = number of respondents.
[End of figure]
Other Interventions:
Reduction in force. To credit an employee's basic Federal Service
Computation Date with up to 20 years based on the results of the
appraisal process.
Hiring flexibility. To provide opportunities to consider a broader
range of candidates and flexibility in filling positions.
Extended probationary period. To extend the probationary period to 3
years for certain occupations.
Selected Reported Effects:
A demonstration project evaluation reported the following
effects.[Footnote 20]
From 1996 to 2001:
* Managers' satisfaction with authority to determine employees' pay and
job classification increased from 10 percent of managers to 33 percent.
* Employees' satisfaction with opportunities for advancement increased
from 26 percent to 41 percent.
* The perceived link between pay and performance is stronger under the
demonstration project and increased from 41 percent to 61 percent.
* On the other hand, the percentage of employees who agreed that other
employers in the area paid more than the government for the kind of
work that they do increased from 67 to 76 percent.
Sources for Additional Information:
[Hyperlink, http://hroffice.nrl.navy.mil/personnel_demo/index.htm]
(Last accessed on Nov. 7, 2003)
[Hyperlink, http://www.opm.gov/demos/main.asp]
(Last accessed on Nov. 7, 2003):
Naval Sea Systems Command Warfare Centers (NAVSEA):
[See PDF for image]
Source: GAO analysis of DOD and OPM data.
[End of figure]
Purpose:
The NAVSEA demonstration project was to:
* develop employees to meet the changing needs of the organization;
* help employees achieve their career goals;
* improve performance in current positions;
* retain high performers; and:
* improve communication with customers, colleagues, managers, and
employees.
Selected Elements of the Performance Management System:
Competencies: Each division may implement regulations regarding the
competencies and criteria by which employees are rated. NAVSEA's
Dahlgren division uses three competencies for all employees, and the
Newport division uses eight competencies.
Feedback: Each division may implement regulations regarding the timing
and documentation of midyear feedback. Dahlgren requires at least one
documented feedback session at midyear. Beginning in fiscal year 2004,
Newport requires a documented midyear feedback session.
Self-assessment: Each division has the flexibility to determine whether
and how employees document their accomplishments. Dahlgren requires
employees to provide summaries of their contributions for their
supervisors' information. Newport encourages employees to provide self-
assessments.
Levels of performance rating: All of the divisions use the ratings
"acceptable" and "unacceptable."
Safeguards:
* Second-level review: Divisions are to design the performance
appraisal and payout process. Supervisors at Dahlgren's division and
department levels review ratings and payouts to ensure that the
competencies are applied uniformly and salary adjustments are
distributed equitably. At Newport, second-level supervisors review
recommendations by direct supervisors, make changes to achieve balance
and equity within the organization, then submit the recommendations to
pay pool managers, who are to go through the same process and forward
the recommendations to the department head for final approval.
* Grievance procedure: Divisions are to design their grievance
procedures. Dahlgren and Newport have informal and formal
reconsideration processes. In Dahlgren's informal process, the employee
and supervisor are to discuss the employee's concern and reach a mutual
understanding, and the pay pool manager is to approve any changes. If
the employee is not satisfied with the result of the informal process,
the employee is to submit a formal request to the pay pool manager, who
is to make the final decision. In Newport's informal process, the
employee is to submit a written request to the pay pool manager, who
may revise the rating and payout decision or confirm it. If the
employee is not satisfied with the result of the informal process, the
employee may formally appeal to the department head, who is to render a
decision.
Selected Employee Attitude Data:
Figure 15: Selected Employee Attitude Data for NAVSEA:
[See PDF for image]
Source: DOD.
Legend: N/A = data are not applicable; N = number of respondents.
[End of figure]
Other Interventions:
Advanced in-hire rate. To set, upon initial appointment, an
individual's pay anywhere within the band level consistent with the
qualifications of the individual and requirements of the position.
Scholastic achievement appointments. To employ an alternative examining
process that provides NAVSEA the authority to appoint undergraduates
and graduates to professional positions.
Selected Reported Effects:
A demonstration project evaluation reported the following
effects.[Footnote 21]
From 1996 to 2001:
* The percentage of people who agreed that their managers promote
effective communication among different work groups increased from 31
to 43 percent.
* On the other hand, the percentage of NAVSEA employees agreeing with
the statement "High performers tend to stay with this organization"
stayed constant at about 30 percent during this time.
* Additionally, the percentage of employees who said that they have all
of the skills needed to do their jobs remained roughly constant, at 59
percent in 1996 and 62 percent in 2001.
Sources for Additional Information:
[Hyperlink, http://www.nswc.navy.mil/wwwDL/XD/HR/DEMO/main.html]
(Last accessed on Nov. 7, 2003)
[Hyperlink, http://www.opm.gov/demos/main.asp]
(Last accessed on Nov. 7, 2003):
Civilian Acquisition Workforce Personnel Demonstration Project (AcqDemo):
[See PDF for image]
Source: GAO analysis of DOD and OPM data.
[A] Pub. L. No. 105-85 removed the 5,000 employee participant cap at
AcqDemo.
[End of figure]
Purpose:
AcqDemo was to:
* attract, motivate, and retain a high-quality acquisition workforce;
* achieve a flexible and responsive personnel system;
* link pay to employee contributions to mission accomplishment; and:
* gain greater managerial control and authority over personnel
processes.
Selected Elements of the Performance Management System:
Competencies: Six core contribution "factors," as well as
"discriminators" and "descriptors," are used to evaluate every
employee.
Feedback: AcqDemo requires at least one formal feedback session
annually and encourages informal and frequent communication between
supervisors and employees, including discussion of any inadequate
contribution. Each service, agency, or organization may require one or
more additional formal or informal feedback sessions.
Self-assessment: Employees can provide a list of contributions for each
factor.
Levels of performance rating: The levels are "appropriately
compensated," "over-compensated," and "under-compensated" (an
illustrative sketch follows the safeguards below).
Safeguards:
* Second-level review: The supervisors and the pay pool manager are to
ensure consistency and equity across ratings. The pay pool manager is
to approve the employee's overall contribution score, which is
calculated based on the employee's contribution ratings.
* Grievance procedure: Employees may grieve their ratings and actions
affecting the general pay increase or performance pay increases. An
employee covered by a negotiated grievance procedure is to use that
procedure to grieve his or her score. An employee not under a
negotiated grievance procedure is to submit the grievance first to the
rating official, who will submit a recommendation to the pay pool
panel. The pay pool panel may accept the rating official's
recommendation or reach an independent decision. The pay pool panel's
decision is final unless the employee requests reconsideration by the
official at the next level above the pay pool manager, who would then
render the final decision on the grievance.
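The following sketch illustrates, in Python, one way these three rating
levels could be derived from an employee's overall contribution score
and current salary. It assumes a standard pay line function that maps a
contribution score to an expected salary and a normal pay range bounded
by that line evaluated 4.0 points below and above the score (one reading
of the standard pay line described in footnote 11); the pay line, score
scale, and dollar figures are hypothetical placeholders, not AcqDemo's
actual parameters.

from typing import Callable

def classify_compensation(salary: float,
                          contribution_score: float,
                          pay_line: Callable[[float], float],
                          band: float = 4.0) -> str:
    # Bound the "normal pay range" by the pay line evaluated `band`
    # points below and above the contribution score (an assumed
    # geometry, not necessarily the project's exact rail definition).
    lower = pay_line(contribution_score - band)
    upper = pay_line(contribution_score + band)
    if salary > upper:
        return "over-compensated"
    if salary < lower:
        return "under-compensated"
    return "appropriately compensated"

# Hypothetical linear pay line running from a GS-1, step 1, equivalent
# to a GS-15, step 10, equivalent over a 0-100 contribution score scale.
def example_pay_line(score: float) -> float:
    low, high, max_score = 15000.0, 115000.0, 100.0
    score = max(0.0, min(score, max_score))
    return low + (high - low) * score / max_score

print(classify_compensation(72000.0, 55.0, example_pay_line))

Under this reading, an employee paid above the upper boundary for his or
her score would be rated over-compensated, and one paid below the lower
boundary under-compensated.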
Selected Employee Attitude Data:
Figure 16: Selected Employee Attitude Data for AcqDemo:
[See PDF for image]
Source: DOD.
Legend: N/A = data are not applicable; N = number of respondents.
[End of figure]
Other Interventions:
Voluntary emeritus program. To provide a continuing source of corporate
knowledge and valuable on-the-job training or mentoring by allowing
retired employees to voluntarily return without compensation and
without jeopardizing retirement pay.
Extended probationary period. To extend the probationary period to
account for time that new hires spend in education and training
assignments outside the supervisor's review, so that managers have
enough time to properly assess the contribution and conduct of new hires
in the acquisition environment.
Scholastic achievement appointment. To provide the authority to appoint
degreed candidates meeting desired scholastic criteria to positions
with positive education requirements.
Flexible appointment authority. To allow an agency to make a modified
term appointment to last from 1 to 5 years when the need for an
employee's services is not permanent.
Selected Reported Effects:
A demonstration project evaluation reported the following
effects.[Footnote 22]
* Attrition rates for over-compensated employees increased from 24.1
percent in 2000 to 31.6 percent in 2002. Attrition rates for
appropriately compensated employees increased from 11.5 percent in 2000
to 14.1 percent in 2002. Attrition rates for under-compensated employees
decreased from 9.0 percent in 2000 to 8.5 percent in 2001 and then
increased to 10.2 percent in 2002.
* Increased pay-setting flexibility has allowed organizations in
AcqDemo to offer more competitive salaries, which has improved
recruiting.
* Employees' perception of the link between pay and contribution
strengthened: the percentage reporting that pay raises depend on their
contribution to the organization's mission increased from 20 percent in
1998 to 59 percent in 2003.
Sources for Additional Information:
[Hyperlink, http://www.acq.osd.mil/acqdemo/]
(Last accessed on Nov. 7, 2003)
[Hyperlink, http://www.opm.gov/demos/index.asp]
(Last accessed on Nov. 7, 2003):
[End of section]
Appendix III: Comments from the Department of Defense:
OFFICE OF THE UNDER SECRETARY OF DEFENSE
4000 DEFENSE PENTAGON
WASHINGTON, D.C. 20301-4000:
PERSONNEL AND READINESS:
DEC 15 2003:
Mr. J. Christopher Mihm:
Director, Strategic Issues:
U.S. General Accounting Office:
Washington, DC 20548:
Dear Mr. Mihm:
This is the Department of Defense response to the GAO draft report,
GAO-04-XXX, HUMAN CAPITAL: Implementing Pay for Performance at Selected
Personnel Demonstration Projects, dated November 21, 2003 (GAO Code
450183).
We concur with the report but ask that you please add the following
note to Tables 6 and 7 on pages 30 and 32: NAVSEA-Dahlgren and Newport
did not separately track the cost of training and automation efforts
specifically related to the Personnel Demonstration Project over the
past 5 years. The cost data shown on pages 30-32 of the report are
estimates and may not accurately reflect the actual costs incurred.
The report provides a useful summary of the various approaches that the
personnel demonstration projects undertook to implement their pay for
performance systems. The experiences of the personnel demonstration
projects provide valuable insight into federal pay for performance
models. The Department's review of the successes, lessons learned, and
challenges of the personnel demonstration projects resulted in the
identification of the best practices in performance management. In
collaboration with our employee representatives, the Department will
now fashion these best practices into a meaningful pay for performance
model under the new National Security Personnel System.
We appreciate the opportunity to comment on the draft report.
Sincerely,
Signed by:
Charles S. Abell:
Principal Deputy:
[End of section]
Appendix IV: GAO Contacts and Staff Acknowledgments:
GAO Contacts:
J. Christopher Mihm, (202) 512-6806 or [Hyperlink, mihmj@gao.gov],
Lisa Shames, (202) 512-6806 or [Hyperlink, shamesl@gao.gov].
Acknowledgments:
In addition to the individuals named above, Michelle Bracy, Ron La Due
Lake, Hilary Murrish, Adam Shapiro, and Marti Tracy made key
contributions to this report.
(450183):
FOOTNOTES
[1] U.S. General Accounting Office, Results-Oriented Cultures: Creating
a Clear Linkage between Individual Performance and Organizational
Success, GAO-03-488 (Washington, D.C.: Mar. 14, 2003).
[2] U.S. General Accounting Office, Human Capital: DHS Personnel System
Design Effort Provides for Collaboration and Employee Participation,
GAO-03-1099 (Washington, D.C.: Sept. 30, 2003).
[3] Two governmentwide initiatives were intended to implement pay for
performance systems for supervisors and managers. The Merit Pay System
was established under the Civil Service Reform Act of 1978 and ended in
1984. Its successor--the Performance Management and Recognition
System--ended in 1993.
[4] The GS is the federal government's main pay system for "white-
collar" positions. The GS is composed of 15 grade levels. Each grade is
divided into 10 specific pay levels called "steps."
[5] See U.S. General Accounting Office, An Evaluation Framework for
Improving the Procurement Function (Exposure Draft) (Washington, D.C.:
October 2003), for more information on a framework to enable a high-
level, qualitative assessment of the strengths and weaknesses of
agencies' procurement functions.
[6] At DOC, all managerial and supervisory employees are also evaluated
on core critical elements, such as recommending or making personnel
decisions; developing and appraising subordinates; and fulfilling
diversity, equal opportunity, and affirmative action responsibilities,
in addition to program responsibilities.
[7] China Lake gives managers discretion in determining how awards are
distributed among employees with ratings of "fully successful" or
above.
[8] As a point of comparison, in 2002, about 48 percent of GS employees
across the executive branch under a similar five-level rating system
were rated in the highest category and less than 1 percent were rated
as less than fully successful.
[9] As a point of comparison, in 2002, about 92 percent of GS employees
across the executive branch under a similar four-level rating system
were rated in the top two categories and about 0.1 percent were rated
as unacceptable.
[10] As a point of comparison, in 2002, about 99.9 percent of GS
employees across the executive branch under a similar two-level rating
system passed and about 0.1 percent failed.
[11] The "standard pay line" spans from the dollar equivalent of GS-1,
step 1, to the dollar equivalent of GS-15, step 10. Appropriately
compensated employees' salaries fall within the "normal pay range,"
which encompasses an area of +/-4.0 points from the standard pay line.
[12] U.S. Office of Personnel Management, Demonstration Projects and
Alternative Personnel Systems: HR Flexibilities and Lessons Learned
(Washington, D.C.: September 2001).
[13] U.S. General Accounting Office, Human Capital: A Guide for
Assessing Strategic Training and Development Efforts in the Federal
Government (Exposure Draft), GAO-03-893G (Washington, D.C.: July 1,
2003).
[14] All dollars were inflation-adjusted to 2002 dollars because the
demonstration projects took place over a variety of years.
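As a purely illustrative sketch of this kind of adjustment (the report
does not state which price index or base values its evaluators used), a
nominal amount can be restated in 2002 dollars by scaling it with the
ratio of the 2002 index value to the index value for the year the amount
was reported; the index figures below are hypothetical.

def to_2002_dollars(nominal: float, index_year: float, index_2002: float) -> float:
    # Scale a nominal dollar amount by the ratio of the 2002 price index
    # to the index for the year the amount was reported (hypothetical
    # index values; the report does not specify its deflator).
    return nominal * (index_2002 / index_year)

# For example, $100,000 reported in a year with index 168.0, restated in
# 2002 dollars with index 180.0: 100000 * 180.0 / 168.0, about $107,143.
print(round(to_2002_dollars(100000.0, 168.0, 180.0)))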
[15] Booz Allen Hamilton, Department of Commerce Personnel Management
Demonstration Project Evaluation Year Four Report (McLean, Va.:
September 2003).
[16] The Navy Demonstration Project was also implemented at the Space
and Naval Warfare Systems Command in San Diego, California.
[17] Source: U.S. Office of Personnel Management, A Summary Assessment
of the Navy Demonstration Project (Washington, D.C.: February 1986).
[18] Source: U.S. Office of Personnel Management, Summative Evaluation
Report National Institute of Standards and Technology Demonstration
Project: 1988-1995 (Washington, D.C.: June 27, 1997).
[19] Source: Booz Allen Hamilton, Department of Commerce Personnel
Management Demonstration Project Evaluation Year Four Report (McLean,
Va.: September 2003).
[20] Sources: U.S. Office of Personnel Management, 2002 Summative
Evaluation DOD S&T Reinvention Laboratory Demonstration Program
(Washington, D.C.: August 2002), and DOD. The OPM report evaluated all
of the projects in the Science and Technology Reinvention Laboratory
Demonstration Program and presented the results together, rather than
by demonstration project. Data are based on survey information provided
by DOD.
[21] Sources: U.S. Office of Personnel Management, 2002 Summative
Evaluation DOD S&T Reinvention Laboratory Demonstration Program
(Washington, D.C.: August 2002), and DOD. The OPM report evaluated all
of the projects in the Science and Technology Reinvention Laboratory
Demonstration Program and presented the results together, rather than
by demonstration project. Data are based on survey information provided
by DOD.
[22] Source: Cubic Applications, Inc., DOD Civilian Acquisition
Workforce Personnel Demonstration Project: Interim Evaluation Report
Volume I - Management Report (Alexandria, Va.: July 2003).
GAO's Mission:
The General Accounting Office, the investigative arm of Congress,
exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability
of the federal government for the American people. GAO examines the use
of public funds; evaluates federal programs and policies; and provides
analyses, recommendations, and other assistance to help Congress make
informed oversight, policy, and funding decisions. GAO's commitment to
good government is reflected in its core values of accountability,
integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through the Internet. GAO's Web site (www.gao.gov) contains
abstracts and full-text files of current reports and testimony and an
expanding archive of older products. The Web site features a search
engine to help you locate documents using key words and phrases. You
can print these documents in their entirety, including charts and other
graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document
files. To have GAO e-mail this list to you every afternoon, go to
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order
GAO Products" heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. General Accounting Office:
441 G Street NW, Room LM:
Washington, D.C. 20548:
To order by Phone:
Voice: (202) 512-6000:
TDD: (202) 512-2537:
Fax: (202) 512-6061:
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470:
Public Affairs:
Jeff Nelligan, managing director, NelliganJ@gao.gov, (202) 512-4800:
U.S. General Accounting Office, 441 G Street NW, Room 7149:
Washington, D.C. 20548: