Human Capital

DOD Needs to Improve Implementation of and Address Employee Concerns about Its National Security Personnel System
GAO ID: GAO-08-773 September 10, 2008

The Department of Defense (DOD) has begun implementing the National Security Personnel System (NSPS), its new human capital system for managing civilian personnel performance. As of May 2008, about 182,000 civilian employees were under NSPS. DOD's implementation of NSPS will have far-reaching implications for DOD and civil service reform across the federal government. Based on our prior work looking at performance management in the public sector and DOD's challenges in implementing NSPS, GAO developed an initial list of safeguards that NSPS should include to ensure it is fair, effective, and credible. Congress required GAO to determine (1) the extent to which DOD has implemented internal safeguards to ensure the fairness, effectiveness, and credibility of NSPS; and (2) how DOD civilian personnel perceive NSPS and what actions DOD has taken to address these perceptions. To conduct this work, GAO analyzed relevant documents and employee survey results; interviewed appropriate officials; and conducted discussion groups with employees and supervisors at 12 selected installations.

While DOD has taken some steps to implement internal safeguards to ensure that NSPS is fair, effective, and credible, the implementation of some safeguards could be improved. Specifically, DOD has taken steps to (1) involve employees in the system's design and implementation, (2) link employee objectives and agency goals, (3) train employees on the system's operation, (4) require ongoing performance feedback between supervisors and employees, (5) better link individual pay to performance, (6) allocate agency resources for the system, (7) include predecisional safeguards to determine if rating results are fair and nondiscriminatory, (8) provide reasonable transparency, and (9) provide meaningful distinctions in employee performance. GAO believes continued monitoring of all of these safeguards is needed to ensure that DOD's actions are effective as more employees become covered by NSPS. GAO also determined that DOD could immediately improve its implementation of three safeguards. First, DOD does not require a third party to analyze rating results for anomalies prior to finalizing employee ratings, and therefore it is unable to determine whether ratings are fair and nondiscriminatory before they are finalized. Second, the process lacks transparency because DOD does not require commands to publish final rating distributions, though doing so is recognized as a best practice by DOD and GAO. Third, NSPS guidance may discourage rating officials from making meaningful distinctions in employee ratings because it indicated that the majority of employees should be rated at the "3" level, on a scale of 1 to 5, resulting in a hesitancy to award ratings in other categories. Without steps to improve implementation of these safeguards, employee confidence in the system will ultimately be undermined. Although DOD employees under NSPS are positive regarding some aspects of performance management, DOD does not have an action plan to address the generally negative employee perceptions of NSPS. According to DOD's survey of civilian employees, employees under NSPS are positive about some aspects of performance management, such as connecting pay to performance. However, employees who had the most experience under NSPS showed a negative movement in their perceptions. For example, the percent of NSPS employees who believe that NSPS will have a positive effect on DOD's personnel practices declined from 40 percent in 2006 to 23 percent in 2007. Negative perceptions also emerged during discussion groups that GAO held. For example, employees and supervisors were concerned about the excessive amount of time required to navigate the process. Although the Office of Personnel Management issued guidance recommending that agencies use employee survey results to provide feedback to employees and implement an action plan to guide their efforts to address employee assessments, DOD has not developed an action plan to address employee perceptions. While it is reasonable for DOD to allow employees some time to accept NSPS because organizational changes often require time to adjust, it is prudent to address persistent negative employee perceptions. Without such a plan, DOD is unable to make changes that could result in greater employee acceptance of NSPS.

Recommendations

Our recommendations from this work are listed below, along with a contact for more information. Status will change from "In process" to "Open," "Closed - implemented," or "Closed - not implemented" based on our follow-up work.



Report to Congressional Committees

United States Government Accountability Office

September 2008

Human Capital: DOD Needs to Improve Implementation of and Address Employee Concerns about Its National Security Personnel System

GAO-08-773

GAO Highlights:

Highlights of GAO-08-773, a report to congressional committees.

What GAO Recommends:

GAO is recommending that DOD improve the implementation of some safeguards and develop and implement an action plan to address employee concerns about NSPS. DOD generally concurred with our recommendations, with the exception of one requiring predecisional review of ratings.

To view the full product, including the scope and methodology, click on [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-08-773].
For more information, contact Brenda S. Farrell at (202) 512-3604 or farrellb@gao.gov.

[End of section]

Contents:

Letter:
Results in Brief:
Background:
DOD Has Taken Steps to Implement Internal Safeguards to Ensure Fairness of NSPS; However, Implementation of Some Safeguards Could Be Improved:
Although DOD Civilian Employees under NSPS Identified Some Positive Aspects of the System, DOD Does Not Have a Plan for Addressing the Generally Negative Employee Perceptions of NSPS:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:

Appendix I: Scope and Methodology:
Appendix II: Example of Linking Performance to Mission and Objectives:
Appendix III: Additional Responses to 2007 Status of Forces Survey of DOD Civilian Employees:
Appendix IV: Other Themes Discussed by Department of Defense Civilians during GAO Discussion Groups:
Appendix V: Comments from the Department of Defense:
Appendix VI: GAO Contact and Staff Acknowledgments:

Related GAO Products:

Tables:
Table 1: Number of DOD Civilian Employees Phased into NSPS, as of May 2008:
Table 2: Percentage of Employees in Each Rating Category by DOD and Pay Pools Visited:
Table 3: Estimated Percentage of Responses from Status of Forces Survey for DOD Civilian Employees, May 2007:
Table 4: Estimated Percentage of Spiral 1.1 Employees' Responses for Select Questions from the May 2007, November 2006, and May 2006 Administrations of the Status of Forces Survey for DOD Civilian Employees:
Table 5: Composition of Discussion Groups:
Table 6: Composition of Discussion Groups by Demographic Category per Component:
Table 7: Estimated Percentage of Employees Responding to Questions about Overall Satisfaction and Leadership and Management in May 2007 Status of Forces Survey-Civilian:
Table 8: Estimated Percentage of Employees Responding to Questions about Leadership and Management, Motivation/Development/Involvement, and Performance Management in May 2007 Status of Forces Survey-Civilian:
Table 9: Estimated Percentage of Employees Responding to Question about Performance Management in May 2007 Status of Forces Survey-Civilian:
Table 10: Estimated Percentage of Employees Responding to Question about Performance Management in May 2007 Status of Forces Survey-Civilian:
Table 11: Estimated Percentage of Employees Responding to Questions about Retention and Commitment in May 2007 Status of Forces Survey-Civilian:
Table 12: Estimated Percentage of Employees Responding to Questions about the National Security Personnel System in May 2007 Status of Forces Survey-Civilian:
Table 13: Estimated Percentage of Employees Responding to Questions about the National Security Personnel System in May 2007 Status of Forces Survey-Civilian:
Table 14: Additional Themes that Emerged during Discussion Groups with Select Employees:

Figures:
Figure 1: NSPS Design and Implementation Team Organization:
Figure 2: Phases of NSPS Performance Management Process:
Figure 3: Example of NSPS Pay Pool Organization:
Figure 4: Example of Linking Performance to Mission and Objectives:

Abbreviations:
DOD: Department of Defense:
NSPS: National Security Personnel System:
SOFS: Status of Forces Survey:
PEO: Program Executive Office:
PAA: Performance Appraisal Application:
DMDC: Defense Manpower Data Center:

[End of section]

United States Government Accountability Office:
Washington, DC 20548:

September 10, 2008:

Congressional Committees:

In 2007, we reported that strategic human capital management remained a high-risk area because the federal government now faces one
of the most significant transformations to the civil service in half a century, as momentum grows toward making governmentwide changes to agency pay, classification, and performance management systems.[Footnote 1] The Department of Defense (DOD) is in the initial stages of implementing its new human capital system for managing civilian personnel--the National Security Personnel System (NSPS). NSPS significantly redesigned the rules, regulations, and processes that govern the way that civilian employees are hired, compensated, and promoted at DOD. As a result, DOD is in a period of transition and faces an array of challenges and opportunities to enhance performance, ensure accountability, and position itself for the future. In a series of testimonies prior to the enactment of the NSPS legislation in 2003, we raised a number of critical issues about the proposed regulations for NSPS.[Footnote 2] Since then, we have provided congressional committees with insight on DOD's process to design its new personnel management system, the extent to which DOD's process reflects key practices for successful transformation, the need for internal controls and transparency of funding, and the most significant challenges facing DOD in implementing NSPS.[Footnote 3] While GAO supports human capital reform in the federal government, how such reform is done, when it is done, and the basis upon which it is done can make all the difference in whether such efforts are successful. Specifically, we have noted in testimonies and reports that DOD and other federal agencies must ensure that performance management systems contain appropriate internal safeguards, such as assuring reasonable transparency in connection with the results of the performance management process. We developed an initial list of safeguards based on our extensive body of work looking at the performance management practices used by leading public sector organizations both in the United States and in other countries as well as on our experiences in implementing a modern performance management system for our own staff at GAO.[Footnote 4] Implementing internal safeguards is a way to ensure that pay-for-performance systems in the government are fair, effective, and credible.[Footnote 5] Additionally, we reported that the implementation of NSPS will have far-reaching implications, not just for DOD, but for civil service reform across the federal government because NSPS could serve as a model for governmentwide transformation in human capital. In light of these challenges and implications, in March 2007 the Senate Armed Services Committee asked us to review the implementation of the NSPS performance management system to determine the extent to which DOD has effectively incorporated internal safeguards that we had previously identified as key to successful implementation of performance management systems in the federal government and assess employee attitudes toward NSPS. Further, the National Defense Authorization: Act for Fiscal Year 2008[Footnote 6] required us to determine the extent to which DOD has effectively incorporated accountability mechanisms and internal safeguards in NSPS and to assess employee attitudes toward NSPS. We assessed the extent to which DOD's performance management system has incorporated the following safeguards:[Footnote 7] * Involve employees, their representatives, and other stakeholders in the design of the system, to include employees directly involved in validating any related implementation of the system. 
* Assure that the agency's performance management system links employee objectives to the agency's strategic plan, related goals, and desired outcomes.
* Implement a pay-for-performance evaluation system to better link individual pay to performance, and provide an equitable method for appraising and compensating employees.
* Provide adequate training and retraining for supervisors, managers, and employees in the implementation and operation of the performance management system.
* Institute a process for ensuring ongoing performance feedback and dialogue between supervisors, managers, and employees throughout the appraisal period, and setting timetables for review.
* Assure that certain predecisional internal safeguards exist to help achieve consistency, equity, nondiscrimination, and nonpoliticization of the performance management process (e.g., independent reasonableness reviews by a third party or reviews of performance rating decisions, pay determinations, and promotions before they are finalized to ensure that they are merit-based, as well as pay panels who consider the results of the performance appraisal process and other information in connection with final pay decisions).
* Assure that there are reasonable transparency and appropriate accountability mechanisms in connection with the results of the performance management process, including periodic reports on internal assessments and employee survey results relating to performance management and individual pay decisions while protecting individual confidentiality.
* Assure that the agency's performance management system results in meaningful distinctions in individual employee performance.
* Provide a means for ensuring that adequate agency resources are allocated for the design, implementation, and administration of the performance management system.

To address this congressional request and mandate, we established the following objectives: (1) To what extent has DOD implemented accountability mechanisms and internal safeguards to ensure the fairness, effectiveness, and credibility of NSPS; and (2) How do DOD civilian personnel perceive NSPS and what actions has DOD taken to address these perceptions? To determine the extent to which DOD had implemented safeguards to ensure the fairness, effectiveness, and credibility of NSPS, we identified safeguards specified in the National Defense Authorization Act for Fiscal Year 2008, as well as other safeguards GAO has previously identified as key internal safeguards, and analyzed regulations and other guidance provided by officials in DOD and the four components' headquarters--the Army, Navy, Air Force, and Fourth Estate.[Footnote 8] We also reviewed documents, such as pay pool business rules and regulations obtained during 12 site visits--3 for each component--to military installations. Further, we interviewed appropriate agency officials at various levels within DOD and conducted interviews with officials of various management levels at each site we visited. The sites were selected because they contained a large number or concentrated group of civilian employees who had been placed under NSPS and were geographically distributed throughout the United States. In addition, to determine how DOD civilian employees perceive NSPS, we analyzed the results of DOD's May 2006, November 2006, and May 2007 Status of Forces Survey (SOFS) of civilian employees.
These surveys gauge initial employee attitudes toward NSPS and in our analysis, we begin to identify trends.[Footnote 9] Further, we assessed DOD's survey methodology and found that DOD's surveys of DOD civilians were generally conducted in accordance with standard research practices; however, there were some areas that could be improved. We also conducted small group discussions with employees and supervisors at each of the 12 sites we visited. While the information from our discussion groups is not generalizable to the entire population of DOD civilians, it provides valuable insight into civilians' perceptions about the implementation of NSPS. For more information about our scope and methodology, see appendix I. We conducted this performance audit from August 2007 to July 2008 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Results in Brief: While DOD has taken some steps to implement internal safeguards to ensure that the NSPS performance management system is fair, effective, and credible, the implementation of some of these safeguards could be improved. Specifically, DOD has taken some steps to (1) involve employees in the system's design and implementation; (2) link employee objectives and the agency's strategic goals and mission; (3) train and retrain employees in the system's operation; (4) provide ongoing performance feedback between supervisors and employees; (5) better link individual pay to performance in an equitable manner; (6) allocate agency resources for the system's design, implementation, and administration; (7) include predecisional internal safeguards to determine whether rating results are consistent, equitable, and nondiscriminatory; (8) provide reasonable transparency of the system and its operation; and (9) impart meaningful distinctions in individual employee performance. For example, all 12 sites we visited trained employees on NSPS, and the DOD-wide tool used to compose self- assessments links employees' objectives to the commands' or agencies' strategic goals and mission. We believe continued monitoring of all of these safeguards is needed to ensure that DOD's actions are effective as implementation proceeds and more employees become covered by NSPS. We also determined that DOD could immediately improve its implementation of three safeguards: predecisional internal safeguards, reasonable transparency, and meaningful distinctions in employee performance. First, DOD is unable to determine whether NSPS rating results are nondiscriminatory before they are finalized because it does not require a third party to analyze the predecisional rating results for anomalies. According to Program Executive Office (PEO) officials, DOD does not require a predecisional analysis because of concerns that pay pool panels might adjust their results even if assessments did not warrant changes. PEO officials also stated that DOD's analysis of final results by demographics is sufficient to ensure fairness and nondiscrimination. Second, employees at some installations do not have transparency over the final results of the performance management process because DOD does not require commands to publish rating distributions for employees. 
In fact, 3 of the sites we visited decided not to publish the overall final rating and share distribution results. Third, NSPS performance management guidance may discourage rating officials from making meaningful distinctions in employee performance because this guidance emphasized that most employees should be evaluated as a "3" (or "valued performer") on a scale of 1 to 5. According to NSPS implementing issuance, rating results should be based on how well employees complete their job objectives using the performance indicators. Although DOD and most of the installations we visited emphasized that there was not a forced distribution of ratings, some pay pool panel members acknowledged that there was a hesitancy to award employee ratings in categories other than "3". Until DOD effectively implements these three safeguards, employees will not have assurance that NSPS is fair, equitable, and credible, which ultimately could undermine employees' confidence and result in failure of the system. We are recommending that DOD improve the implementation of these three safeguards by (1) requiring a third party to perform predecisional demographic and other analysis as appropriate for pay pools, (2) requiring overall final rating results to be published, and (3) encouraging pay pools and supervisors to use all categories of ratings as appropriate. In commenting on a draft of this report, DOD concurred with our recommendation to require overall final rating results to be published and partially concurred with our recommendation to encourage pay pools and supervisors to use all categories of ratings as appropriate. DOD did not concur with our recommendation to require a third party to perform predecisional demographic analysis as appropriate for pay pools, noting, among other things, that postdecisional analysis of results is more useful to identify barriers and corrective actions. We, however, continue to believe that our recommendation has merit and that identifying an anomaly in the ratings prior to finalizing them would allow management to investigate the situation and determine whether any non-merit-based factors contributed to the anomaly. Although DOD employees under NSPS are positive regarding some aspects of the NSPS performance management system, DOD does not have an action plan to address the generally negative employee perceptions of NSPS identified in both the department's SOFS for civilian employees and discussion groups we held at 12 select installations. According to our analysis of DOD's most recent survey from May 2007, NSPS employees expressed slightly more positive attitudes than their DOD colleagues who remain under the General Schedule system about some goals of performance management, such as connecting pay to performance and receiving feedback regularly. For example, an estimated 43 percent of NSPS employees compared to an estimated 25 percent of all other DOD employees said that pay raises depend on how well employees perform their jobs.[Footnote 10] However, responses from NSPS employees with the most experience under NSPS showed a downward movement in their attitude toward other elements of the system. For example, the estimated percentage of employees who agreed that their performance appraisal was a fair reflection of their performance declined from 67 percent in May 2006 to 52 percent May 2007. In addition, the percent of NSPS employees who believe that NSPS will have a positive effect on DOD's personnel practices dropped from 40 percent in May 2006 to 23 percent in 2007. 
Our focus group meetings gave rise to views consistent with DOD's survey results. While some civilian employees and supervisors under NSPS seemed optimistic about the intent of the system, most of the DOD employees and supervisors we spoke with expressed a consistent set of wide-ranging concerns. Specifically, employees noted: (1) NSPS's negative impact on employee motivation and morale, (2) the excessive amount of time and effort required to navigate the performance management process, (3) the potential influence that employees' and supervisors' writing skills have on panels' assessments of employee ratings, (4) the lack of transparency and understanding of the pay pool panel process, and (5) the rapid pace at which the system was implemented, which often resulted in employees feeling unprepared and unable to find answers to their questions. These negative attitudes are not surprising given that organizational transformations often entail fundamental and radical change that require an adjustment period to gain employee acceptance and trust. To address employee attitudes and acceptance, the Office of Personnel Management issued guidance that recommends--and we believe it is a best practice--that agencies use employee survey results to provide feedback to employees and develop and implement an action plan that guides their efforts to address the results of employee assessments. However, according to PEO officials, DOD has not developed a specific action plan to address critical issues identified by employee perceptions, because they want employees to have more time under the system before making changes. Without such a plan, DOD is unable to make changes that address employee perceptions that could result in greater employee acceptance and, ultimately, the successful implementation of the performance management system. We are recommending that DOD develop and implement a specific action plan to address employee perceptions of NSPS ascertained from DOD's surveys and employee focus groups. The plan should include actions to mitigate employee concerns about, for example, the potential influence that employees' and supervisors' writing skills have on the panels' assessment of employee ratings or other issues consistently identified by employees or supervisors. DOD partially concurred with our recommendation, noting that it will address areas of weakness identified in its comprehensive, in progress evaluation of NSPS and is institutionalizing a continuous improvement strategy. Background: The National Defense Authorization Act for Fiscal Year 2004 provided DOD with the authority to establish a pay-for-performance management system as part of NSPS.[Footnote 11]DOD established a team to design and implement NSPS and manage the transformation process. In April 2004, the Secretary of Defense appointed an NSPS Senior Executive to, among other things, design, develop, and implement NSPS. Under the Senior Executive's authority, the Program Executive Office (PEO) was established as the central policy and program office for NSPS. The PEO's responsibilities includes designing the human resource/pay-for- performance systems, developing communication and training strategies, modifying personnel information technology, and preparing joint enabling regulations and internal DOD implementing regulations. As the central DOD-wide program office, the PEO directs and oversees the components' NSPS program managers, who report to their parent components and the NSPS PEO. 
These program managers also serve as their components' NSPS action officers and participate in the development, planning, implementation, and deployment of NSPS. Figure 1 shows the organization of the NSPS design and implementation team. Figure 1: NSPS Design and Implementation Team Organization: [See PDF for image] This figure is an illustration of an organizational chart, as follows: (direct reporting authority unless otherwise indicated) Senior Executive: - Overarching Integrated Product Team (indirect reporting authority); * Program Executive Officer (PEO); - Chief of Staff; - Deputy PEO; - Senior Advisory Group (indirect reporting authority); - Army Program Management Office (indirect reporting authority); - Air Force Program Management Office (indirect reporting authority); - Department of the Navy Program Management Office[A] (indirect reporting authority); - Washington Headquarters Service Program Management Office[B] (indirect reporting authority); - Training; - Implementation and deployment; - Program Evaluation; - Human Resources Information Systems; - Legislative and public affairs; - Legal; - Budget and financial management. * Director, Human Resources Systems; - Deputy; * Director, Labor Relations and Appeals; - Deputy. Source: DOD. [A] Includes the U.S. Navy and the U.S. Marine Corps. [B] The Washington Headquarters Services is a field activity that reports to the Director of Administration and Management, which has oversight responsibility for DOD's "Fourth Estate" entities. The "Fourth Estate" encompasses those organizational entities in DOD that are not in the military departments or the combatant commands. These include the Office of the Secretary of Defense, the Joint Staff, the Office of the Inspector General of DOD, the defense agencies, and DOD field activities. [End of figure] Table 1 shows DOD has phased (or spiraled) in over 182,000 civilian employees into NSPS as of May 2008.[Footnote 12] Subsequently, the National Defense Authorization Act for Fiscal Year 2008 prohibited the Secretary of Defense from converting more than 100,000 employees to NSPS in any calendar year. In response to this and other legislative changes that resulted in revising NSPS regulations, the PEO has not developed a new timeline for phasing in the remaining approximately 273,000 employees.[Footnote 13] Table 1: Number of DOD Civilian Employees Phased into NSPS, as of May 2008: Spiral: 1.1; Number of employees: 11,391. Spiral: 1.2; Number of employees: 67,586. Spiral: 1.3; Number of employees: 35,147. Spiral: 2.1; Number of employees: 17,305. Spiral: 2.2; Number of employees: 50,438. Spiral: Employees not associated with a particular spiral; Number of employees: 763. Spiral: Total number of employees; Number of employees: 182,630. Source: GAO analysis of DOD data. Note: Employees not associated with a particular spiral--or conversion group--are employees who are currently under NSPS, but whose positions were not coded to show the spiral. [End of table] The performance management process of NSPS is ongoing and consists of several phases that are repeated in each annual performance cycle, as shown in figure 2. The planning phase that starts the cycle involves supervisors (or rating officials) and employees working together to establish performance plans. 
This includes (1) developing job objectives--the critical work employees perform that is aligned with their organizational goals and focused on results--and (2) identifying contributing factors--the attributes and behaviors that identify how the critical work established in the job objectives is going to be accomplished (e.g., cooperation and teamwork). After the planning phase comes the monitoring and developing phase, during which ongoing communication between supervisors and employees occurs to ensure that work is accomplished; attention is given to areas that need to be addressed; and managers, supervisors, and employees have a continued and shared understanding of expectations and results. In the rating phase, the supervisor prepares a written assessment that captures the employee's accomplishments during the appraisal period. In the final-- or reward--phase, employees should be appropriately rewarded or compensated for their performance with performance payouts. During this phase, employee assessments are reviewed by multiple parties to determine employees' ratings and, ultimately, performance payouts. Figure 2: Phases of NSPS Performance Management Process: [See PDF for image] This figure illustrates a repeating loop of the phases of NSPS Performance Management Process, as follows: Plan; Monitor; Develop; Rate; Reward; repeat the loop. Source: GAO rendering of DOD data. [End of figure] The performance management process under NSPS is organized by pay pools. A pay pool is a group of employees who share in the distribution of a common pay-for-performance fund.[Footnote 14] The key parties that make up pay pools are the employee, supervisor, higher-level rating authority, pay pool panel, pay pool manager, performance review authority, and, in some instances, the sub-pay pool[Footnote 15] as shown in figure 3. Figure 3: Example of NSPS Pay Pool Organization: [See PDF for image] This figure is an illustration of the NSPS Pay Pool Organization, as follows: Performance review authority (PRA); Pay Pool Manager; * Pay Pool panel member; - Rating official; employees; - Rating official; employees; * Pay Pool panel member (sub-pay pool manager); - Rating official (sub-pay pool panel member); employees; - Rating official (sub-pay pool panel member); employees; - Rating official (sub-pay pool panel member); employees. Source: DOD. [End of figure] Each of these groups has defined responsibilities under the performance management process. For example, employees are encouraged to be involved throughout the performance management cycle, including: initially working with their supervisors to develop job objectives and identify associated contributing factors; identifying and recording accomplishments and results throughout the appraisal period; and participating in interim reviews and end-of-year assessments, for example by preparing self-assessments. Supervisors (or rating officials) are responsible for effectively managing the performance of their employees. 
This includes: * clearly communicating performance expectations; * aligning performance expectations and employee development with organization mission and goals; * working with employees to develop written job objectives reflective of expected accomplishments and contributions for the appraisal period and identifying applicable contributing factors; * providing employees meaningful, constructive, and candid feedback relative to performance expectations, including at least one documented interim review; * making meaningful distinctions among employees based on performance and contribution; and: * providing recommended ratings of record, share assignments, and payout distributions to the pay pool. The higher level reviewer, typically the rating official's supervisor, is responsible for reviewing and approving job objectives and recommended employee assessments. The higher level reviewer is the first step in assuring consistency of ratings, because this individual looks across multiple ratings. The next level of review is with the pay pool panel or, in some cases, the sub-pay pool panel. The pay pool panel is a board of management officials who are usually in positions of line authority or in senior staff positions with resource oversight for the organizations, groups, or categories of employees comprising the pay pool membership.[Footnote 16] The primary function of the pay pool panel is the reconciliation of ratings of record, share distribution, and payout allocation decisions. Each pay pool has a manager who is responsible for providing oversight of the pay pool panel. The pay pool manager is the final approving official of the rating of record. Performance payout determinations may be subject to higher management review by the performance review authority[Footnote 17] or equivalent review process. The performance review authority provides oversight of several pay pools, and addresses the consistency of performance management policies within a component, major command, field activity, or other organization as determined by the component. DOD Has Taken Steps to Implement Internal Safeguards to Ensure Fairness of NSPS; However, Implementation of Some Safeguards Could Be Improved: Although DOD has taken some steps to implement internal safeguards to ensure that the NSPS performance management system is fair, effective, and credible, implementation of some safeguards could be improved. Specifically, DOD has taken some steps to implement the safeguards identified in the National Defense Authorization Act for Fiscal Year 2008 as well as safeguards GAO previously identified. These safeguards include: (1) involving employees in the design and implementation of the system; (2) linking employee objectives and the agency's strategic goals and mission; (3) training and retraining employees and supervisors in the system's operation; (4) requiring ongoing performance feedback between supervisors and employees; (5) providing a system to better link individual pay to performance in an equitable manner; (6) allocating agency resources for the design, implementation, and administration of the system; (7) including predecisional internal safeguards to determine whether rating results are consistent, equitable, and nondiscriminatory; (8) providing reasonable transparency of the system and its operation; and (9) assuring meaningful distinctions in individual employee performance. 
GAO has previously reported that agencies should continually perform management controls, such as monitoring of programs.[Footnote 18] We further reported that agencies can conduct this ongoing monitoring internally or through separate evaluations that are performed by the agency Inspector General or an external auditor, such as GAO. While we believe continued monitoring of all of these safeguards is needed to ensure that DOD's actions are effective as implementation proceeds and more employees become covered by NSPS, we determined that DOD's implementation of three safeguards--predecisional internal safeguards, reasonable transparency, and meaningful distinctions--could be improved immediately. Until DOD effectively implements these safeguards, employees will not have assurance that the system is fair, equitable, and credible, which ultimately could undermine employees' confidence and result in failure of the system. Involve Employees in the Design and Implementation of the System: DOD has taken several steps to involve employees and their stakeholders in the design and implementation of NSPS. For example, DOD solicited comments from employees and unions representing DOD employees during the design of NSPS. Specifically, PEO officials said the department received over 58,000 comments from people in response to the proposed rules published in the Federal Register during the design phase. [Footnote 19] These PEO officials further stated that unions were appropriately engaged in the process and were afforded the opportunity to comment on NSPS through the formal "meet and confer" process with the union coalition.[Footnote 20] However, according to union representatives we spoke with, DOD did not appropriately involve the unions in the design of NSPS. Moreover, in 2005, unions representing DOD employees filed a lawsuit against DOD claiming, among other things, that DOD blocked the unions from meaningful participation in developing NSPS regulations. However, the U.S. District Court for the District of Columbia ruled in favor of DOD, finding that it satisfied its statutory obligation to collaborate with the unions.[Footnote 21] Initially, according to PEO officials, DOD involved some civilian employees in the preliminary design stages of NSPS. For example, in 2004, PEO sponsored about 100 focus groups throughout DOD, including overseas locations.[Footnote 22] Through these focus groups, PEO received comments, ideas, and suggestions, which were summarized and used in various design elements of NSPS. During 2004, DOD also conducted town hall meetings both domestically and overseas to provide employees with information about the status of the design and development of NSPS, communicate with the workforce, and solicit additional thoughts and ideas. Some of these town hall meetings were broadcast live and videotapes of some of these meetings were later rebroadcast on military television channels and websites. The performance management system in DOD's original implementing issuance was based on employees being rated on standard performance factors such as cooperation and teamwork. However, according to a PEO official, DOD received comments from management officials, individual employees, and unions representing DOD employees opposing this approach. As a result, DOD changed the performance management system. 
At the time of our review, supervisors rated employees on specific job objectives that were either written for the individual employee by the rating official and employee or were standard for an organization, e.g., the Army's standard supervisory objective. In addition, the original performance factors became contributing factors that are identified as essential to completing the job objective. Furthermore, as part of the system design in 2005, DOD awarded a contract to develop a performance factor model and associated benchmark descriptors to use in NSPS. The contractors conducted workshops with a sample of 95 "experienced" employees from three occupation categories- -professional/analytic, supervisory/managerial, and technician/ support. During these workshops, the participants reviewed and revised the performance factors and work behaviors to ensure their relevance, accuracy, and applicability across jobs and organizational components, so that the factors and work behaviors could serve as a basis for clearly communicating performance expectations. Following the workshops, the contractor administered Web-based questionnaires to all DOD employees who would be covered by NSPS. These questionnaires asked employees to rate the importance of each work behavior statement defining a performance factor in terms of its importance for performing their jobs. Valid survey responses were received from approximately 14 percent--or 71,000 employees. The responses from this survey were then used to refine the performance factors, all but one of which were, at the time of this review, functioning as the contributing factors and were used to augment employees' ratings. These performance factors--or contributing factors--are: communication, cooperation and teamwork, critical thinking, customer focus, leadership, resource management, and technical proficiency.[Footnote 23] Employees on the supervisory/ managerial pay schedule have an additional contributing factor-- supervision. The employees and management officials we met with at the installations we visited generally were not involved in either the design or implementation of NSPS at the DOD level; however, we generally found that employees served a contributory role in implementing NSPS at their respective installations and commands. For example, employees at several bases were involved in developing lessons learned following the end of each performance management cycle. Their input was sought through a variety of methods, including e-mail, group discussions, and surveys. During implementation of the system at the DOD level, we generally found that some employees were involved in assessments of the process after the performance cycle. For example, the PEO conducted focus groups with select employees across all of the components. In addition, PEO engaged management from across the components in lessons learned sessions. Link Employee Objectives to the Agency's Strategic Goals and Mission: DOD has made efforts to link employees' objectives to the agency's strategic goals, mission, and desired outcomes. The DOD-wide tool for employee self-assessments and appraisals--the Performance Appraisal Application (PAA)--provides a designated area for the employee's command's mission to be inserted as a guide while employees compose their job objectives and self-assessments. 
Many NSPS management officials, pay pool panel members, and supervisors we spoke with said that the incorporation of the overall goals in the PAA was a first step in facilitating employees' ability to link their objectives to the agency's goals and missions. In addition, management officials at the sites we visited told us that they verbally communicated to employees how their specific roles facilitate the overall mission during group discussions or other venues. Management officials at some installations stated that they also encouraged first-line supervisors to have conversations about this relationship with their employees. Furthermore, one installation we visited provided employees with a briefing slide that visually explained how employees at that activity fit into the overall component's mission and desired outcomes (see app. II). Training and Retraining in the System's Implementation and Operation: DOD encouraged employees who were transitioning to NSPS to receive a 10- hour training course that covered skills and behaviors necessary to implement and sustain NSPS, foster support and confidence in the system, and facilitate the transition to a performance-based, results- oriented culture. Program officials from all components told us that they required employees who were transitioning to NSPS to take training on NSPS. Specifically, this included all employees among the military services, and at least 80 percent of employees among defense agencies and activities under the fourth estate. Further, we found that the 12 sites we visited provided DOD's introductory training on NSPS to all employees, as well as an additional introductory course for supervisors. DOD also offered specialized training for functional areas covered by the NSPS regulations, such as for supervisors/managers. These specialized training courses cover pay banding, staffing flexibilities, and performance management, among other topics. The core functional training includes 18 hours of basic training and 24 hours of pay pool panel training for managers and supervisors, 10 hours for employees, and 26 or more hours for human resource practitioners. Further, courses aimed at managers and supervisors focus heavily on the performance management aspect of NSPS, and address goal-setting, communicating with employees, and linking individual expectations to the goals and objectives of the organization. DOD also focused on change management training to address the behavioral aspects of moving to NSPS and to better prepare the workforce for the changes that will result from the new system's implementation. Training on NSPS was provided via printed materials such as brochures or pamphlets, Web- based training, and classroom instructor-led training. In addition, some of the installations we visited supplied or had plans to incorporate supplemental training on subjects such as writing self assessments. Moreover, component program officials told us that the components have plans for sustainment training, which is largely the responsibility of the individual components. For example, the Army has incorporated NSPS training into its course for newly promoted supervisors. The Navy is developing just-in-time vignettes and additional training on "soft skills," such as feedback, which will be available for both supervisors and employees. Further, DOD had a number of online training options for employees and supervisors. 
Ongoing Performance Feedback and Dialogue between Supervisors and Employees: DOD's implementing issuances require supervisors to provide regular and timely performance feedback that is meaningful, constructive, and candid, including at least one documented interim review and an annual performance appraisal during each performance appraisal period. At 10 of the sites we visited, supervisors told us that they communicated performance ratings and feedback to employees in person, as encouraged by DOD. Furthermore, DOD's online system--PAA--allowed supervisors and employees to document interim, final, and any other formal feedback sessions. System to Better Link Individual Pay to Performance in an Equitable Manner: The structure of NSPS, as it was designed, is intended to allow linkage between individual pay and performance in an equitable manner. For example, NSPS has a multirating system that allows distinctions to be made in employee performance, and therefore compensation. For instance, within the five rating categories, employees may receive various numbers of shares according to their rating of record. Since the number of shares awarded determines the employee's overall payout, awarding various numbers of shares permits further granularity--or distinctions- -in linking employees' performance and pay. Moreover, several of the pay pool panel members we spoke with told us they used discretion in assigning higher ratings to ensure that the share value remained significant, and therefore facilitated greater pay increases for those employees awarded more shares--or higher ratings. Means to Ensure that Adequate Agency Resources Are Allocated for System Design, Implementation, and Administration: DOD has taken steps to ensure that agency resources are allocated for the system's design, implementation, and administration, including steps to address--but not fully implement--resource allocation actions we previously recommended that could benefit the long-term implementation of NSPS. As an example, NSPS law[Footnote 24] provides that, to the maximum extent practicable, for fiscal years 2004 through 2012 the aggregate amount of money allocated for civilian compensation for organizations under NSPS may not be less than the amount that would have been allocated under the General Schedule system.[Footnote 25] DOD has taken some actions to ensure that organizations under NSPS receive no less money for performance payments in the pay-banded NSPS than they would have for associated compensation and performance awards under the General Schedule system. For example, according to a PEO official, the department determined the percentage that components must use as their minimum, aggregated pay pool percentage for performance-based salary increases.[Footnote 26] Further, the department has taken steps to address actions we have previously recommended. In July 2007, we found that DOD's November 2005 cost estimate of $158 million to implement NSPS between fiscal years 2005 and 2008 did not include the full cost that DOD expected to incur as a result of implementing the new system.[Footnote 27] Further, we reported that the total amount of funds DOD had expended or obligated to design and implement NSPS during fiscal years 2005 through 2006 could not be determined because DOD had not established an oversight mechanism to ensure that these costs would be fully captured. 
As a result, we recommended that DOD define all costs needed to manage NSPS, prepare a revised estimate of those system implementation costs in accordance with federal financial accounting standards, and develop a comprehensive oversight framework to ensure that all funds expended or obligated to design and implement NSPS would be fully captured and reported. DOD generally concurred with our recommendations. To address our recommendations, PEO reconvened the DOD-wide NSPS Financial Integrated Product Team in 2007, which recommended, as we did, that the department expand the cost category definitions and clarify the treatment of direct and indirect costs. PEO advised the components of these new definitions and the resulting requirements in September 2007. PEO also provided a revised estimate for implementation costs in the proposed NSPS regulations, published in the Federal Register on May 22, 2008.[Footnote 28] Specifically, DOD estimated the overall costs associated with continuing to implement NSPS will be approximately $143 million from fiscal years 2009 through 2011. To address our recommendation on oversight of reported costs, PEO reports that each component took actions. Specifically, the Army established new account processing codes for NSPS that comply with NSPS reporting categories and identified a central NSPS budget point of contact. Further, the Assistant Secretary of the Army (Financial Management and Comptroller) is providing an independent review to determine whether the Army major commands are meeting established internal procedures for tracking, capturing, and reporting NSPS implementation costs in specific categories. The Navy required all major commands to provide screen shots and/or proof that quarterly implementation costs are recorded in the appropriate accounting system. In addition, the Fourth Estate established unique identifiers for such cost transactions in its organization's financial management and accounting systems, including the Defense Travel System. The Fourth Estate entities are required to verify cost data through trial balances and reconciliations with the Defense Finance Accounting Services' reports and monthly billing reports and each of the entities' comptrollers and/or resource management directorate reviews these latter reports. Lastly, at the time of our review, according to PEO, all Air Force NSPS activity was classified as "sustainment" and those costs were accounted for within the service's existing financial oversight framework, which includes its cost accounting systems. The Air Force has completed deploying NSPS and has no further implementation costs to report; however, future implementation costs may accrue, if additional employees are later converted to NSPS. Predecisional Internal Safeguards to Determine if Rating Results Are Consistent, Equitable, and Nondiscriminatory: DOD has taken some steps to ensure that predecisional internal safeguards are employed; however, the department is unable to determine, prior to the finalization of ratings, whether rating results under NSPS are consistent, equitable, and nondiscriminatory. Specifically, NSPS is designed with multiple layers of review before an employee's appraisal is finalized. For example, a supervisor writes the employee's performance assessment and recommends a rating which is then submitted for review to a higher level reviewer, who often serves as an interim level of review prior to the rating's reaching the pay pool panel for its review. 
Additionally, at 10 of the 12 installations we visited, sub-pay pools had been established, often based on organizational structure, to review some or all of the appraisals. The pay and sub-pay pool panels would either review all of the employee appraisals and self-assessments or review a sample of these documents. For example, a pay pool may review all appraisals assessed at the 1, 2, 4, and 5 ratings and randomly select a sample of the appraisals assessed at the 3 level. According to some pay pool panel members, panels reviewed employee appraisals separately, by job focus, such as engineers, to allow for a more consistent measure of employee performance. The panels also made efforts to determine the consistency among rating officials, which, according to the panel members, helps to eliminate bias, discrimination, or politicization. Although these efforts are laudable, DOD is unable to determine whether rating results under the system are consistent, equitable, or nondiscriminatory prior to the ratings' certification because it does not require any predecisional analysis of the ratings. In fact, only one quarter of the installations we visited analyzed the predecisional results of the rating distribution according to demographics, although doing so could expose possible trends, anomalies, or biases within the rating process. DOD does not require the components, or any levels within the department, to have a third party analyze the predecisional demographic results for trends or anomalies in the data. Furthermore, DOD does not provide the individual installations or commands with a means for assessing their rating distributions by demographics. Instead, DOD deliberately designed the computer application used by the pay pool panels to exclude demographic data. Therefore, any installation that performed demographic analysis, including those we visited, had to take additional steps to gather and correlate the necessary data to perform the analysis. DOD officials told us they did not require any predecisional analysis of rating data and did not include demographic data in the computer application because they did not want to introduce the potential for management to be influenced by bias or discrimination by adjusting the ratings to fit a certain predetermined or expected distribution by demographics. DOD officials were also concerned that, for the same reasons, employees might think that their ratings had been influenced by the demographic data. Furthermore, PEO officials stated that the analysis of pay pools' final rating results by demographics was sufficient to identify anomalies or trends associated with equity and nondiscrimination. However, the purpose of analyzing predecisional rating results is to identify any potentially egregious decisions or investigate any potential problems, such as blatant discrimination, in a transparent manner before finalizing the ratings. This predecisional analysis is not, however, intended to change the results to portray an "ideal" distribution or otherwise alter the outcome of the performance management process; rating results would change only if a mistake were identified. Identifying an anomaly in the data prior to finalizing the rating decisions would enable management to investigate the situation and determine whether the results accurately reflect the employees' performance or whether an outside factor is affecting the results.
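A predecisional review of this kind could be as simple as comparing each demographic group's draft rating distribution against the pay pool as a whole and flagging large gaps for investigation before the ratings are certified. The sketch below is a minimal illustration of that idea only; the record layout, group labels, and flagging threshold are hypothetical and are not drawn from DOD's pay pool application or from any prescribed NSPS procedure.

# Minimal sketch of a predecisional demographic review of draft ratings.
# The data structure, group labels, and threshold are hypothetical.
from collections import Counter, defaultdict

draft_ratings = [
    {"employee_id": 1, "rating": 3, "group": "A"},
    {"employee_id": 2, "rating": 4, "group": "A"},
    {"employee_id": 3, "rating": 3, "group": "B"},
    {"employee_id": 4, "rating": 2, "group": "B"},
    # ...remaining draft (not yet certified) ratings for the pay pool
]

def rating_distribution(records):
    """Return the share of records at each rating level, 1 through 5."""
    counts = Counter(r["rating"] for r in records)
    total = len(records)
    return {level: counts.get(level, 0) / total for level in range(1, 6)}

overall = rating_distribution(draft_ratings)

by_group = defaultdict(list)
for record in draft_ratings:
    by_group[record["group"]].append(record)

# Flag any group whose share at a rating level differs from the pay pool
# overall by more than a chosen threshold; a flag prompts investigation,
# not an automatic adjustment of anyone's rating.
THRESHOLD = 0.15
for group, records in by_group.items():
    distribution = rating_distribution(records)
    for level in range(1, 6):
        gap = distribution[level] - overall[level]
        if abs(gap) > THRESHOLD:
            print(f"Review group {group}: rating {level} share "
              f"{distribution[level]:.0%} vs. pay pool {overall[level]:.0%}")

Consistent with the report's point above, the output of such a review would be a set of anomalies to investigate, not a target distribution to impose on the ratings.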
Furthermore, our prior work has highlighted other agencies that have implemented predecisional analysis as part of performance management systems.[Footnote 29] Until DOD conducts a predecisional analysis of the rating results to identify possible trends or anomalies, employees may lack confidence in the fairness and credibility of the system.

Reasonable Transparency of the System and Its Operation:

Although DOD has taken steps to ensure a reasonable amount of transparency during the implementation of NSPS, DOD's performance management system does not provide employees with transparency over the final rating results. For example, DOD has taken actions to provide reasonable transparency by reporting periodically on internal assessments and employee survey results relating to performance management. Specifically, DOD has an "evaluation plan" that calls for it to conduct yearly employee focus groups following the close of the performance management cycle. The department distributed the results of its 2004 focus groups concerning the design of NSPS and its performance management system to the union coalition representing DOD employees, and, according to officials, it plans to brief employee representatives on the results of its 2008 focus groups and other findings from its evaluation of NSPS in Spiral One. DOD's assessment of NSPS also includes its collaboration with the Defense Manpower Data Center to sample and report on the NSPS workforce, and to include specific questions of interest for evaluating NSPS, in its now-yearly survey of DOD civilian employees. The survey results are available on the Defense Manpower Data Center's Web site and are provided to key management officials in a briefing. Further, DOD's NSPS office facilitates lessons-learned briefings with all four components at the conclusion of each cycle. Despite these efforts, DOD's performance management system does not provide adequate transparency over its rating results to employees because it does not require commands or pay pools to publish their respective rating and share distributions to employees. Although DOD suggests that distributing aggregate data to employees is an effective means for providing transparency, the department does not require commands or pay pools to publish the rating distributions.[Footnote 30] Moreover, NSPS program officials at all four components told us that publishing overall results is considered a best practice. However, three of the installations we visited did not publish the overall rating and share distribution at any level, for various reasons or, as officials at one installation told us, for no particular reason at all. Without transparency over rating and share distributions, employees may believe they are not being rated fairly, which ultimately can undermine their confidence in the system.

Meaningful Distinctions in Individual Employee Performance:

The NSPS performance management system is designed to allow for meaningful distinctions to be made in employee performance. However, NSPS is not being implemented in a way that encourages use of all available rating categories, thus limiting the system's ability to ensure that meaningful distinctions in employee performance, and therefore pay, are made. The performance management system for NSPS consists of five rating categories, of which the lowest rating is a "1" (unacceptable performance) and the highest rating is a "5" (role model performance). Further, the number of shares employees receive is commonly based on the employees' "raw performance scores," and the shares ultimately determine employees' overall payout.[Footnote 31] For example, at the installations we visited, the level "3" rating (valued performer) typically was awarded one or two shares, depending on the employee's raw performance scores. The overall number of shares awarded within a pay pool determines the value of each share; because the share value adjusts to the total number of shares awarded, the pay pool budget does not dictate the ratings themselves. Regardless of the value of the share, an employee who receives a "3" rating with two shares would receive twice the payout percentage of an employee who received a "3" rating with one share within the same pay pool.
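To make the share-and-payout arithmetic described above concrete, the following is a minimal sketch using hypothetical numbers. It treats the pay pool as a single dollar amount for simplicity; under NSPS the payout is generally applied relative to salary and split between salary increases and bonuses, so the figures here are illustrative only.

# Minimal sketch of share-and-payout arithmetic for one pay pool.
# The budget and share assignments are hypothetical.
pay_pool_budget = 100_000.00     # hypothetical performance payout budget
shares_awarded = {               # hypothetical shares tied to ratings of record
    "employee_a": 2,             # rated "3," awarded two shares
    "employee_b": 1,             # rated "3," awarded one share
    "employee_c": 3,             # rated "4," awarded three shares
}

total_shares = sum(shares_awarded.values())    # 6 shares in this example
share_value = pay_pool_budget / total_shares   # value of one share

for name, shares in shares_awarded.items():
    payout = shares * share_value
    print(f"{name}: {shares} share(s) -> {payout:,.2f}")

# Because the share value is the budget divided by the total shares awarded,
# awarding more shares overall lowers the value of each share rather than
# changing anyone's rating, and employee_a's payout is twice employee_b's,
# as the report notes for two "3"-rated employees with two shares versus one.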
Although DOD has established mechanisms within NSPS to allow for meaningful distinctions to be made, the guidance provided by the leadership at the PEO and component levels may discourage rating officials from using all available rating categories. Specifically, it was verbally expressed during training at multiple levels that the majority of employees should expect to be rated at the "3" level (or valued performer), according to PEO and component officials with whom we spoke. Furthermore, at 10 of the 12 installations we visited, rating officials, panel members, program management, and/or employees told us they were instructed by management--through training or verbal guidance--to expect that most employees would be evaluated as valued performers. The four components' representatives noted that they received this guidance from PEO, along with the NSPS performance indicators and benchmarks, and disseminated it downward via verbal guidance, often through training. Moreover, PEO officials confirmed that NSPS program management across the components was to communicate downward, through training, that the majority of employees were likely to be rated at the "3" level. In addition, one pay pool panel we visited specified in its business rules that most employees should expect to receive a "3" rating. As a result of this communication, there was a hesitancy across the sites we visited to award employee ratings in other categories. Some pay pool panel members and rating officials with whom we spoke noted that they were reluctant to award too many 4s and 5s. In addition, several rating officials told us that there is a hesitancy to assign lower ratings--specifically a "2" or "1"--due to the additional paperwork and justification required of the supervisor and the potential for employee backlash. Moreover, during our group discussions with civilian employees, a prevalent theme was that it was impossible to receive a rating higher or lower than a "3." As a result of the explicit guidance that most employees should be rated as a "3" and the reaction of the pay pool panels, supervisors, and employees we met with, it is questionable whether meaningful distinctions are being made in NSPS employees' performance ratings. The verbal guidance that was incorporated in training and town hall meetings with employees--i.e., that most employees should expect to be rated at the "3" level--was intended to prepare employees not to have high expectations of what their ratings would be under NSPS, according to PEO officials.
Further, officials within PEO and the components, as well as pay pool panel members and supervisors, told us that the prior rating system was inflated and many employees were accustomed to receiving the highest available rating. In other cases, employees were transitioning from a system that rated the employee as either passing or failing. As a result, PEO officials were concerned that the more stringent performance indicators under NSPS needed to be fully communicated to employees. Furthermore, PEO, and most of the installations we visited, emphasized that there was not a forced distribution of ratings. Specifically, PEO guidance prohibits forced rating distributions or quotas, and we have previously reported that making meaningful distinctions in employee performance, such as agencies not imposing a forced distribution of performance ratings--i.e., fixed numeric or percentage limitations on any rating level--is a key practice in effectively implementing performance management systems.[Footnote 32] Further, according to NSPS implementing issuances, rating results should be based on how well employees complete their job objectives using the performance indicators. We collected and analyzed the rating results from the pay pools we visited, as well as DOD-wide (see table 2). The pay pool rating distributions we reviewed from our 12 site visits revealed that, at 9 of the pay pools, 60 percent or more of employees were rated at the "3" level. However, we were unable to determine whether these final distributions were meaningful because we do not have specific knowledge of employees' performance within these pay pools. For example, within one pay pool it is feasible that the vast majority of employees are performing at the "3" level based on the performance indicators and employees' performance. It is also possible that at a different pay pool a vast majority are performing at the "4" level.

Table 2: Percentage of Employees in Each Rating Category by DOD and Pay Pools Visited:

                                                   Rating category
DOD and pay pools visited                          1      2      3      4      5
DOD                                              0.2%   1.6%  57.0%  36.1%   5.1%
Redstone Arsenal                                 0.3    1.0   66.0   31.0    2.0
Fort Huachuca                                    0      2.0   63.0   31.0    4.0
Fort Sam Houston                                 0      1.0   60.0   38.0    1.0
Tinker Air Force Base                            0.1    4.0   68.0   26.0    2.0
Randolph Air Force Base[A]                       0      1.0   34.0   43.0   21.0
March Air Reserve Base                           0      1.0   69.0   25.0    4.0
Joint Warfare Analysis Center                    0.4    1.0   73.0   24.0    2.0
Naval Facilities Headquarters                    0      1.0   50.0   40.0    9.0
Marine Corps Tactical Systems Support Activity   0      2.0   84.0   14.0    0
DOD, Office of Inspector General                 0      2.0   53.0   38.0    6.0
Defense Microelectronics Activity                0      3.0   60.0   27.0   10.0
Defense Threat Reduction Agency                  0.2    2.0   65.0   31.0    2.0

Source: GAO analysis of DOD data.

Note: The percentages presented in this table may not total to 100 due to rounding.

[A] Officials we interviewed from one of the pay pools at Randolph Air Force Base told us that they were aware that their pay pool rating distribution had more employees rated in categories 4 and 5 than many other pay pools and the DOD norm; however, they said they believed their ratings accurately reflected their employees' performance. Furthermore, these officials told us that most of the employees who were under NSPS in their pay pool were supervisors and "high performing" employees.

[End of table]

Unless NSPS is implemented in a manner that encourages meaningful distinctions in employee ratings in accordance with employees' performance, employees will continue to believe they are not rated fairly and that there is an unspoken forced distribution of ratings, and their confidence in the system will continue to be undermined.

Although DOD Civilian Employees under NSPS Identified Some Positive Aspects of the System, DOD Does Not Have a Plan for Addressing the Generally Negative Employee Perceptions of NSPS:

While DOD employees under NSPS are positive regarding some aspects of NSPS's performance management system, they generally expressed negative perceptions of the system in both DOD's survey and the focus group sessions we held, and DOD does not have a plan to address these negative employee perceptions. Specifically, while DOD's SOFS of civilian employees indicates that attitudes on certain aspects of performance management are more positive among employees who have transitioned to NSPS compared to all other DOD employees, the most recent survey results indicate that attitudes of employees who have been under NSPS the longest have become slightly more negative toward other aspects of performance management. Moreover, civilian employees, including supervisors, expressed concerns or negative attitudes about NSPS during the focus group discussions we held at 12 selected installations. These attitudes are not surprising, given that organizational transformations often require an adjustment period to gain employee acceptance and trust. However, DOD has not developed a specific action plan for addressing the critical issues raised by employees in both DOD's survey results and the PEO's evaluation of NSPS through focus groups.

NSPS Employee Attitudes on Certain Aspects of Performance Management Are More Positive than All DOD Employees, but Have Slightly Declined among Those Employees Who Have Been under NSPS the Longest:

DOD's survey of civilian employees indicates that attitudes on certain aspects of performance management, such as pay raises depending on performance, are more positive among employees who have transitioned to NSPS compared to all other DOD employees.
However, the most recent survey results indicate that attitudes of employees who have been under NSPS the longest have become slightly more negative toward certain other aspects of performance management, such as the overall impact of NSPS on personnel practices at DOD. During our analysis of DOD's survey results for May 2006, November 2006, and May 2007, we noted that employee responses to the questions we identified as related to NSPS and performance management were fairly evenly distributed across the "disagree," "agree," and "neither" responses. As a result, we do not know what the overall trend is or whether this movement in the negative direction will continue in future years. We will be able to identify trends in employee attitudes after employees have had more time under NSPS and additional surveys are administered.

Survey Results Indicate that Employees under NSPS Are More Positive than Other DOD Employees about Some Aspects of Performance Management:

Our review of the results of the 2007 SOFS for DOD civilian employees found that employees under NSPS are slightly more positive than all other DOD employees about some aspects related to the goals of performance management.[Footnote 33] Specifically, the Office of Personnel Management reported that the goals of performance management under NSPS are to link employee performance, pay, and mission accomplishment as well as to make meaningful distinctions in employee performance.[Footnote 34] For example, an estimated 56 percent of NSPS employees indicated that they believed that bonus and cash awards are based on performance, compared to 52 percent of all DOD employees.[Footnote 35] In addition, an estimated 40 percent of NSPS supervisors responded that they agreed they could influence their employees' pay to reflect performance, as compared to 27 percent of all DOD supervisors. See table 3 for additional examples. Furthermore, we identified some instances in which spiral 1.1 employees, who were the first to transition into NSPS, showed even more positive attitudes toward performance management. For example, an estimated 25 percent of all DOD employees agreed that pay raises depend on how well employees perform their jobs, compared to 40 percent of all NSPS employees and 43 percent of spiral 1.1 employees. See appendix III for additional survey questions and responses related to performance management and NSPS.

Table 3: Estimated Percentage of Responses from Status of Forces Survey for DOD Civilian Employees, May 2007:

                                                                  Agree   Neither   Disagree
Performance management:
Differences in performance are recognized in a meaningful way
  DOD                                                              34%     32%       30%
  NSPS                                                             38      31        27
Bonus and cash awards are based on performance
  DOD                                                              52      20        27
  NSPS                                                             61      19        20
In my work unit, steps are taken to deal with a poor performer
who cannot or will not improve
  DOD                                                              29      30        37
  NSPS                                                             34      31        30
Pay raises depend on how well employees perform their jobs
  DOD                                                              25      28        43
  NSPS                                                             40      28        28
Personnel actions:
I can influence my employees' pay to reflect performance
  DOD                                                              27      31        43
  NSPS                                                             40      28        32
Leadership and management:
Managers communicate their goals and priorities
  DOD                                                              58      22        20
  NSPS                                                             63      19        17

Source: GAO analysis of DOD data.

Note: The estimated percentages are based on a 95 percent confidence interval, and the margin of error is within +/-2 percent. The response categories are collapsed for positive ("agree") and negative ("disagree") responses. That is, "agree" is the estimated percentage of employees who responded either "agree" or "strongly agree," while "disagree" is the estimated percentage of employees who responded either "disagree" or "strongly disagree."

[End of table]

Survey Results in Some Instances Show a Slight Decline in Employee Attitudes among Those Employees Who Have Been under NSPS the Longest:

In some instances, DOD's survey results showed a decline in employee attitudes among employees who have been under NSPS the longest. Responses of spiral 1.1 employees, who were among the first employees converted to NSPS, became steadily more negative about NSPS from the May 2006 to the May 2007 DOD survey. At the time of the May 2006 administration of the SOFS for civilians, employees designated as spiral 1.1 had received training on the system and had begun the conversion process but had not yet gone through a rating cycle and payout under the new system. As part of this training, employees were exposed to the intent of the new system and the goals of performance management and NSPS, which include annual rewards for high performance and increased feedback on employee performance. However, as DOD and the components proceeded with implementation of the system, survey results showed a decrease in employees' optimism about the system's ability to fulfill its intent and reward employees for performance. The changes in attitude reflected in DOD's employee survey are slight but indicate a movement in employee perceptions. Most of the movement in responses was negative. Specifically, in response to a question about the impact NSPS will have on personnel practices at DOD, the percentage of positive responses decreased from an estimated 40 percent of spiral 1.1 employees in May 2006 to an estimated 23 percent in May 2007.[Footnote 36] Further, when asked how NSPS compared to previous personnel systems, an estimated 44 percent said it was worse in November 2006, compared to an estimated 50 percent in May 2007.[Footnote 37] Similarly, employee responses to questions about performance management in general were also more negative from May 2006 to May 2007, as shown in table 4.
Specifically, the results of the May 2006 survey estimated that about 67 percent of spiral 1.1 employees agreed that the performance appraisal is a fair reflection of performance, compared to 52 percent in May 2007. Further, the percentage of spiral 1.1 employees who agreed that the NSPS performance appraisal system improves organizational performance decreased from an estimated 35 percent to 23 percent. For additional questions and results related to NSPS and performance management, see appendix III.

Table 4: Estimated Percentage of Spiral 1.1 Employees' Responses for Select Questions from the May 2007, November 2006, and May 2006 Administrations of the Status of Forces Survey for DOD Civilian Employees:

                                                                  Agree   Neither   Disagree
Performance management:
Performance appraisal is fair reflection of performance
  May 2006                                                         67%     20%       12%
  November 2006                                                    59      22        16
  May 2007                                                         52      21        25
Performance standards/expectations take into account important
parts of job
  May 2006                                                         68      20        12
  November 2006                                                    65      20        16
  May 2007                                                         59      20        20
Performance appraisal system improves organizational performance
  May 2006                                                         35      39        26
  November 2006                                                    30      37        34
  May 2007                                                         23      31        47
Current performance appraisal system motivates me to perform well
  May 2006                                                         43      33        25
  November 2006                                                    42      28        30
  May 2007                                                         38      26        36
NSPS:
NSPS has improved personnel process for communication between
supervisors and employees
  May 2006                                                         N/A     N/A       N/A
  November 2006                                                    38      36        38
  May 2007                                                         34      31        34
NSPS has improved personnel process for linking pay to performance
  May 2006                                                         N/A     N/A       N/A
  November 2006                                                    29      38        33
  May 2007                                                         28      26        46
NSPS has improved personnel processes for individual performance
supporting organizational mission
  May 2006                                                         N/A     N/A       N/A
  November 2006                                                    35      39        35
  May 2007                                                         33      36        33

Source: GAO analysis of DOD data.

Note: The estimated percentages are based on a 95 percent confidence interval, with a margin of error within +/-2 percent for the May 2007 results and within +/-3 percent for the May 2006 and November 2006 results. The response categories are collapsed for positive ("agree") and negative ("disagree") responses. That is, "agree" is the estimated percentage of employees who responded either "agree" or "strongly agree," while "disagree" is the estimated percentage of employees who responded either "disagree" or "strongly disagree." In addition, some responses for the May 2006 survey are "N/A" because those questions were not asked on that survey.

[End of table]

DOD Employees in Our Discussion Groups Expressed Wide-Ranging but Consistent Concerns about NSPS:

In the discussion groups we held, DOD employees and supervisors expressed wide-ranging but consistent concerns about NSPS. While the results of our discussion groups are not generalizable to the entire population of DOD civilians, the themes that emerged from our discussions provide valuable insight into civilian employees' perceptions about the implementation of NSPS and augment DOD's survey findings. During these discussion groups, we found that some civilian employees were optimistic about the intent of the system and its potential benefits, for example, rewarding high performers and improving communication between supervisors and employees. Further, some employees we met with told us that they were satisfied with NSPS and had no complaints about the system. However, during all of our discussion groups, civilian employees, including supervisors, expressed concerns or negative attitudes about NSPS. Prevalent themes or employee perceptions coming out of those discussion groups were that NSPS (1) had a negative impact on motivation and morale, (2) required employees and supervisors to spend excessive amounts of time navigating the performance management process, (3) was biased due to the potential influence that employees' and supervisors' writing skills have on panels' assessments of employee ratings, (4) lacked transparency over the pay pool panel process, and (5) was implemented at a rapid pace. Employees also commented on other aspects of NSPS. (See app. IV for a discussion of less prevalent themes that emerged from our discussion groups.) Given that NSPS has just entered its third year of implementation, these negative attitudes are not surprising.
As stated before, our previous work and reports published by the Office of Personnel Management have shown that organizational transformations, such as the adoption of a new performance management system, often entail fundamental and radical changes to an organization that require an adjustment period to gain employee acceptance and trust.[Footnote 38] As a result, major change management initiatives in large-scale organizations can often take several years to be fully successful.

Negative Impact on Motivation and Morale:

A prevalent theme from our discussions with both employees and supervisors was that several aspects of NSPS have had a negative impact on employee motivation and morale. Specifically, employees and supervisors at 9 of the 12 sites we visited expressed concern that management had established an unpublished quota for rating distributions and that a majority of employees were arbitrarily placed in the "3" or "valued performer" category. As a result, some employees said they are not motivated to perform above the "3" level, because they think they will receive this rating regardless of their individual performance. In addition, employees suspected that both their ratings and their pay pool's overall rating distribution were predetermined based on the pay pool's available funding. While some employees said that giving too many "4" and "5" ratings would diminish share value, other employees expressed concern that management's attempt to group everyone in the "3" category did not result in the recognition of different levels of performance. Another prevalent theme at 10 of the sites we visited was that a rating of "3" was perceived as being average, and not "valued," despite what employees were told during training and in other information they received about the system. As a result, employees at 10 sites we visited stated that distinctions in different levels of performance were not being made, while others stated that, by not distinguishing between differing levels of employee performance, NSPS has the potential to discourage employees from going above and beyond in their performance over time. Furthermore, employees at 8 of the 12 sites we visited questioned the merit of a pay-for-performance system for civil service employees because, for many, money is not a motivator. Employees at 2 locations told us that they valued their reputation and recognition from their supervisor when they did a good job more than the monetary reward associated with their rating. As a result, some employees did not agree with having their rating linked to the monetary award. Employees at a couple of sites stated that they would prefer a higher rating regardless of the payout. Employees at several locations told us that they did not trust that the system was in the employees' best interest, but rather believed it was an attempt by the government to "save money at the expense of the employees." Moreover, some employees believed that they were not doing as well financially as their GS counterparts. During discussion groups we heard that employees were comparing pay increases they would have received under the GS system to those they received under NSPS. Discussion group participants also told us that DOD's transition to NSPS has been further complicated by the fact that civilian employees were previously under a performance management system where ratings were inflated for an extended period of time.
For example, a supervisor at one location we visited stated that employees at that location were used to the previous performance management system, under which a majority of the workforce received an inflated rating.[Footnote 39] PEO heard similar concerns during a series of focus groups that it conducted in 2004. Specifically, participating employees reported that rating inflation existed in the prior system, which resulted in the system's inability to distinguish between high and low performers.

Excessive Amount of Time Spent Navigating the Performance Management Process:

A prevalent theme at all 12 locations we visited was that it was excessively time-consuming and/or labor intensive to navigate through all the steps of the performance management process of NSPS. While some participants recognized that a learning curve is to be expected with any new system, other participants told us that NSPS requires a much greater time commitment than previous performance management systems. Employees, including supervisors, also told us that, in some cases, they found that the tasks and responsibilities associated with NSPS hindered their ability to focus on and complete their assigned job duties. Specifically, employees in both supervisory and nonsupervisory positions told us that the back-and-forth exchange of draft job objectives and self-assessments with their supervisor was particularly time consuming. Participants in some of our discussion groups also told us that they had to devote excessive amounts of time to tracking and writing up their tasks over the course of the performance cycle. For example, we heard at multiple sites that employees kept a daily log, or record, of their tasks so that they would be able to write, in detail, what they did on their self-assessments. Moreover, both supervisors and employees expressed frustration that the entire process was too labor-intensive, with some saying that management had to delay their day-to-day work for extensive periods of time to complete ratings and participate in the sub-pay pool and pay pool panel processes. Further, supervisors in our discussion groups told us that their administrative tasks under NSPS, specifically drafting ratings and maneuvering back and forth with the employee through the steps in the computer application, required so much of their time that they could barely fulfill their other job responsibilities. During one discussion group, supervisors told us that it took a minimum of 4 uninterrupted hours per employee to complete a rating. Employees expressed a similar sentiment with regard to their supervisors. For example, during a discussion group with employees, we heard that one supervisor had to shut his door for an extended period of time in order to complete employee assessments. In addition, some employees told us that they did not see the added benefit of the system, given the amount of time and effort they had to invest in performing tasks such as drafting job objectives and self-assessments while navigating the performance appraisal process. Employees further noted that, despite the significant amount of time they invested to complete the process, they received a "3" or felt their payout was insufficient to justify the time investment. A couple of supervisors told us that their subordinates have asked why they should put in the effort if they are going to get a "3."
Potential Influence that Employees' and Supervisors' Writing Skills Have on Panels' Assessments of Employee Ratings:

Supervisors and employees at 11 of the 12 locations we visited voiced concern that their writing skills, as evidenced in their job objectives, self-assessments, and ratings, influenced the panels' evaluation of the ratings they received under NSPS and potentially overshadowed the accomplishments they achieved during the rating period. Specifically, during two discussion groups, employees in more technical positions felt that they were at a disadvantage when it came to writing objectives and self-assessments because their strengths lie in other areas or their jobs do not require them to regularly produce written products. Further, supervisors told us that they were concerned that their own writing skills were detrimental to employee ratings. For example, one supervisor told us that he bought a number of books on writing and performance appraisals to assist him with the process so that his employees would not be disadvantaged. Some employees told us that they devoted a significant amount of time to their self-assessments, often using personal time to compose them. However, other employees told us that the quality of their assessment and their rating may have suffered because they were focused on their job responsibilities and did not invest a lot of time and effort in their assessment. For example, during one discussion group, employees said that since the end of the rating cycle coincides with the end of the fiscal year, they must choose between meeting fiscal year deadlines and completing their NSPS assessment tasks. Furthermore, some employees and supervisors were unclear as to (1) what information they should include in their self-assessments and/or employee ratings and (2) what format they should use, because they had not received any examples, feedback for improvement, or comments on the strengths of previous assessments. Participants also noted that it is difficult to explain to the pay pool panel exactly what each employee's job entails, regardless of the amount of explanation they are allowed.

Employees Lack Transparency and Understanding of the Pay Pool Panel Process:

Employees and supervisors at 9 of the 12 sites we visited expressed concern that they lacked transparency over, and an understanding of, the pay pool panel process and the overall rating process. Some employees said that they did not trust the system because they think there is a lot of secrecy in the pay pool panel process. For example, some employees we spoke with at one location indicated that they had limited understanding of the process from the moment their rating left their supervisors' hands and went up to the "pay pool in the sky." Employees at almost all locations told us that they did not feel as though the pay pool panel members knew them or the work they did. Specifically, at one location employees said that pay pool panel members did not know them well enough to make a fair determination of their final rating. Furthermore, employees at 8 locations expressed concern that the visibility of their position or their assignment to the pay pool panel influenced the rating they received. A prevalent theme at a majority of the sites where we held discussion groups was that employees were concerned that the pay pool panel did not have direct knowledge of them or their accomplishments. However, at a couple of the sites we visited, employees said that this was a benefit of the system.
They stated that the additional level of review by the pay pool panel, and in some cases a sub-pay pool panel, removed some of the subjectivity from the process and allowed them to make management more aware of their accomplishments. Supervisors we spoke with also expressed concerns about their understanding of the pay pool panel's decisions on employee ratings and the communication they received from the panel. Some supervisors we spoke with were concerned about giving feedback, specifically praise, to their subordinates throughout the year or prior to releasing the final ratings because they were unsure if the pay pool panel would sustain the rating they assigned. Moreover, supervisors at some of our discussion groups expressed frustration regarding the pay pool panel's lack of communication about their subordinates' final ratings and the rationale for those ratings. In instances where changes were made to a rating, supervisors at half of the installations we visited told us that they were unsure how to give employees feedback on their final rating because they felt their employees had earned a different rating and the panel did not provide evidence to explain why it changed the rating.

Rapid Pace of System Implementation:

Another theme that emerged from our discussions with both supervisors and employees was that NSPS was implemented before all of the glitches with the system were identified and resolved. Employees described instances where they received incomplete information during training, as well as instances where the trainer could not provide answers to their questions. For example, one employee told us that he and others did not receive answers to the questions they submitted to the online question box. The employee told us that management at that location stated that they were unable to answer some questions they received. Employees at another location described feeling as though they were turned loose to figure things out for themselves, because the trainers could not answer employee questions. One employee said that it felt as though the system "hit the street running." Another theme that emerged from our discussion groups with both employees and supervisors was the haste with which the online tool--the PAA--was rolled out, as well as the difficulties they continued to experience in using this tool, despite several iterations of the program intended to correct the problems. Employees and supervisors at several locations described the system as being fraught with problems. For example, they said the tool was nonintuitive and not user-friendly, or, as one employee called it, "user hostile." Specifically, during a couple of our discussion groups we heard that users were entering information into the system without knowing whether they had missed key steps or whether their information would be lost before saving. In addition, during one of our discussion groups, employees who were trained to train other employees on NSPS told us that they found it particularly difficult to train employees on the new online tool because of the new versions and updates that were released to correct problems with the system. The trainers told us that in some cases they were learning how to use the new versions at the same time as the employees they were supposed to be training. As a result, employees told us that they did not know whom to turn to for answers to their questions about the performance management system and online tool.
DOD Has Not Developed a Plan for Addressing Issues Raised by Employees:

DOD has not developed a specific action plan to address critical issues raised by employees in forums such as DOD's survey of employees and other avenues, such as the PEO's evaluation of NSPS through focus groups, according to PEO. As required by the National Defense Authorization Act for Fiscal Year 2004,[Footnote 40] OPM issued regulations requiring each agency to conduct an annual survey of its employees to assess leadership and management practices that contribute to agency performance and employee satisfaction with aspects of their organization. According to OPM, survey information allows organizations to focus their efforts and to improve various programs and processes. Further, OPM developed supplementary guidance recommending that agencies use survey results to provide feedback to employees and develop and implement an action plan. Specifically, it suggests that, after the survey results have been reviewed, agencies have a responsibility to provide feedback to their employees on the results, as well as to let employees know the intended actions to address the results and the progress on these actions. The guidance further suggests, and we believe it is a best practice, that agencies develop and implement an action plan to guide their efforts to address the results of the employee surveys. Through our own analysis of DOD's survey and the discussion groups we held with employees and supervisors, we determined that employees under NSPS have generally negative perceptions regarding some aspects of NSPS. Further, PEO's analysis of its most recent focus groups also showed that employees had concerns about NSPS.[Footnote 41] For example, PEO found that employees were concerned about the potential loss of their cost-of-living increase, the existence of adequate funding for pay increases and bonuses, and the lack of direct supervisory contact by their rater, among other concerns.[Footnote 42] PEO issued its evaluation plan in 2007, the purpose of which is to describe the approach, types of data, and general time frames that PEO will use to evaluate and report on NSPS, including identifying aspects for modification and improvement. In addition, the evaluation plan specifies data sources, including employee attitude surveys, focus groups, and lessons learned. This evaluation plan is a first step toward successful implementation of NSPS. According to an official within PEO, the office has gathered information about employee perceptions since the onset of the system's implementation and has used the information to make some adjustments to the system. However, the office has not developed a formal plan to address all employee issues. Further, an official within PEO stated that the office is hesitant to develop an action plan this early in the implementation process because NSPS's performance management process is relatively new and employees have not had much time to become acclimated to the new processes and procedures. Further, theories of organizational transformation state that it takes years for large-scale organizational changes to be successfully integrated into the organization.
Similarly, OPM studies on federal government demonstration projects for performance management show that employees' attitudes were initially negative toward demonstration performance management systems; however, over time, these same employees developed more positive attitudes toward the systems.[Footnote 43] Given this, it is reasonable for DOD to allow employees some time to accept the changes that NSPS brought about; however, it is also prudent for PEO to consider possible actions it could take to address persistent negative employee perceptions, particularly those perceptions that are not directly related to accepting a new system. For example, one prevalent theme from our discussion groups was the potential for employees' writing skills to influence the panels' assessments of their performance. Without a plan to address employees' negative perceptions of NSPS, DOD could miss opportunities to make changes that could lead to greater employee acceptance and, ultimately, successful implementation of NSPS's performance management system.

Conclusions:

DOD's implementation of a more performance- and results-based personnel system has positioned the agency at the forefront of a significant transition facing the federal government. NSPS is intended to move DOD from, in some cases, a pass or fail assessment of employees' performance to a detailed assessment of employee performance that is linked to pay increases. We recognize that DOD faces many challenges in implementing NSPS, as any organization would in implementing a large-scale organizational change. However, the department has not fully addressed some key internal safeguards that could help it ensure the fairness and credibility of NSPS. Specifically, DOD cannot identify anomalies in predecisional rating results that might raise concerns about the equity of the system. Until DOD requires a third party to analyze the predecisional results of the ratings, it cannot be certain that the NSPS performance management system is achieving consistency, equity, and nondiscrimination in the determination and assignment of employee ratings before those ratings are finalized. In addition, failure to provide all employees with key performance feedback on how their final rating and share value compare to those of other employees could lead to employee distrust of the process and overall system. Finally, DOD's NSPS guidance has discouraged rating officials from making meaningful distinctions in employee performance. Unless DOD encourages pay pools to make meaningful distinctions in employee performance to the fullest extent possible, as warranted by employees' performance as compared to the standards, employees will continue to feel devalued, which may result in further deterioration of morale and motivation. Furthermore, prevalent themes from our discussion groups, such as the perception of the pay pool process as secretive and the belief that employees will be rated a "3" no matter how well or poorly they perform, suggest that employees lack confidence in NSPS. Taken together, the absence of these safeguards and the negative, and declining, employee perceptions of NSPS are cause for concern about the success of the performance management system. NSPS is a new program, and organizational change requires time for employees to accept the system.
That said, DOD civilian employees will continue to question the fairness of their ratings and will lack confidence in the system until DOD develops an action plan and takes specific steps to mitigate negative employee perceptions of NSPS.

Recommendations for Executive Action:

To better address the internal safeguards and improve employee trust in the NSPS performance management system, we recommend that the Secretary of Defense direct the National Security Personnel System Senior Executive to take the following four actions:

* Require a third party to perform predecisional demographic and other analysis as appropriate for pay pools.

* Require commands to publish the final overall rating results.

* Provide guidance to pay pools and supervisors that encourages them to rate employees appropriately, including using all categories of ratings as warranted by comparing employees' individual performance against the standards.

* Develop and implement a specific action plan to address employee perceptions of NSPS ascertained from feedback avenues such as, but not limited to, DOD's survey and DOD's and GAO's employee focus groups. For example, the plan should include actions to mitigate employee concerns about the potential influence that employees' and supervisors' writing skills have on the panels' assessment of employee ratings and the lack of transparency and understanding of the pay pool panel process.

Agency Comments and Our Evaluation:

In written comments on a draft of this report, DOD concurred or partially concurred with three of our four recommendations to better address the internal safeguards and improve employee trust in the NSPS performance management system. DOD did not concur with our recommendation to require a third party to perform predecisional demographic and other analysis as appropriate for pay pools. DOD also provided technical comments, which we have incorporated in the report as appropriate. DOD's official comments are reprinted in appendix V. DOD concurred with our recommendation to require commands to publish the final overall rating results. DOD noted that a vast majority of organizations under NSPS are publishing the overall final rating results, and stated that it will take steps to require all organizations under NSPS to share overall rating results with their employees. DOD partially concurred with our recommendation to provide guidance to pay pools and supervisors that encourages them to rate employees appropriately, including using all categories of ratings as warranted by comparing employees' individual performance against the standards. DOD noted that ratings under NSPS rest firmly on the foundation of the written assessments and that the transition to NSPS requires leaders to demonstrate a firm commitment to rigorous, fact-based rating, as well as training and other efforts to "recalibrate" DOD's workforce expectations from previous performance management systems in which nearly all employees received the highest available rating. We agree that NSPS was designed to assess employee performance using written assessments compared to performance indicators. Further, we acknowledged in the report that PEO and the components' training and guidance on ratings were part of the transition process aimed at the majority of the employees who were, in the past, rated as "pass" or at the highest available rating.
DOD, however, noted that it did not agree with our conclusion that it is questionable whether meaningful distinctions are being made in NSPS employees' performance ratings, stating that our report relied heavily on workforce opinions gleaned from focus group discussions. Our conclusion, on the contrary, was based on our analysis of discussions with management (including performance review authorities, pay pool managers, pay pool panel members, rating officials, and NSPS program managers or transition managers) at the 12 sites we visited, as well as interviews with officials at the PEO and component headquarters. Our analysis of these officials' interpretation of the guidance among pay pool panels and rating officials consistently indicated that there was hesitancy to rate employees above or below a "3." DOD further commented that it does "not accept the assumption" underlying our conclusion that pay pools and rating officials were not rating employees appropriately. We never assumed that pay pools and rating officials did not rate employees appropriately. Instead, as we stated in the draft, we were unable to determine whether the final distributions were meaningful because we do not have specific knowledge of employees' performance. DOD also noted that an employee has recourse, through the reconsideration process, if the employee believes a rating was "unfair" or did not result from meaningful distinctions. While we believe a reconsideration process is an important part of a performance management process, we do not think that the number of employees who filed reconsiderations or the outcomes of those reconsiderations are, by themselves, sufficient to determine whether employees believe their ratings are unfair or whether meaningful distinctions were made. In fact, during our discussion groups at four locations, we heard that employees did not always choose to use the reconsideration process because they feared retribution from management and their supervisors. Lastly, DOD noted that suggesting that all rating levels be used, despite the caveat that they be "warranted," could be interpreted as mandating rating distributions based on other factors. Our recommendation, however, states that PEO should provide guidance to pay pools and supervisors that encourages them to rate employees appropriately, including using all categories as warranted by comparing employees' performance against the standards. The essence of our recommendation reinforces that performance evaluations must be based on the employee's actual performance measured against the standard criteria, rather than on a preconceived notion of a normal rating distribution, as DOD noted. By providing such reinforcement, we believe DOD will better implement meaningful distinctions in employees' performance and improve employee trust in the system. DOD partially concurred with our recommendation to develop and implement a specific action plan to address employee perceptions. In its written response, DOD stated that the department will address areas of weakness identified in its evaluation of NSPS. It further commented that it is premature to draw actionable conclusions from its recent survey, and it is, therefore, institutionalizing a continuous improvement strategy to give employees time to adjust to and accept the new performance management system.
While we recognize that employees often require an adjustment period following any large-scale organizational transformation, and we acknowledged the department's efforts to correct issues with the system, such as with the automated performance appraisal tools, we believe that DOD's survey data, though preliminary, provide valuable insight into employee perceptions about NSPS. We, as well as the Office of Personnel Management, note that it is a best practice for agencies to use employee survey data by developing and implementing an action plan to guide their efforts to address the results of such surveys. Accordingly, we continue to believe that the development of a plan to address employees' negative perceptions of NSPS could lead to greater employee acceptance and, ultimately, could better enable successful implementation of the NSPS performance management system. DOD did not concur with our recommendation to require a third party to perform an independent, predecisional demographic and other analysis as appropriate for pay pools. In its written response, DOD stated that predecisional demographic and other analysis was not a "prescribed" safeguard. We agree that neither the National Defense Authorization Act for Fiscal Year 2008 nor the original statutory authority for NSPS prescribed predecisional analysis. However, the National Defense Authorization Act for Fiscal Year 2008 did direct GAO to (1) review the extent to which DOD had effectively implemented prescribed "accountability mechanisms" to include "adherence to merit principles" and "effective safeguards to ensure that the management of the system is fair and equitable and based on employee performance" and (2) assess other "internal safeguards." [Footnote 44] As part of our mandate to review the adherence of NSPS to merit system principles and internal safeguards, we examined whether DOD was performing predecisional analysis because it was identified in our prior work as a practice used by leading public sector organizations. Further, we have emphasized the need for predecisional analysis as part of performance management systems' internal safeguards since DOD first proposed NSPS.[Footnote 45] However, we revised the report to clarify that predecisional analysis was not specified in the act. We continue to believe that our recommendation has merit and that independent, third-party predecisional analyses of rating results are a key internal safeguard for performance management systems in the federal government that can help agencies ensure that their systems adhere to merit system principles and are fair, equitable, and based on employee performance. In its comments regarding the predecisional issue, DOD noted that its pay pool panel process provides checks and balances for fair and equitable ratings. We commended DOD's efforts in our report, noting that the various levels of reviews incorporated into the department's process were steps toward ensuring that predecisional internal safeguards are employed; however, we believe that such reviews are not sufficient to safeguard fair and equitable rating results because DOD is unable to determine whether rating results under the system are consistent, equitable, or nondiscriminatory prior to the ratings' certification. Furthermore, DOD's process does not include a review of all rating results to identify any anomalies. In fact, not all pay pool panels conduct 100 percent reviews of employee appraisals and assessments. 
As we noted in our report, some panels may review a sample of employees' appraisals and assessments. Moreover, we found that only one quarter of the pay pools we visited analyzed the predecisional results of the rating distribution according to demographics. As a result, DOD is inconsistently taking steps to implement this safeguard. DOD further commented that the rating reconsideration process and the Equal Employment Opportunity complaint process serve as another means for ensuring fairness in ratings. While we believe the reconsideration and complaint processes are an important part of the system, they do not take the place of predecisional reviews to identify potential anomalies or significant variances before ratings are finalized. DOD also stated that while demographic and other analyses can be used to ensure the process is fair and equitable, such analyses should be done after the ratings are finalized--noting that predecisional analysis may have detrimental effects on the credibility of the system. We agree with DOD that analyses done after the ratings are finalized are important and that any predecisional analyses should not be used to manipulate the results to achieve some type of parity among various groups of employees. However, we continue to believe that identifying an anomaly in the ratings prior to finalizing those ratings would allow management to investigate the situation and determine whether any non-merit-based factors contributed to the rating results. We disagree with DOD that a predecisional analysis could have detrimental effects on the credibility of the system. As we noted in the report, the purpose of this predecisional analysis is not to change the results to portray an "ideal" distribution, to alter the outcome of the performance management process, or to change the rating results unless a mistake was identified. Instead, we stated that the predecisional analysis could enable management to identify any potentially egregious decisions or investigate any potential problems, such as blatant discrimination, in a transparent manner before finalizing the ratings. We are sending copies of this report to the appropriate congressional committees. We will make copies available to others upon request. This report will be available at no charge on GAO's Web site at [hyperlink, http://www.gao.gov]. If you or your staff have any questions about this report, please contact me at (202) 512-3604 or by e-mail at farrellb@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to the report are listed in appendix VI. Signed by: Brenda S. Farrell: Director, Defense Capabilities and Management: List of Congressional Committees: The Honorable Carl Levin: Chairman: The Honorable John McCain: Ranking Member: Committee on Armed Services: United States Senate: The Honorable Joseph I. Lieberman: Chairman: The Honorable Susan M. Collins: Ranking Member: Committee on Homeland Security and Governmental Affairs: United States Senate: The Honorable Ike Skelton: Chairman: The Honorable Duncan Hunter: Ranking Member: Committee on Armed Services: House of Representatives: The Honorable Henry A. 
Waxman: Chairman: The Honorable Tom Davis: Ranking Member: Committee on Oversight and Government Reform: House of Representatives: [End of section] Appendix I: Scope and Methodology: In conducting this review, we limited our scope to the performance management aspect of the National Security Personnel System (NSPS). Therefore, we addressed neither performance management of the Senior Executive Service at the Department of Defense (DOD) nor other aspects of NSPS, such as classification and pay. Determination of Implementation of Internal Safeguards and Accountability Mechanisms: To determine the extent to which DOD has implemented safeguards to ensure that NSPS's performance management system is fair, effective, and credible, we used the following internal safeguards and accountability mechanisms, which were either specified in the National Defense Authorization Act for Fiscal Year 2008 or identified in our previous work on pay for performance management systems in the federal government:[Footnote 46] * Involve employees, their representatives, and other stakeholders in the design of the system, to include employees directly involved in validating any related implementation of the system. * Assure that the agency's performance management systems link employee objectives to the agency's strategic plan, related goals, and desired outcomes. * Provide adequate training and retraining for supervisors, managers, and employees in the implementation and operation of the performance management system. * Provide a process for ensuring ongoing performance feedback and dialogue between supervisors, managers, and employees throughout the appraisal period, and for setting timetables for review. * Implement a pay-for-performance evaluation system to better link individual pay to performance, and provide an equitable method for appraising and compensating employees. * Assure that certain predecisional internal safeguards exist to help achieve the consistency, equity, nondiscrimination, and nonpoliticization of the performance management process; such safeguards include an independent reasonableness review by a third party or reviews of performance rating decisions, pay determinations, and promotions before they are finalized to ensure that they are merit-based, as well as consideration by pay panels of the results of the performance appraisal process and other information in connection with final pay decisions. * Assure that there are reasonable transparency and appropriate accountability mechanisms in connection with the results of the performance management process, to include reporting periodically on internal assessments and employee survey results relating to performance management and individual pay decisions while protecting individual confidentiality. * Assure that performance management results in meaningful distinctions in individual employee performance. * Provide a means for ensuring that adequate agency resources are allocated for the design, implementation, and administration of the performance management system. To assess the implementation of these safeguards and accountability mechanisms, we analyzed regulations and other guidance provided by officials in DOD and the four components' headquarters--the Army, Navy, Air Force, and Fourth Estate.[Footnote 47] We also reviewed documents, such as pay pool business rules, and regulations and training instructions obtained during 12 site visits and meetings with component-level program offices. 
Within DOD, we interviewed the Under Secretary of Defense for Personnel and Readiness, Arlington, Virginia, as well as officials at: * the Program Executive Office (PEO), Arlington, Virginia; * Equal Employment Opportunity Office, Arlington, Virginia; * Department of the Army, NSPS Program Management Office, Alexandria, Virginia; * Department of the Navy, NSPS Program Office, Washington, D.C.; * Marine Corps NSPS Program Management Office, Quantico, Virginia; * Department of the Air Force, NSPS Program Office, Arlington, Virginia; * Department of the Air Force, Air Force Personnel Center, Randolph Air Force Base, Texas; and: * Fourth Estate NSPS Program Management Office, Arlington, Virginia. We also interviewed appropriate officials across all four components, at 12 installations total. To allow for appropriate representation by each component, we visited 3 installations per component and selected the sites because they (1) contained a large number of civilian employees under NSPS and (2) were geographically dispersed throughout the United States. Specifically, we visited the following 12 installations: * Redstone Arsenal, Alabama; * Fort Sam Houston, Texas; * Fort Huachuca, Arizona; * Joint Warfare Analysis Center, Virginia; * Naval Facilities Headquarters, D.C.; * Marine Corps Tactical Systems Support Activity, California; * Randolph Air Force Base, Texas; * Tinker Air Force Base, Oklahoma; * March Air Reserve Base, California; * Defense Microelectronics Activity, California; * Defense Threat Reduction Agency, Virginia; and: * Department of Defense Inspector General, Virginia. At the installations we visited, we interviewed the Performance Review Authority,[Footnote 48] the pay pool manager, pay pool panel members, rating officials, and the NSPS program officer or transition manager. We compared and contrasted information extracted from the interviews regarding the implementation of the safeguards. We supplemented this testimonial evidence with policies and procedures, lessons learned, and other documents we obtained. We then identified how and at which installations each of the safeguards had been implemented. We also obtained and analyzed the rating and share distributions for each of the 12 installations visited and compared the distributions to those of the components and DOD-wide (a simplified, hypothetical illustration of this type of comparison appears below). Further, we analyzed documents on NSPS and performance management published by the Office of Personnel Management, Washington, D.C., and interviewed appropriate officials at this agency. We also analyzed reports on performance management and NSPS published by the U.S. Merit Systems Protection Board, the Congressional Research Service, and GAO. Finally, we interviewed a representative from the American Federation of Government Employees as well as the coalition of DOD unions and analyzed relevant legal documents, such as the outcomes of NSPS lawsuits. Determination of Civilian Personnel's Perceptions of NSPS: To determine how DOD civilian employees perceive NSPS, we analyzed two sources of employee perceptions or attitudes. First, we analyzed the results of DOD's survey of civilian employees. Second, we conducted small group discussions with DOD civilian employees who had converted to NSPS and administered a short questionnaire to discussion group participants to collect information on their background, tenure with the federal service and DOD, and attitudes toward NSPS. 
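The following minimal sketch illustrates the type of rating and share distribution comparison described above, in which an installation's distribution of NSPS ratings (levels 1 through 5) is converted to percentage shares and placed beside component- and DOD-wide distributions. The sketch is written in Python; every name and count in it is invented for illustration only and does not represent data from this review or the tools GAO actually used:

# Illustrative sketch only: all counts below are hypothetical, not report data.
def rating_shares(counts):
    """Convert raw rating counts by level to percentage shares."""
    total = sum(counts.values())
    return {level: round(100.0 * n / total, 1) for level, n in counts.items()}

# Hypothetical rating counts by level (1 = lowest, 5 = highest).
installation = {1: 2, 2: 10, 3: 160, 4: 55, 5: 8}
component = {1: 30, 2: 150, 3: 2400, 4: 800, 5: 120}
dod_wide = {1: 300, 2: 1500, 3: 24000, 4: 8000, 5: 1200}

for name, counts in [("Installation", installation),
                     ("Component", component),
                     ("DOD-wide", dod_wide)]:
    shares = rating_shares(counts)
    summary = ", ".join(f"level {lvl}: {shares[lvl]}%" for lvl in sorted(shares))
    print(f"{name}: {summary}")

[End of example]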
Analysis of DOD Survey Results: We analyzed employee responses to DOD's Status of Forces Survey (SOFS) of civilian employees--including the May 2006, November 2006, and May 2007 administrations--to gauge employee attitudes toward NSPS and performance management in general and to identify early indications of movement in employee perceptions. The Defense Manpower Data Center (DMDC) within DOD has conducted large-scale, departmentwide surveys of active military personnel since 2002, called the Status of Forces Active Duty Survey. DMDC has also conducted surveys of reserve military personnel for DOD (Status of Forces Reserve survey). GAO has reviewed the survey results from prior active and reserve military personnel surveys and found them sufficiently reliable to use for several GAO engagements.[Footnote 49] DMDC has conducted DOD-wide surveys of civilian employees since October 2003. The SOFS for civilian employees was created to measure the attitudes and opinions of these employees. The survey was developed to satisfy the requirement[Footnote 50] to assess, among other things, employee satisfaction with leadership policies and practices; work environment; and rewards and recognition for professional accomplishment and personal contributions to achieving organizational mission. According to DOD, the May 2006 SOFS for civilian employees was the first to capture the attitudes of civilian employees under NSPS. The May 2007 survey was administered from May 7 to June 15, 2007, to more than 102,000 DOD civilian employees.[Footnote 51] Review of Statistical Validity of DOD's Survey of Civilians: To review whether DOD's surveys of civilians were appropriately designed and statistically valid, a team made up of GAO social science analysts with survey research expertise and GAO's Chief Statistician (1) reviewed relevant documentation provided by DMDC regarding the survey methods used in its surveys of DOD civilians, (2) interviewed DMDC officials who had knowledge of or were involved in the development and administration of the DMDC surveys of civilians, and (3) reviewed the results for selected NSPS questions from the May and November 2006 and May 2007 surveys of DOD civilians. We determined that the survey results are sufficiently reliable for the purposes of this report; however, we identified areas for improvement. Based on the documentation of DMDC's surveys of DOD civilians, we concluded that they were generally conducted in accordance with standard research practices. The civilian survey sample design, which determined which DOD civilians were selected for the survey, was reasonable and allowed for making appropriate comparisons between groups of civilians who are in the NSPS system and the rest of DOD civilians. It also distinguished between groups of NSPS employees who entered into the new performance system at different times (spirals 1.1, 1.2, 1.3), allowing for appropriate statistical comparisons between groups over time. The development of the full list of DOD civilians from which to sample, or the population sampling frame, was reasonable and did not appear to suffer from any significant under- or overcoverage of the target population. 
The design of the sample and the related survey respondent selection methods were appropriate to develop statistically valid survey estimates.[Footnote 52] Based on the reported percentage of the sample of DOD civilians who were located (96.67 percent), it appears that respondent contact information for the sample was adequate, allowing the survey to reach respondents at a high rate. Generally, DOD civilian questionnaires were appropriately designed. The questions that are specifically related to the NSPS are developed through a process whereby PEO officials review and suggest questions to DMDC survey officials. PEO develops new questions and alternative wordings for existing questions based on NSPS employee input through focus groups as well as PEO officials' observations about the program. DMDC's survey researchers then work with questions provided by PEO staff to revise them as necessary for balance and clarity. The survey was implemented via the Web and response follow-up activities made it possible for them to reach response rates comparable to other governmentwide surveys. Weighted response rates for the SOFS for civilian employees were: 59 percent in May 2006, 55 percent in November 2006, and 59 percent in May 2007. Similarly, the 2006 Federal Human Capital Survey, a large, stratified random sample survey of civilian government employees conducted every 2 years, achieved a response rate of 57 percent. To address nonresponse, the survey estimates incorporated appropriate statistical weighting techniques. Although the NSPS question items were developed using input from NSPS program officials and employees in the new system, the wording of questions was not pretested using cognitive interviewing techniques to assess clarity and comprehension and to minimize the risk of differing interpretations by those completing the questionnaires. Cognitive testing of survey items is a good practice used by survey researchers. We understand that survey researchers need to balance revising questions for validity and data quality with the need for survey questions that can be compared over time. Nonetheless, some of the question-stem wording and response categories could be improved. For instance, some questions' stems are worded positively and then respondents are asked to respond to an "agree/disagree" scale. For example, one question is worded, "To what extent do you agree or disagree with the following statements? The performance appraisal system I am under improves organizational performance," and the response options are: "Strongly agree, Agree, Neither agree nor disagree, Disagree, Strongly disagree." This question is worded positively and the scale of responses includes both positive and negative responses. It is difficult to interpret a response of "Disagree" or "Strongly disagree." A respondent selecting one of these options disagrees with the statement presented, but we cannot determine whether they believe that the system has no influence on organizational performance or whether they believe that organizational performance is worse because of the system. In addition, DMDC has not had a group of external experts review established survey practices for suggestions and recommendations, which is a best practice in survey research. Expert review for other large- scale federal surveys sometimes takes the form of advisory oversight boards, some of whose members have methodological expertise that allows them to make suggestions about the survey processes and particular projects. 
Lastly, DMDC does not perform nonresponse analysis to clarify whether those who did not respond to the survey might have provided substantively different answers than those who did respond. The level of nonresponse warrants using at least some of the methods available for assessing whether nonresponse bias might under- or overrepresent some respondent views on survey questions. For instance, it is conceivable that employees' ratings might influence whether or not they are likely to reply to this survey, making it possible that some views are not reflected in the survey estimates for some questions, particularly for NSPS questions. The survey results could be interpreted more confidently if nonresponse analysis were done to establish whether or not it is likely that there are any systematic biases due to some civilians being more or less likely to respond to the survey. Discussion Groups: We conducted 3 discussion groups with civilian employees at each of the 12 sites we visited, for a total of 36 discussion groups. Our overall objective in using the discussion group approach was to obtain employees' perceptions about NSPS and its implementation thus far. Discussion groups, which are similar in nature and intent to focus groups, involve structured small group discussions that are designed to obtain in-depth information about specific issues--information that cannot easily be obtained from a set of individual interviews. From each location, we obtained lists of employees and information on their length of employment and supervisory status. We divided these lists into three groups: employees with 0 to 5 years of service, employees with 6 or more years of service, and supervisors. We randomly selected 20 employees from each of these three groups.[Footnote 53] The employee names and a standard invitation were supplied to our points of contact to disseminate to employees. At the majority of locations, we reached our goal of meeting with 8 to 12 employees in each discussion group; however, since participation was not compulsory, in some instances we did not reach the recommended 8 participants in the group. Discussions were held in a semistructured manner, led by a moderator who followed a standardized list of questions. The discussions were documented by one or two other analysts at each location. Scope of Our Discussion Groups: In conducting our discussion groups, our intent was to achieve saturation--the point at which we were no longer hearing new information. As noted, we conducted 36 discussion groups with three classifications of DOD civilian employees at the 12 DOD installations we visited (see table 5). Our design allowed us to identify differences, if any, in employee perceptions held by supervisors and employees with different lengths of employment. Discussion groups were conducted between November 2007 and March 2008. Table 5: Composition of Discussion Groups: Location: Air Force; March Air Reserve Base, California; Employees with 0-5 years of experience: 8; Employees with 6 or more years of experience: 12; Supervisors: 13; Total participants in discussion groups: 33; Total NSPS employees at location[A]: 275. Location: Air Force; Randolph Air Force Base, Texas; Employees with 0-5 years of experience: 9; Employees with 6 or more years of experience: 9; Supervisors: 10; Total participants in discussion groups: 28; Total NSPS employees at location[A]: 1,487. 
Location: Air Force; Tinker Air Force Base, Oklahoma; Employees with 0-5 years of experience: 13; Employees with 6 or more years of experience: 15; Supervisors: 7; Total participants in discussion groups: 35; Total NSPS employees at location[A]: 2,538. Location: Army; Redstone Arsenal, Alabama; Employees with 0-5 years of experience: 11; Employees with 6 or more years of experience: 10; Supervisors: 11; Total participants in discussion groups: 32; Total NSPS employees at location[A]: 1,108. Location: Army; Fort Huachuca, Arizona; Employees with 0-5 years of experience: 9; Employees with 6 or more years of experience: 10; Supervisors: 10; Total participants in discussion groups: 29; Total NSPS employees at location[A]: 673. Location: Army; Fort Sam Houston, Texas; Employees with 0-5 years of experience: 7; Employees with 6 or more years of experience: 7; Supervisors: 7; Total participants in discussion groups: 21; Total NSPS employees at location[A]: 1,190. Location: Navy; Joint Warfare Analysis Center, Dahlgren, Virginia; Employees with 0-5 years of experience: 12; Employees with 6 or more years of experience: 11; Supervisors: 11; Total participants in discussion groups: 34; Total NSPS employees at location[A]: 467. Location: Navy; Naval Facilities Headquarters, Navy Yard, D.C.; Employees with 0-5 years of experience: 12; Employees with 6 or more years of experience: 7; Supervisors: 7; Total participants in discussion groups: 26; Total NSPS employees at location[A]: 3,057. Location: Navy; Marine Corps Tactical Systems Support Activity, Camp Pendleton, California; Employees with 0-5 years of experience: 9; Employees with 6 or more years of experience: 11; Supervisors: 12; Total participants in discussion groups: 32; Total NSPS employees at location[A]: 183. Location: Fourth Estate; Defense Microelectronics Activity, California; Employees with 0-5 years of experience: 8; Employees with 6 or more years of experience: 15; Supervisors: 11; Total participants in discussion groups: 34; Total NSPS employees at location[A]: 121. Location: Fourth Estate; DOD Inspector General, Arlington, Virginia; Employees with 0-5 years of experience: 7; Employees with 6 or more years of experience: 16; Supervisors: 8; Total participants in discussion groups: 31; Total NSPS employees at location[A]: 1,316. Location: Fourth Estate; Defense Threat Reduction Agency, Ft. Belvoir, Virginia; Employees with 0-5 years of experience: 4; Employees with 6 or more years of experience: 10; Supervisors: 7; Total participants in discussion groups: 21; Total NSPS employees at location[A]: 616. Location: Fourth Estate; Total; Employees with 0-5 years of experience: 109; Employees with 6 or more years of experience: 133; Supervisors: 114; Total participants in discussion groups: 357[B]; Total NSPS employees at location[A]: 13,031. Source: GAO. [A] The totals listed include employees spiraled into NSPS as of February 2008. [B] For one questionnaire we received, the location was not provided by the respondent; therefore, the total participants for all discussion groups above sums up to 356. However, we received questionnaires from 357 participants. [End of table] Methodology of Our Discussion Groups: A discussion guide was developed to facilitate the discussion group moderator in leading the discussions. 
The guide helped the moderator address several topics related to civilian employees' perceptions of the performance management system, including their overall perception of NSPS and the rating process, the training they received on NSPS, the communication they have with their supervisor, positive aspects of NSPS, and any changes they would make to NSPS, among others. Each discussion group began with the moderator greeting the participants, describing the purpose of the study, and explaining the procedures for the discussion group. Participants were assured that all of their comments would be discussed in the aggregate or as part of larger themes that emerged. The moderator asked participants open-ended questions related to NSPS. All discussion groups were moderated by a GAO analyst, while at least one other GAO analyst observed the discussion group and took notes. After each discussion group, the moderator and note taker reviewed the notes from the session to ensure that all comments were captured accurately. Content Analysis: We performed content analysis of our discussion group sessions in order to identify the themes that emerged during the sessions and to summarize participant perceptions of NSPS. We reviewed responses from several of the discussion groups and created a list of themes and subthemes. We then reviewed the comments from each of the 36 discussion groups and assigned each comment to the appropriate subtheme category; each assignment was agreed upon by two analysts. If agreement was not reached on a comment's placement in a category, another analyst reconciled the issue by placing the comment in one or more of the categories. The responses in each category were then used in our evaluation and discussion of how civilian employees perceive NSPS. Limitations: Discussion groups are not designed to (1) demonstrate the extent of a problem or to generalize the results to a larger population, (2) develop a consensus to arrive at an agreed-upon plan or make decisions about what actions to take, or (3) provide statistically representative samples or reliable quantitative estimates. Instead, discussion groups are intended to provide in-depth information about participants' reasons for holding certain attitudes about specific topics and to offer insights into the range of concerns and support for an issue. Specifically, the projectability of the information obtained during our discussion groups is limited for two reasons. First, the information gathered during our discussion groups on NSPS represents the responses of only the civilian employees present in our 36 discussion groups. The experiences of other civilian employees under NSPS who did not participate in our discussion groups may have varied. Second, while the composition of our discussion groups was designed to assure a distribution of civilian employees under NSPS, our sampling did not take into account any other demographic or job-specific information. Rather, our groups were determined solely on the basis of the employee's supervisor or nonsupervisor classification and the employee's length of service with DOD. Use of a Questionnaire to Supplement Discussion Group Findings: We administered a questionnaire to discussion group participants to obtain further information on their background and perceptions of NSPS. The questionnaire was administered to and received from 357 participants of our discussion groups. 
The purpose of our questionnaire was to (1) collect demographic data from participants so that we could report with whom we spoke (see table 6); (2) collect information from participants that could not easily be obtained through discussion, e.g., information participants may have been uncomfortable sharing in a group setting; and (3) collect some of the same data found in past DOD surveys. Specifically, the questionnaire included questions designed to obtain employees' perceptions of NSPS as compared to their previous personnel system; the accuracy with which they felt their ratings reflected their performance; and management's methods for conveying individual and group rating information. Since the questionnaire was used to collect supplemental information and was administered solely to the participants of our discussion groups, the results represent the opinions of only those employees who participated in our discussion groups. Therefore, the results of our questionnaire cannot be generalized across the population of DOD civilian employees. Table 6: Composition of Discussion Groups by Demographic Category per Component: Category: Male; Service: Air Force: 64; Service: Army: 50; Service: Navy: 62; Service: Fourth Estate: 55; Service: Total: 231. Category: Female; Service: Air Force: 32; Service: Army: 32; Service: Navy: 30; Service: Fourth Estate: 31; Service: Total: 125. Category: Total; Service: Air Force: 96; Service: Army: 82; Service: Navy: 92; Service: Fourth Estate: 86; Service: Total: 356[A]. Category: American Indian or Alaskan Native; Service: Air Force: 2; Service: Army: 1; Service: Navy: 0; Service: Fourth Estate: 0; Service: Total: 3. Category: Asian; Service: Air Force: 13; Service: Army: 2; Service: Navy: 6; Service: Fourth Estate: 7; Service: Total: 28. Category: Black/African American; Service: Air Force: 7; Service: Army: 16; Service: Navy: 7; Service: Fourth Estate: 17; Service: Total: 47. Category: Hispanic; Service: Air Force: 5; Service: Army: 1; Service: Navy: 1; Service: Fourth Estate: 8; Service: Total: 15. Category: Native Hawaiian or other Pacific Islander; Service: Air Force: 2; Service: Army: 0; Service: Navy: 2; Service: Fourth Estate: 1; Service: Total: 5. Category: White; Service: Air Force: 65; Service: Army: 56; Service: Navy: 71; Service: Fourth Estate: 48; Service: Total: 240. Category: Missing or Non-response; Service: Air Force: 2; Service: Army: 6; Service: Navy: 5; Service: Fourth Estate: 5; Service: Total: 18. Category: Total; Service: Air Force: 96; Service: Army: 82; Service: Navy: 92; Service: Fourth Estate: 86; Service: Total: 356[A]. Source: GAO. [A] Participants voluntarily self-reported demographic information in our questionnaire; some participants did not provide responses for all demographic questions. In addition, participants could select more than one response category for the ethnic and racial questions. Further, this table does not include results from one questionnaire, because we were unable to determine the service with which it was associated. Therefore, totals may not match the overall total of 357 participants. [End of table] We conducted our review from August 2007 to July 2008 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient and appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. [End of section] Appendix II: Example of Linking Performance to Mission and Objectives: This slide was one of many presented to employees at Fort Sam Houston as part of a briefing titled "Garrison Action Plan and IPB." The slide is designed to show employees that their work and performance are directly aligned with the organization's mission goals. Specifically, this chart shows National Security Personnel System (NSPS) as the foundation or bottom of the pyramid leading up to the command's strategic plan. Further, the chart was designed to show employees that their individual objectives--which were to be "SMART"--should connect to the garrison's action plan and ultimately to the strategic plan. Figure 4: Example of Linking Performance to Mission and Objectives: [See PDF for image] This figure illustrates an example of linking performance to mission and objectives, as follows: Performance Plan is really the foundation: Illustration of a pyramid with four levels: Base level: NSPS individual performance plans and GS/GM support forms; Second level: Operating plans (METL/CLS); * Appropriate to your position: - Garrison mission; director; office chief; - METL task; Third level: Garrison action plan: * SMART: Strategic action plan (Specific, Measurable, Aligned, Realistic and relevant, Timed); - Objective/initiative; - Metric; Top level: IMCOM Strat plan; * Appropriate to your position: - Garrison mission; director; office chief; - METL task; Part B: Relevant organizational mission/strategic goals. Source: DOD. [End of figure] [End of section] Appendix III: Additional Responses to 2007 Status of Forces Survey of DOD Civilian Employees: In addition to the responses to the Department of Defense's (DOD) Status of Forces Survey-Civilian (SOFS-C) we presented on page 30, we also identified other employee responses related to the National Security Personnel System (NSPS) or performance management from the 2007 SOFS-C. The survey asked DOD civilian employees questions on various topics, such as overall satisfaction, leadership and management, retention, personnel actions, motivation/development/ involvement, performance management, and the NSPS. The following tables provide estimated percentage of employee responses. Table 7: Estimated Percentage of Employees Responding to Questions about Overall Satisfaction and Leadership and Management in May 2007 Status of Forces Survey-Civilian: Question: Overall satisfaction; Overall satisfaction with pay; Employee description: DOD; Response: Satisfied: 62%; Response: Neither: 16%; Response: Dissatisfied: 20%. Question: Overall satisfaction; Overall satisfaction with pay; Employee description: NSPS; Response: Satisfied: 63; Response: Neither: 16; Response: Dissatisfied: 19. Question: Leadership and management; Overall, how satisfied are you with management at your organization?; Employee description: DOD; Response: Satisfied: 49; Response: Neither: 25; Response: Dissatisfied: 26. Question: Leadership and management; Overall, how satisfied are you with management at your organization?; Employee description: NSPS; Response: Satisfied: 54; Response: Neither: 22; Response: Dissatisfied: 24. Question: Leadership and management; How satisfied are you with the policies and practices of your senior leaders?; Employee description: DOD; Response: Satisfied: 49; Response: Neither: 27; Response: Dissatisfied: 25. 
Question: Leadership and management; How satisfied are you with the policies and practices of your senior leaders?; Employee description: NSPS; Response: Satisfied: 52; Response: Neither: 25; Response: Dissatisfied: 24. Question: Leadership and management; How satisfied are you with the recognition you receive for doing a good job?; Employee description: DOD; Response: Satisfied: 48; Response: Neither: 26; Response: Dissatisfied: 26. Question: Leadership and management; How satisfied are you with the recognition you receive for doing a good job?; Employee description: NSPS; Response: Satisfied: 50; Response: Neither: 25; Response: Dissatisfied: 24. Source: GAO analysis of DOD data. Note: The estimated percentages are based on a 95 percent confidence interval and the margin of error is within +/-2 percent. The response categories are collapsed for positive ("satisfied") and negative ("dissatisfied") responses. That is, "satisfied" is the estimated percentage of employees who responded either "satisfied" or "very satisfied," while "dissatisfied" is the estimated percentage of employees who responded either "dissatisfied" or "very dissatisfied." [End of table] Table 8: Estimated Percentage of Employees Responding to Questions about Leadership and Management, Motivation/Development/Involvement, and Performance Management in May 2007 Status of Forces Survey- Civilian: Question: Leadership and management; I have trust and confidence in my supervisor; Employee description: DOD; Agree: 62%; Neither: 19%; Disagree: 19%. Question: Leadership and management; I have trust and confidence in my supervisor; Employee description: NSPS; Agree: 66; Neither: 17; Disagree: 17. Question: Leadership and management; Managers/supervisors deal effectively with reports of prejudice and discrimination; Employee description: DOD; Agree: 52; Neither: 34; Disagree: 13. Question: Leadership and management; Managers/supervisors deal effectively with reports of prejudice and discrimination; Employee description: NSPS; Agree: 56; Neither: 33; Disagree: 10. Question: Leadership and management; Managers/supervisors/team leaders work well with employees of different backgrounds; Employee description: DOD; Agree: 61; Neither: 21; Disagree: 17. Question: Leadership and management; Managers/supervisors/team leaders work well with employees of different backgrounds; Employee description: NSPS; Agree: 67; Neither: 18; Disagree: 14. Question: Leadership and management; Managers review and evaluate the organization's progress toward meeting its goals and objectives; Employee description: DOD; Agree: 58; Neither: 23; Disagree: 17. Question: Leadership and management; Managers review and evaluate the organization's progress toward meeting its goals and objectives; Employee description: NSPS; Agree: 62; Neither: 21; Disagree: 16. Question: Leadership and management; I have a high level of respect for my organization's senior leaders; Employee description: DOD; Agree: 51; Neither: 24; Disagree: 25. Question: Leadership and management; I have a high level of respect for my organization's senior leaders; Employee description: NSPS; Agree: 56; Neither: 22; Disagree: 22. Question: Leadership and management; In my organization, leaders generate high levels of motivation and commitment in the workforce; Employee description: DOD; Agree: 41; Neither: 28; Disagree: 31. 
Question: Leadership and management; In my organization, leaders generate high levels of motivation and commitment in the workforce; Employee description: NSPS; Agree: 45; Neither: 27; Disagree: 28. Question: Motivation and morale; To what extent do you agree or disagree with the following statement?; I know how my work relates to the agency's goals and priorities; Employee description: DOD; Agree: 82; Neither: 12; Disagree: 5. Question: Motivation and morale; To what extent do you agree or disagree with the following statement?; I know how my work relates to the agency's goals and priorities; Employee description: NSPS; Agree: 83; Neither: 11; Disagree: 5. Question: Performance management; Performance appraisal is fair reflection of performance; Employee description: DOD; Agree: 66; Neither: 18; Disagree: 14. Question: Performance management; Performance appraisal is fair reflection of performance; Employee description: NSPS; Agree: 68; Neither: 18; Disagree: 12. Question: Performance management; Creativity and innovation are rewarded; Employee description: DOD; Agree: 39; Neither: 31; Disagree: 28. Question: Performance management; Creativity and innovation are rewarded; Employee description: NSPS; Agree: 45; Neither: 29; Disagree: 24. Question: Performance management; Promotions in work unit are based on merit; Employee description: DOD; Agree: 33; Neither: 29; Disagree: 35. Question: Performance management; Promotions in work unit are based on merit; Employee description: NSPS; Agree: 41; Neither: 28; Disagree: 27. Question: Performance management; In most recent appraisal, I understood what I had to do to be rated at different performance levels; Employee description: DOD; Agree: 65; Neither: 16; Disagree: 14. Question: Performance management; In most recent appraisal, I understood what I had to do to be rated at different performance levels; Employee description: NSPS; Agree: 65; Neither: 16; Disagree: 15. Question: Performance management; Performance standards/expectations are directly related to the organization's mission; Employee description: DOD; Agree: 66; Neither: 24; Disagree: 9. Question: Performance management; Performance standards/expectations are directly related to the organization's mission; Employee description: NSPS; Agree: 71; Neither: 21; Disagree: 8. Question: Performance management; My bonus and cash awards depend on how well I perform my job; Employee description: DOD; Agree: 52; Neither: 20; Disagree: 27. Question: Performance management; My bonus and cash awards depend on how well I perform my job; Employee description: NSPS; Agree: 61; Neither: 19; Disagree: 20. Question: Performance management; My current performance appraisal system motivates me to perform well; Employee description: DOD; Agree: 47; Neither: 31; Disagree: 22. Question: Performance management; My current performance appraisal system motivates me to perform well; Employee description: NSPS; Agree: 47; Neither: 30; Disagree: 23. Question: Performance management; The people I work with cooperate to get the job done; Employee description: DOD; Agree: 74; Neither: 16; Disagree: 10. Question: Performance management; The people I work with cooperate to get the job done; Employee description: NSPS; Agree: 79; Neither: 13; Disagree: 8. Question: Performance management; The performance appraisal system I am under improves organizational performance; Employee description: DOD; Agree: 31; Neither: 40; Disagree: 29. 
Question: Performance management; The performance appraisal system I am under improves organizational performance; Employee description: NSPS; Agree: 28; Neither: 41; Disagree: 31. Source: GAO analysis of DOD data. Note: The estimated percentages are based on a 95 percent confidence interval and the margin of error is within +/-2 percent. The response categories are collapsed for positive ("agree") and negative ("disagree") responses. That is, "agree" is the estimated percentage of employees who responded either "agree" or "strongly agree," while "disagree" is the estimated percentage of employees who responded either "disagree" or "strongly disagree." [End of table] Table 9: Estimated Percentage of Employees Responding to Question about Performance Management in May 2007 Status of Forces Survey-Civilian: Question: Performance management; How useful is feedback?; Employee description: DOD; Useful: 68%; Neither: 25%; Useless: 8%. Question: Performance management; How useful is feedback?; Employee description: NSPS; Useful: 68; Neither: 25; Useless: 7. Source: GAO analysis of DOD data. Note: The estimated percentages are based on a 95 percent confidence interval and the margin of error is within +/-2 percent. The response categories are collapsed for positive ("useful") and negative ("useless") responses. That is, "useful" is the estimated percentage of employees who responded either "useful" or "very useful," while "useless" is the estimated percentage of employees who responded either "useless" or "very useless." [End of table] Table 10: Estimated Percentage of Employees Responding to Question about Performance Management in May 2007 Status of Forces Survey- Civilian: Question: Do you receive performance feedback?; Employee description: DOD; Yes, regularly throughout year: 33%; Yes,occasionally or at least once during the year: 50%; No: 16%. Question: Do you receive performance feedback?; Employee description: NSPS; Yes, regularly throughout year: 36; Yes,occasionally or at least once during the year: 51; No: 13. Source: GAO analysis of DOD data. Note: The estimated percentages are based on a 95 percent confidence interval and the margin of error is within +/-2 percent. [End of table] Table 11: Estimated Percentage of Employees Responding to Questions about Retention and Commitment in May 2007 Status of Forces Survey- Civilian: Question: Retention and commitment; How likely is it that you will leave at the next available opportunity to take another job in the federal government outside of the DOD?; Employee description: DOD; Likely: 33%; Neither: 21%; Unlikely: 46%. Question: Retention and commitment; How likely is it that you will leave at the next available opportunity to take another job in the federal government outside of the DOD?; Employee description: NSPS; Likely: 34; Neither: 20; Unlikely: 47. Source: GAO analysis of DOD data. Note: The estimated percentages are based on a 95 percent confidence interval and the margin of error is within +/-1 percent. The response categories are collapsed for positive ("likely") and negative ("unlikely") responses. That is, "likely" is the estimated percentage of employees who responded either "likely" or "very likely," while "unlikely" is the estimated percentage of employees who responded either "unlikely" or "very unlikely." 
[End of table] Table 12: Estimated Percentage of Employees Responding to Questions about the National Security Personnel System in May 2007 Status of Forces Survey-Civilian: Question: National Security Personnel System: Usefulness of NSPS training; NSPS performance management for managers/supervisors; Employee description: National Security Personnel System: DOD; Useful: National Security Personnel System: 71%; Neither: National Security Personnel System: 22%; Useless: National Security Personnel System: 7%. Question: National Security Personnel System: Usefulness of NSPS training; NSPS performance management for managers/supervisors; Employee description: National Security Personnel System: NSPS; Useful: National Security Personnel System: 72; Neither: National Security Personnel System: 21; Useless: National Security Personnel System: 7. Question: National Security Personnel System: Usefulness of NSPS training; human resources elements for managers, supervisors, and employees; Employee description: National Security Personnel System: DOD; Useful: National Security Personnel System: 70; Neither: National Security Personnel System: 22; Useless: National Security Personnel System: 8. Question: National Security Personnel System: Usefulness of NSPS training; human resources elements for managers, supervisors, and employees; Employee description: National Security Personnel System: NSPS; Useful: National Security Personnel System: 70; Neither: National Security Personnel System: 22; Useless: National Security Personnel System: 8. Question: National Security Personnel System: Usefulness of NSPS training; NSPS performance management for employees; Employee description: National Security Personnel System: DOD; Useful: National Security Personnel System: 66; Neither: National Security Personnel System: 25; Useless: National Security Personnel System: 8. Question: National Security Personnel System: Usefulness of NSPS training; NSPS performance management for employees; Employee description: National Security Personnel System: NSPS; Useful: National Security Personnel System: 66; Neither: National Security Personnel System: 25; Useless: National Security Personnel System: 9. Question: National Security Personnel System: Usefulness of NSPS training; NSPS pay pool management; Employee description: National Security Personnel System: DOD; Useful: National Security Personnel System: 64; Neither: National Security Personnel System: 27; Useless: National Security Personnel System: 10. Question: National Security Personnel System: Usefulness of NSPS training; NSPS pay pool management; Employee description: National Security Personnel System: NSPS; Useful: National Security Personnel System: 62; Neither: National Security Personnel System: 27; Useless: National Security Personnel System: 10. Source: GAO analysis of DOD data. Note: The estimated percentages are based on a 95 percent confidence interval and the margin of error is within +/-2 percent. The response categories are collapsed for positive ("useful") and negative ("useless") responses. That is "useful" is the estimated percentage of employees who responded either "useful" or "very useful," while useless is the estimated percentage of employees who responded either "useless" or "very useless." 
[End of table] Table 13: Estimated Percentage of Employees Responding to Questions about the National Security Personnel System in May 2007 Status of Forces Survey-Civilian: Question: National Security Personnel System: Overall, what type of impact will NSPS have on personnel practices in the DOD?; Employee description: National Security Personnel System: DOD; Positive: National Security Personnel System: 25%; Neither: National Security Personnel System: 38%; Negative: National Security Personnel System: 36%. Question: National Security Personnel System: Overall, what type of impact will NSPS have on personnel practices in the DOD?; Employee description: National Security Personnel System: NSPS; Positive: National Security Personnel System: 28; Neither: National Security Personnel System: 34; Negative: National Security Personnel System: 38. Source: GAO analysis of DOD data. Note: The estimated percentages are based on a 95 percent confidence interval and the margin of error is within +/-1 percent. The response categories are collapsed for "positive" and "negative." That is, "positive" is the estimated percentage of employees who responded either "positive" or "very positive," while "negative" is the estimated percentage of employees who responded either "negative" or "very negative." [End of table] [End of section] Appendix IV: Other Themes Discussed by Department of Defense Civilians during GAO Discussion Groups: In addition to the themes that emerged during our discussion groups with select Department of Defense (DOD) civilian employees, which we presented on pages 34-39, we also identified other themes or topics that were discussed less prevalently by employees across all of the discussion groups we held. See table 14. Table 14: Additional Themes that Emerged during Discussion Groups with Select Employees: Job objectives: Employees at six sites we visited expressed concerns about the time and effort it took to develop their objectives. Specifically, employees said it was difficult to write "SMART" (Specific, Measurable, Aligned, Realistic/Relevant, Timed) objectives that adequately captured all aspects of their job at the start of the performance cycle because, according to them, their jobs were often unpredictable or involved unexpected tasks over the course of the year. Employees also expressed concern over the fact that their objectives had to be rewritten several times during the year to incorporate shifting job duties. Others felt that their objectives were written in such a manner that made them impossible to exceed. Teamwork: At six sites, employees we spoke with told us that NSPS has, and will continue to have, a negative impact on team work. Some participants in our discussion groups told us that employees no longer want to assist each other with their work because they are worried about getting credit for the work and would prefer to make themselves look better in front of management. Specifically, employees were concerned that taking the time to help others takes time away from their own work, which is the basis for their objectives, rating, and eventual payout. Thus, we heard from one employee that assisting others will "help out another's pocket" while financially disadvantaging them. Further, two employees we spoke with told us that employees are even keeping projects secret so that they can get credit for independently completing the project. 
However, at other locations we visited, participants discussed the use of the contributing factor, collaborating with others, to counter these employee concerns. Reconsideration: Employees at four sites expressed concern about the reconsideration process. Specifically, some employees in our discussion groups told us that even though they received information about the reconsideration process, they would not challenge a rating because they felt management would no longer view them as a team player. Further, some employees expressed concern that if they did challenge their rating, their supervisor or management would seek retribution during the next rating cycle. Other employees saw no benefit in challenging their rating because the disputed rating is reviewed by the same individuals who finalized the rating. Ratings: As discussed on pages 34-35, employees at all sites we visited expressed some concern over the rating process. In addition to the concerns previously discussed, employees raised additional concerns with the process through which their ratings were determined. Some discussion group participants we spoke with said that more granularity was needed in the rating distribution and share values. One employee suggested using a 1 to 10 scale, as a way to better distinguish among employee performance. Further, employees at four locations we visited told us they would prefer to have their rating separate from their pay increase. Specifically, one employee told us that a smaller payout tied to a "good" (i.e., role model) rating was preferable. One employee, in particular, told us that her performance is tied to her self-esteem, including the praise she receives from management, and she did not want it tied to money. Further, several participants told us that supervisors were hesitant or not inclined to give employees a rating other than a "3" because, for example, it required too much paperwork to give a lower rating or they did not want to be seen as the "bad guy." Still other participants expressed concern that the weights assigned to specific objectives be used to impact ratings. Timing of the cycle: Employees we spoke with at eight locations told us that the lag between when the rating period ends, when they submit their self-assessment, and when they actually receive their rating and their payout is too long. Some employees expressed concern that they were already several months into the next rating cycle and working towards new objectives before they received their rating and feedback from the previous rating cycle. One employee in particular told us that, although the command was almost 5 months into the rating cycle, employees at that location did not have their objectives finalized. The employee further told us that the prior year's discussions on objectives and midyear review sessions were held at the same time. In addition, some employees were concerned that there were too many competing priorities--holiday leave and budgetary requirements--at the end of the fiscal year when the ratings and pay panel process occurred. Control points: Employees we spoke with at five sites expressed confusion and discontent over the existence of "pay lanes," "pay caps," and/or "control points." Specifically, employees told us that they were unaware of the pay lanes, pay caps, and control points prior to the system's implementation and only learned of these pay constraints once they were under the system. 
Employees further told us that they had thought they could advance to the top of their pay band, potentially earning more money through their performance increases than they would have through General Schedule step increases. However, once the system was implemented, several discussion group participants learned that artificial pay constraints would not allow them to reach the top of the pay bands and that, upon reaching their pay caps, they would receive subsequent performance payouts as bonuses.

Other positive comments on NSPS: Discussion group participants at 11 of the 12 sites we visited spoke positively about certain aspects of NSPS. Specifically, some discussion group participants said that the initial design and intent of the system were good. For example, some employees commented on the system's ability to recognize performance. Employees told us that they liked that pay increases were based on performance and not on seniority, allowing them to receive pay increases faster than under the General Schedule. Other employees and supervisors told us that NSPS gives managers more flexibility to reward strong performers while allowing them to deal more effectively with poor performers. Finally, at some locations we heard that NSPS has increased the amount of communication between employees and their supervisors. For example, some discussion group participants have found that the process of drafting their self-assessment gave them the opportunity to point out accomplishments or activities to their supervisor that may have been overlooked. In addition, some discussion group participants have found that supervisors are providing their employees with more meaningful feedback on their performance.

Source: GAO analysis.

[End of table]

[End of section]

Appendix V: Comments from the Department of Defense:

Department of Defense:
National Security Personnel System:
Program Executive Office:
1400 Key Boulevard, Suite B200:
Arlington, VA 22209-5144:

August 18, 2008:

Ms. Brenda S. Farrell:
Director, Defense Capabilities and Management:
U.S. Government Accountability Office:
441 G Street, N.W.
Washington, DC 20548:

Dear Ms. Farrell:

This is the Department of Defense (DoD) response to your draft report, "Human Capital: DoD Needs to Improve Implementation of and Address Employee Concerns about Its National Security Personnel System," dated July 17, 2008 (GAO Code 351086/GAO-08-773). We thank you for the opportunity to review and comment on the draft report. While the Department does not concur with all of the findings and recommendations in the draft report, we believe it strikes a balance between the Department's efforts to design and operate the National Security Personnel System (NSPS) performance management system so it is fair and credible, and the workforce's early concerns about this new, rigorous, and consequential pay for performance approach. We appreciate your recognition of the many safeguards we have in place. As we have implemented NSPS, we have heard many of the same concerns as your auditors and have attempted to differentiate between those that warrant prompt action and those that reflect the uncertainty and skepticism that typically accompany major changes. We agree with your statement that organizational transformations such as NSPS require an adjustment period to gain employee acceptance and trust. NSPS transforms how the workforce is evaluated, compensated, and advanced along their career paths.
Your report acknowledges that such changes often take years to be fully successful, and we believe that to be the case with NSPS. The Federal Human Capital Survey shows us that even the best, long-established systems do not enjoy total workforce support. Most of the NSPS workforce your team met with were in the system for one year and experienced only one performance appraisal cycle. As employees, supervisors, and managers gain practical experience with this system and understand it better, we believe their confidence will grow. We base this on years of experience with personnel demonstration projects.

With respect to the system safeguards, the Department has taken great pains to design appropriate and effective safeguards to ensure that the performance management process is fair, equitable, and transparent. We recognize that these attributes are necessary to be credible in the eyes of the workforce. We continue to monitor these safeguards for credibility and effectiveness.

Again, thank you for the opportunity to comment on the draft report. Our responses to the recommendations for executive action include comments on major items we would like to clarify or correct. We have provided you technical corrections under separate cover. We appreciate the care your team took to understand and recognize the challenges in implementing and working under NSPS. If you have any questions regarding this response, please do not hesitate to contact me.

Sincerely,

Signed by:
Brad Bunn:
Program Executive Officer:

Enclosure:

GAO Draft Report Dated July 17, 2008:
GAO Code 351086/GAO-08-773:

"Human Capital: DOD Needs to Improve Implementation of and Address Employee Concerns about Its National Security Personnel System"

Department of Defense Responses to Recommendations:

Recommendation 1: The GAO recommends that the Secretary of Defense direct the National Security Personnel System Senior Executive to require a third party to perform pre-decisional demographic and other analysis as appropriate for pay pools.

DOD Response: Nonconcur. On pages 2 and 3, the draft report erroneously says that the National Defense Authorization Act for Fiscal Year 2008 (NDAA 08) specified that, among other safeguards, GAO assess the extent to which the system incorporated "certain pre-decisional internal safeguards ... to help achieve consistency, equity, nondiscrimination, and nonpoliticization of the performance management process, e.g., independent reasonableness reviews by a third party." Neither NDAA 08 nor the original statutory authority for NSPS prescribes such a safeguard. The draft report adds it among the criteria prescribed by section 9902(b)(7) of title 5, United States Code, for the NSPS performance management system. The NSPS pay pool process provides essential safeguards to ensure that the system adheres to merit principles, and that ratings and management of the system are fair, equitable, and based on employee performance. As the draft report notes, individual ratings recommended by a supervisor are reviewed by a higher level official and by at least one panel of management officials to ensure consistency and fairness across the pay pool. Rating officials, reviewers, and panel members apply standard, NSPS-wide performance indicators and benchmarks when they consider employees' performance assessments.
Employees are encouraged to provide written self-assessments about their performance accomplishments, which helps ensure panels have a full picture, and an employee who disagrees with his or her rating has several avenues of redress. In addition to the checks and balances inherent in the pay pool process, NSPS includes a crucial safeguard for fair, equitable, and performance-based ratings: the rating reconsideration process. While the draft report notes that some employees expressed a lack of confidence in the process, we would point out that 2,302 employees filed requests for reconsideration after the FY07 performance cycle, and 769 (or 33.41%) were decided in favor of the employee. In our view, this demonstrates the credibility and effectiveness of the rating reconsideration and pay pool process safeguards. We would also note that employees under NSPS continue to have access to the Equal Employment Opportunity (EEO) complaints process if they believe they are victims of illegal discrimination. However, we note that since the implementation of NSPS, the Department has not seen a demonstrable increase in formal EEO complaints.

While we have no objection to demographic and other analyses for pay pools, we do not believe integrating such analyses as part of the predecisional pay pool deliberation process is warranted; in fact, they may have detrimental effects on the credibility of the system. We agree that such analyses can be used to ensure that the process is fair and equitable and to identify and address possible barriers that may affect some groups, but believe such analysis should be done after the process is complete. Such analysis must not be used to manipulate results to achieve some type of parity among various groups. Post-decisional analysis of results is useful to identify barriers and corrective actions. If the information gleaned from demographic analysis demonstrates that the results were not fair or equitable, for whatever reason, this information could legitimately be employed to examine the process used to achieve those results, with a view to identifying barriers to equal employment opportunity, if any, and eliminating them in order to achieve a more fair and equitable outcome. And if an analysis of pay pool results uncovers illegal discrimination, management always has the ability and obligation to take corrective action.

Recommendation 2: The GAO recommends that the Secretary of Defense direct the National Security Personnel System Senior Executive to require commands to publish the final overall rating results.

DOD Response: Concur. As the draft report notes, the vast majority of organizations under NSPS are doing this. The Department will take steps to require all organizations under NSPS to share overall rating results with their employees.

Recommendation 3: The GAO recommends that the Secretary of Defense direct the National Security Personnel System Senior Executive to provide guidance to pay pools and supervisors that encourages them to rate employees appropriately, including using all categories of ratings as warranted by comparing employees' individual performance against the standards.

DOD Response: Partially concur. Ratings under NSPS rest firmly on the foundation of the written assessments, and must be made in relation to the standard performance indicators and benchmarks. Transition to this performance-based pay system requires that leaders demonstrate a firm commitment to rigorous, fact-based rating.
We do not agree with the generalization on page 28 that "it is questionable whether meaningful distinctions are being made in NSPS employees' performance ratings." The GAO report relies heavily on workforce opinions gleaned from its focus groups; we would recommend that GAO give more weight to NSPS' rigorous performance rating construct and criteria. NSPS is a pay banded system, with performance ratings and payouts that have the potential to advance employees rapidly at rates akin to GS promotions, not just a few percentage points in place of step increases and portions of the annual schedule adjustments. NSPS criteria for level 3 performance recognize employees who perform their responsibilities in a "valued" manner, effectively meeting their performance expectations. The level 3 "valued performer" rating covers situations that require the employee to solve problems appropriate for the pay band, not just handle routine situations. NSPS reserves higher level ratings for employees who have significantly exceeded performance expectations. Level 5 indicators and enhanced level benchmarks reflect a very high bar. The system construct is that a level 3 rating is normal, and that a higher rating will be based on unusually high performance or good performance under unusually demanding circumstances.

Half the DoD workforce comes from pass-fail systems where more than 99% of those covered received a "3" or "pass." Other large segments of the workforce come from multi-level systems, where more than 90% were rated at levels 4 and 5. With statutory emphasis on a pay for performance system with meaningful distinctions between the levels of performance, and a pay banded system with potentially significant pay consequences, our emphasis on the "valued performer" 3 level in pre-conversion training and during mock rating processes has been to recalibrate workforce expectations from previous systems in which nearly everyone got the highest available rating.

We do not accept the assumption underlying the GAO recommendation that pay pools and supervisors are not rating employees appropriately. On page 29, the draft report includes data on the rating distributions, which show both variance among organizations and an overall outcome of more than 40% rated at level 4 or 5. The report also shows 1.8% of the ratings were at the "1" and "2" level (approximately four times more than occurred under previous five-level systems). The draft does not present facts or observations from panels to indicate they suppressed justified ratings, only that they "were reluctant to award 'too many 4s and 5s.'" Our own after-action sessions with pay pool managers and panels indicate that they have more nuanced views than GAO suggests, based on their application of the rating criteria to assessments. We would also point out that NSPS performance appraisals are based on actual performance against standard benchmarks. Suggesting that all rating levels be used, despite the caveat that they be "warranted," could be interpreted as mandating rating distributions based on factors other than the rigorous evaluation of individual employees. Finally, we note that if employees believe their ratings are unfair or that a meaningful distinction has not been made in relation to the standard performance indicators and benchmarks, they have recourse to the rating reconsideration process. Of the 2,302 employees who filed reconsideration requests, 769 (or 33.41%) received a favorable decision.
We agree that we should continue to reinforce that performance evaluations must be based on actual performance against the standard criteria, and not a preconceived notion of a normal rating distribution. In continuing to train and inform those involved in the rating process, we will ensure these concepts, which currently exist in NSPS policies and training materials, are emphasized.

Recommendation 4: The GAO recommends that the Secretary of Defense direct the National Security Personnel System Senior Executive to develop and implement a specific action plan to address employee perceptions of NSPS ascertained from feedback avenues such as, but not limited to, DoD's survey and DoD's and GAO's employee focus groups. For example, the plan should include actions to mitigate employee concerns about the potential influence that employees' and supervisors' writing skills have on the panels' assessment of employee ratings and the lack of transparency and understanding of the pay pool panel process.

DOD Response: Partially concur. In a plan of action, the Department will address areas of weakness identified in our comprehensive, in-progress evaluation of NSPS as implemented in Spiral One. At all levels in DoD, we apply continuous learning to identify weaknesses in NSPS and its operation that may warrant attention and adjustment. Opinions in some areas, where necessary improvements are unequivocal, are acted on immediately without a formal improvement plan. Examples include the additional courseware and training opportunities such as iSuccess on performance objectives and assessments, a series of improvements to the automated performance appraisal tools, additional displays and data in the pay pool automated tool and a complementary automated tool to roll up and analyze results from multiple pay pools, and local pay pool changes in some of their panel representation and business rules. Other examples are recent modifications to NSPS implementing issuances and changes in the revised NSPS regulations. For opinions in other areas, where issues are equivocal or people may be reacting more to change or newness, we monitor and gather additional facts that will help us understand the issues and decide on appropriate courses of action, if any.

DoD will soon have the results of the 2008 Status of Forces Survey. These will reflect opinions after the second NSPS rating cycle for Spiral 1.1 employees and the first cycle for Spirals 1.2 and 1.3. We believe it is premature to draw actionable conclusions from the 2007 survey. (Note that the May 2006 survey opened three weeks before Spiral 1.1 conversions, and the November 2006 survey ran with only 25% of Spiral 1.2 employees having converted to NSPS at that time. The 2007 survey ran shortly after Spiral 1.3 conversions.)

The draft report notes on pages 35 and 36, "Results of our discussion groups are not generalizable to the entire population of DOD civilians" and "our previous work...have shown that organizational transformations...requires an adjustment period to gain employee acceptance and trust" and "major change management initiatives...can often take several years to be fully successful." Our approach to evaluation recognizes this reality, and therefore we are institutionalizing a continuous improvement strategy.

[End of section]

Appendix VI: GAO Contact and Staff Acknowledgments:

GAO Contact: Brenda S. Farrell, (202) 512-3604, or farrellb@gao.gov.
Acknowledgments: In addition to the contact named above, Ron Fecso, Chief Statistician; Marion Gatling (Assistant Director), Lori Atkinson, Margaret Braley, Renee Brown, Jennifer Harman, Ron La Due Lake, Janice Latimer, Jennifer C. Madison, Oscar Mardis, Belva Martin, Julia Matta, Luann Moy, Carl Ramirez, Terry Richardson, Carolyn Taylor, and Martha Tracy made key contributions to this report.

[End of section]

Related GAO Products:

The Department of Defense's Civilian Human Capital Strategic Plan Does Not Meet Most Statutory Requirements. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-08-439R]. Washington, D.C.: February 6, 2008.

Human Capital: DOD Needs Better Internal Controls and Visibility over Costs for Implementing Its National Security Personnel System. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-851]. Washington, D.C.: July 16, 2007.

Human Capital: Federal Workforce Challenges in the 21st Century. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-556T]. Washington, D.C.: March 6, 2007.

Post-Hearing Questions for the Record Related to the Department of Defense's National Security Personnel System (NSPS). [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-582R]. Washington, D.C.: March 24, 2006.

Human Capital: Observations on Final Regulations for DOD's National Security Personnel System. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-227T]. Washington, D.C.: November 17, 2005.

Human Capital: Designing and Managing Market-Based and More Performance-Oriented Pay Systems. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-1048T]. Washington, D.C.: September 27, 2005.

Human Capital: DOD's National Security Personnel System Faces Implementation Challenges. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-730]. Washington, D.C.: July 14, 2005.

Questions for the Record Related to the Department of Defense's National Security Personnel System. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-771R]. Washington, D.C.: June 14, 2005.

Questions for the Record Regarding the Department of Defense's National Security Personnel System. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-770R]. Washington, D.C.: May 31, 2005.

Post-Hearing Questions Related to the Department of Defense's National Security Personnel System. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-641R]. Washington, D.C.: April 29, 2005.

Human Capital: Agencies Need Leadership and the Supporting Infrastructure to Take Advantage of New Flexibilities. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-616T]. Washington, D.C.: April 21, 2005.

Human Capital: Selected Agencies' Statutory Authorities Could Offer Options in Developing a Framework for Governmentwide Reform. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-398R]. Washington, D.C.: April 21, 2005.

Human Capital: Preliminary Observations on Proposed Regulations for DOD's National Security Personnel System. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-559T]. Washington, D.C.: April 14, 2005.

Human Capital: Preliminary Observations on Proposed Department of Defense National Security Personnel System Regulations. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-517T]. Washington, D.C.: April 12, 2005.

Human Capital: Preliminary Observations on Proposed DOD National Security Personnel System Regulations. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-432T]. Washington, D.C.: March 15, 2005.

Human Capital: Principles, Criteria, and Processes for Governmentwide Federal Human Capital Reform. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-69SP]. Washington, D.C.: December 1, 2004.

Human Capital: Building on the Current Momentum to Transform the Federal Government. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-04-976T]. Washington, D.C.: July 20, 2004.

DOD Civilian Personnel: Comprehensive Strategic Workforce Plans Needed. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-04-753]. Washington, D.C.: June 30, 2004.

Human Capital: A Guide for Assessing Strategic Training and Development Efforts in the Federal Government. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-04-546G]. Washington, D.C.: March 2004.

Human Capital: Implementing Pay for Performance at Selected Personnel Demonstration Projects. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-04-83]. Washington, D.C.: January 23, 2004.

Human Capital: Key Principles for Effective Strategic Workforce Planning. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-04-39]. Washington, D.C.: December 11, 2003.

DOD Personnel: Documentation of the Army's Civilian Workforce-Planning Model Needed to Enhance Credibility. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-1046]. Washington, D.C.: August 22, 2003.

Posthearing Questions Related to Proposed DOD Human Capital Reform. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-965R]. Washington, D.C.: July 3, 2003.

Human Capital: A Guide for Assessing Strategic Training and Development Efforts in the Federal Government. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-893G]. Washington, D.C.: July 2003.

Defense Transformation: DOD's Proposed Civilian Personnel System and Governmentwide Human Capital Reform. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-741T]. Washington, D.C.: May 1, 2003.

Human Capital: DOD's Civilian Personnel Strategic Management and the Proposed National Security Personnel System. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-493T]. Washington, D.C.: May 12, 2003.

Human Capital: Building on DOD's Reform Efforts to Foster Governmentwide Improvements. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-851T]. Washington, D.C.: June 4, 2003.

High-Risk Series: Strategic Human Capital Management. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-120]. Washington, D.C.: January 2003.

Acquisition Workforce: Status of Agency Efforts to Address Future Needs. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-55]. Washington, D.C.: December 18, 2002.

Military Personnel: Oversight Process Needed to Help Maintain Momentum of DOD's Strategic Human Capital Planning. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-237]. Washington, D.C.: December 5, 2002.

Managing for Results: Building on the Momentum for Strategic Human Capital Reform. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-02-528T]. Washington, D.C.: March 18, 2002.

A Model of Strategic Human Capital Management. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-02-373SP]. Washington, D.C.: March 15, 2002.

Human Capital: Taking Steps to Meet Current and Emerging Human Capital Challenges. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-01-965T]. Washington, D.C.: July 17, 2001.

Human Capital: Major Human Capital Challenges at the Departments of Defense and State. [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-01-565T]. Washington, D.C.: March 29, 2001.

[End of section]

Footnotes:

[1] GAO, High-Risk Series: An Update, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-310] (Washington, D.C.: January 2007).
In 2001, we designated strategic human capital management as a high-risk area because of the federal government's long- standing lack of a consistent strategic approach to marshaling, managing, and maintaining the human capital needed to maximize government performance and ensure its accountability. GAO, High-Risk Series: An Update, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-01- 241] (Washington, D.C.: January 2001). [2] GAO, Defense Transformation: Preliminary Observations on DOD's Proposed Civilian Personnel Reforms, [hyperlink, http://www.gao.gov/cgi- bin/getrpt?GAO-03-717T] (Washington, D.C.: Apr. 29, 2003); Defense Transformation: DOD's Proposed Civilian Personnel Systems and Governmentwide Human Capital Reform, [hyperlink, http://www.gao.gov/cgi- bin/getrpt?GAO-03-741T] (Washington, D.C.: May 1, 2003); and Human Capital: Building on DOD's Reform Efforts to Foster Governmentwide Improvements, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03- 851T] (Washington, D.C.: June 4, 2003). See Related GAO Products at the end of this report for additional reports we have issued related to NSPS and performance management in the federal government. [3] GAO, Human Capital: DOD Needs Better Internal Controls and Visibility Over Costs for Implementing Its National Security Personnel System, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-851] (Washington, D.C.: July 16, 2007) and Human Capital: Observations on Final Regulations for DOD's National Security Personnel System, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-227T] (Washington, D.C.: Nov. 17, 2006). [4] GAO, Results Oriented Cultures: Creating a Clear Linkage between Individual Performance and Organizational Success, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-488] (Washington, D.C.: Mar.14, 2003). [5] GAO, Post-Hearing Questions for the Record Related to the Department of Defense's National Security Personnel System (NSPS), [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-582R] (Washington, D.C.: Mar. 24, 2006). [6] Pub. L. No. 110-181, § 1106(c) (2008). Specifically, section 1106(c)(1)(B) directs GAO to conduct reviews in calendar years 2008- 2010 to evaluate the extent to which the Department of Defense has effectively implemented accountability mechanisms, including those established in 5 U.S.C. § 9902(b)(7) and other internal safeguards. The accountability mechanisms specified in 5 U.S.C. § 9902(b)(7) include those that GAO previously identified as internal safeguards key to successful implementation of performance management systems. For example see GAO, Post-Hearing Questions for the Record Related to the Department of Defense's National Security Personnel System (NSPS), [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-582R] (Washington, D.C.: Mar. 24, 2006). GAO has emphasized the need for internal safeguards since DOD first proposed NSPS. For example see GAO, Posthearing Questions Related to Strategic Human Capital Management, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-779R] (Washington, D.C.: May 22, 2003). [7] For the purpose of this report, we define safeguards to include accountability mechanisms. [8] The Department of the Navy's NSPS policies encompass Marine Corps civilians. The Fourth Estate includes all organizational entities in DOD that are not in the military departments or the combatant commands, for example, the Office of the Secretary of Defense, the Joint Staff, the Office of the DOD Inspector General, the defense agencies, and DOD field activities. 
[9] SOFS is a series of Web-based surveys of the total force that allows DOD to (1) evaluate existing programs/policies, (2) establish baselines before implementing new programs/policies, and (3) monitor progress of programs/policies and their effects on the total force. Since 2003, the Defense Manpower Data Center has administered the SOFS for civilian personnel on a semiannual basis. SOFS for civilian employees includes questions about compensation, performance, and personnel processes. Regular administrations every 6 months occurred between October 2004 and November 2006, and annual administrations commenced in 2007. All surveys include outcome or "leading indicator" measures such as overall satisfaction, retention intention, and perceived readiness, as well as demographic items needed to classify individuals into various subpopulations. In 2004, DOD added questions to SOFS for civilian employees pertaining specifically to NSPS. These surveys also include items for the annual reporting requirement under the National Defense Authorization Act for Fiscal Year 2004. [10] These estimated percentages are based on a 95 percent confidence interval and margin of error within +/-2 percent as reported in DOD's Defense Manpower and Data Center's SOFS of civilian employees. [11] Pub. L. No. 108-136, § 1101 (2003) (codified at 5 U.S.C. §§ 9901- 9904). The National Defense Authorization Act for Fiscal Year 2008 amended 5 U.S.C. § 9902. Pub. L. No. 110-181, § 1106 (2008). [12] DOD has not applied NSPS to the Senior Executive Service because the latter's members are under a separate governmentwide pay-for- performance system. Additionally, DOD has not applied NSPS to the DOD intelligence components, which include the Defense Intelligence Agency, because these components are initiating implementation of a performance management system called the Defense Civilian Intelligence Personnel System (DCIPS). See 10 U.S.C. § 1601. [13] According to PEO officials, DOD originally planned to convert approximately 700,000 civilian employees to NSPS; however, recent legislative changes decreased the total number of eligible civilians to approximately 450,000. [14] Criteria to distinguish pay pools may include, but are not limited to, organization structure, employee job function, location, and organization mission. [15] Where determined appropriate due to the size of the pay pool population, the complexity of the mission, the need to prevent conflicts of interest, or other similar criteria, sub-pay pool panels may be organized in a structure subordinate to the pay pool panel. Sub- pay pool panels normally operate under the same requirements and guidelines provided to the pay pool panel to which they belong. [16] Pay pool panel members may not participate in payout deliberations or decisions that directly impact their own performance assessment or pay. [17] The senior organization official, usually a member of the Senior Executive Service or a General/Flag officer, serves as the Performance Review Authority (PRA). DOD components may provide additional guidance for the establishment of PRAs. The responsibilities of the PRA may be assigned to an individual management official or organizational unit or group. [18] GAO, Standards for Internal Control in the Federal Government, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/AIMD-00-21.3.1] (Washington, D.C.: November 1999). [19] Department of Defense Human Resources Management and Labor Relations Systems, 70 Fed. Reg. 66,116, 66,121 (Nov. 1, 2005). 
[20] In 2004, 36 of the unions voluntarily formed the United DOD Workers' Coalition, otherwise referred to as the "union coalition," which allowed the workers to have one voice in regards to NSPS. Each union elects representatives to speak on their behalf at collaborative coalition meetings. DOD has 45 unions, which are affiliated with 1,500 local bargaining units. [21] American Federation of Government Employees, AFL-CIO, et al. v. Rumsfeld, et al., 422 F. Supp. 2d 16, 35 (D.D.C. 2006), see also American Federation of Government Employees, AFL-CIO, et al. v. Gates, et al., 486 F. 3d 1316, 1327 (D.C. Cir. 2007). The National Defense Authorization Act for Fiscal Year 2008 repealed the statutory provisions at issue in both cases. [22] Separate focus groups were held for employees, civilian and military supervisors, and managers and practitioners from the human resource, legal, and equal employment opportunity communities. The focus group participants were asked to comment on the positive aspects of NSPS' human resource systems and propose any suggested changes to these systems. [23] With the change to objectives-based performance plans, DOD dropped the separate factor for "achieving results." [24] 5 U.S.C. § 9902(e)(4). [25] Components must certify that pay pool funds are used only for the compensation of civilian employees, as required by 5 U.S.C. § 9902(e)(6). [26] Percentages were determined using previous years data on General Schedule workforce within grade and quality step increases, and promotions between grades banded in NSPS. [27] GAO, Human Capital: DOD Needs Better Internal Controls and Visibility over Costs for Implementing its National Security Personnel System, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-851] (Washington, D.C.: July 16, 2007). [28] 73 Fed. Reg. 29,882 (May 22, 2008). The proposed regulations revise the NSPS regulations published in November 2005 in response to significant changes made to the NSPS law by the National Defense Authorization Act for Fiscal Year 2008. [29] GAO, Financial Regulators: Agencies Have Implemented Key Performance Management Practices, but Opportunities Exist for Improvement, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-678] (Washington, D.C.: June 18, 2007). [30] In response to comments on the original proposed NSPS regulations published in the Federal Register in 2005, PEO stated that it agrees with the concept of incorporating additional transparency in the performance management system, but not at the expense of employee confidentiality and privacy. Management offers alternatives to publishing individual ratings, to include publishing summary results and aggregate data such as average ratings and payouts within pay pools and job foci. 70 Fed. Reg. 66,116, 66,155 (Nov. 1, 2005). [31] Employees are assessed on job objectives. Scores are given to each job objective, and the average of these scores is the employee's rounded rating, or rating of record. [32] GAO, Human Capital: Preliminary Observations on the Administration's Draft Proposed "Working for America Act, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-142T] (Washington, D.C.: Oct. 5, 2005). [33] DOD's efforts to assess employee perceptions of NSPS have been captured within three surveys. Since NSPS implementation began, the SOFS for civilian employees was conducted in May 2006, November 2006, and May 2007. Results from a fourth survey conducted in 2008 were not available at the time of this report. [34] U.S. 
Office of Personnel Management, Working for America: An Assessment of the Implementation of the Department of Defense National Security Personnel System (Washington, D.C.: May 2007). [35] For the May 2007 survey, 102,000 civilians were surveyed and the weighted response rate was 59 percent. Estimated percentages are reported for collapsed positive and negative responses. That is, agree includes those that responded both agree and strongly agree, and disagree includes responses for both disagree and strongly disagree. The estimated percentages are reported with margins of error based on 95 percent confidence intervals. The margin of error is within +/-2 percent. [36] Specifically, the SOFS of civilian employees asked employees to respond to the statement, "Overall, what type of impact do you think NSPS will have on personnel practices in the DOD." In May 2006, responses were: 25 percent, negative; 35 percent, neither; and 40 percent, positive. In May 2007, responses were: 48 percent, negative; 30 percent, neither; and 23 percent positive. [37] Specifically, the SOFS of civilian employees asked employees to respond to the statement, "Compared to previous personnel systems, NSPS is worse, neither, or better." The question was not asked in May 2006; however, in November 2006 the responses were: 44 percent, worse; 41 percent, neither; and 15 percent, better. In May 2007, the responses were: 50 percent, worse; 35 percent, neither; and 15 percent, better. [38] GAO, Results Oriented Cultures: Implementation Steps to Assist Mergers and Organizational Transformations, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03-669] (Washington, D.C.: July 2, 2003); Office of Personnel Management, Working for America: Alternative Personnel Systems in Practice and a Guide to the Future (Washington, D.C.: October 2005). [39] General Schedule employees in DOD were under either a pass/fail or a five-level rating system prior to the implementation of NSPS. A pass/ fail system assesses employees' performance as either "passing" or "failing." [40] Pub. L. No. 108-136, § 1128 (2003) and 5 C.F.R. Part 250, Subpart C. [41] U.S. Department of Defense, NSPS Program Executive Office, National Security Personnel System (NSPS) Focus Group Report (Washington, D.C.: February 2005). [42] According to PEO, many focus group participants incorrectly referred to the annual General Schedule (GS) pay adjustment as a cost of living increase, or COLA. The GS pay adjustments are linked to changes in the Employment Cost Index (ECI), a measure of the overall rate of change in employers' compensation costs in the private and public sectors, excluding the federal government. The ECI does not measure the cost of consumer goods and services, and this adjustment is in no way tied to an inflation index. Rather, it is an attempt to keep federal pay in line with private sector pay. [43] U.S. Office of Personnel Management, Working for America: Alternative Personnel Systems in Practice and a Guide to the Future (Washington, D.C.: October 2005). [44] See National Defense Authorization Act for Fiscal Year 2008, Pub. L. No. 110-181, § 1106(c)(1)(B) (2008) and 5 U.S.C. § 9902(b)(7)(A) and (G). 
[45] For example, see GAO, Posthearing Questions Related to Strategic Human Capital Management, [hyperlink, http://www.gao.gov/cgi- bin/getrpt?GAO-03-779R] (Washington, D.C.: May 22, 2003); Defense Transformation: DOD's Proposed Civilian Personnel System and Governmentwide Human Capital Reform, [hyperlink, http://www.gao.gov/cgi- bin/getrpt?GAO-03-741T] (Washington, D.C.: May 1, 2003); Human Capital: Agencies Need Leadership and the Supporting Infrastructure to Take Advantage of New Flexibilities, [hyperlink, http://www.gao.gov/cgi- bin/getrpt?GAO-05-616T] (Washington, D.C.: April 21, 2005); and Post- Hearing Questions for the Record Related to the Department of Defense's National Security Personnel System (NSPS), [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-582R] (Washington, D.C.: Mar. 24, 2006). [46] The National Defense Authorization Act for Fiscal Year 2008, Pub. L. No. 110-181, § 1106(c)(1)(B) (2008), directs GAO to evaluate the extent to which the Department of Defense has effectively implemented accountability mechanisms, including those established in 5 U.S.C. § 9902(b)(7) and other internal safeguards. We identified some of these safeguards in GAO, Post-Hearing Questions for the Record Related to the Department of Defense's National Security Personnel System (NSPS), GAO- 06-582R (Washington, D.C.: Mar. 24, 2006). Moreover, GAO has emphasized the need for internal safeguards since DOD first proposed NSPS (for example, see [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-03- 779R].) [47] The Fourth Estate includes all organizational entities in DOD that are not in the military departments or the combatant commands, for example, the Office of the Secretary of Defense, the Joint Staff, the Office of the DOD Inspector General, the defense agencies, and DOD field activities. [48] We were unable to schedule a meeting with the Performance Review Authority official for Redstone Arsenal, Alabama. [49] GAO, Military Personnel: The DOD and Coast Guard Academies Have Taken Steps to Address Incidents of Sexual Harassment and Assault, but Greater Federal Oversight Is Needed, [hyperlink, http://www.gao.gov/cgi- bin/getrpt?GAO-08-296] (Washington, D.C.: Jan. 17, 2008); Military Personnel: Federal Management of Servicemember Employment Rights Can Be Further Improved, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06- 60] (Washington, D.C.: Oct. 19, 2005); and Military Personnel: DOD Needs to Improve the Transparency and Reassess the Reasonableness, Appropriateness, Affordability, and Sustainability of Its Military Compensation System, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO- 05-798] (Washington, D.C.: July 19, 2005). [50] National Defense Authorization Act for Fiscal Year 2004, Pub. L. No. 108-136, § 1128 (2003) and 5 C.F.R. Part 250, Subpart C. [51] In 2007, Human Resources Strategic Assessment Program decreased the number of respondents surveyed in the Status of Forces Survey of DOD Civilian Employees from approximately 150,000 to approximately 100,000 per year. This was accomplished by covering about the same content in a single survey administration that was previously covered by two surveys each year. [52] GAO did not monitor or audit the implementation of any of the DMDC survey processes. [53] In a few locations, the population of employees with 0 to 5 years of service was too small to create a sample of 20 employees and subsequently achieve the 8-12 participants necessary for each of our groups. 
As a result, we expanded the population to include employees with up to 8 years of service. [End of section] GAO's Mission: The Government Accountability Office, the audit, evaluation and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each weekday, GAO posts newly released reports, testimony, and correspondence on its Web site. To have GAO e-mail you a list of newly posted products every afternoon, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates." Order by Mail or Phone: The first copy of each printed report is free. Additional copies are $2 each. A check or money order should be made out to the Superintendent of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more copies mailed to a single address are discounted 25 percent. Orders should be sent to: U.S. Government Accountability Office: 441 G Street NW, Room LM: Washington, D.C. 20548: To order by Phone: Voice: (202) 512-6000: TDD: (202) 512-2537: Fax: (202) 512-6061: To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: E-mail: fraudnet@gao.gov: Automated answering system: (800) 424-5454 or (202) 512-7470: Congressional Relations: Ralph Dawn, Managing Director, dawnr@gao.gov: (202) 512-4400: U.S. Government Accountability Office: 441 G Street NW, Room 7125: Washington, D.C. 20548: Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov: (202) 512-4800: U.S. Government Accountability Office: 441 G Street NW, Room 7149: Washington, D.C. 20548:
