Medicare Physician Payment

Care Coordination Programs Used in Demonstration Show Promise, but Wider Use of Payment Approach May Be Limited: GAO ID: GAO-08-65, February 15, 2008

GAO-08-65, Medicare Physician Payment: Care Coordination Programs Used in Demonstration Show Promise, but Wider Use of Payment Approach May Be Limited This is the accessible text file for GAO report number GAO-08-65 entitled 'Medicare Physician Payment: Care Coordination Programs Used in Demonstration Show Promise, but Wider Use of Payment Approach May Be Limited' which was released on February 15, 2008. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States Government Accountability Office: GAO: Report to Congressional Committees: February 2008: Medicare Physician Payment: Care Coordination Programs Used in Demonstration Show Promise, but Wider Use of Payment Approach May Be Limited: GAO-08-65: GAO Highlights: Highlights of GAO-08-65, a report to congressional committees. Why GAO Did This Study: Congress mandated in 2000 that the Centers for Medicare & Medicaid Services (CMS) conduct the Physician Group Practice (PGP) Demonstration to test a hybrid payment methodology for physician groups that combines Medicare fee-for-service payments with new incentive payments. The 10 participants, with 200 or more physicians each, may earn annual bonus incentive payments by achieving cost savings and meeting quality targets set by CMS in the demonstration that began in April 2005. In July 2007, CMS reported that in the first performance year (PY1), 2 participants earned combined bonuses of approximately $7.4 million, and all 10 achieved most of the quality targets. Congress mandated that GAO evaluate the demonstration. GAO examined, for PY1, the programs used, whether the design was reasonable, and the potential challenges in broadening the payment approach used in the demonstration to other physician groups. To do so, GAO reviewed CMS documents, surveyed all 10 groups, and conducted interviews and site visits. What GAO Found: All 10 participating physician groups implemented care coordination programs to generate cost savings for patients with certain conditions, such as congestive heart failure, and initiated processes to better identify and manage diabetes patients in PY1. However, only 2 of the 10 participants earned a bonus payment in PY1 for achieving cost savings and meeting diabetes quality-of-care targets. The remaining 8 participants met most of the quality targets, but did not achieve the required level of cost savings to earn a bonus. 
Many of the participants' care coordination programs were not in place for all of PY1. CMS's design for the PGP Demonstration was generally a reasonable approach for rewarding participating physician groups for achieving cost-savings and quality-of-care targets, but created challenges. CMS's decision to use comparison groups, adjust for Medicare beneficiaries' health status, and include a quality component in the design helped ensure that bonus payments were attributable to demonstration-specific programs and that cost savings were not achieved at the expense of quality. However, the design created challenges. For example, neither bonuses nor performance feedback for PY1 were given to participants until after the third performance year had begun. CMS provides participants with quarterly claims data sets, but most participants report they do not have the resources to analyze these data sets and generate summary reports on their progress and areas for improvement.

Gap between Completion of First Performance Year and Expected Bonus Payments:

[See PDF for image]

This figure is an illustration of the gap between completion of the first performance year and expected bonus payments. The following data is depicted:

Performance Year 1: April 1, 2005 - April 1, 2006; Number of months: 12.

Performance Year 2: April 1, 2006 - April 1, 2007; Number of months: 12 (extended to 15, through July 1, 2007); April 2007: Cost-savings feedback for PY1; Mid-June 2007: Cost-savings and quality-of-care bonuses for PY1; July 2007: Bonus payments distributed for PY1.

Performance Year 3: April 1, 2007 - April 1, 2008; Number of months: 12.

Source: GAO analysis of CMS data.

Note: The 15-month period includes the typical 6-month period necessary for CMS to process a sufficient number of claims to meet its 98 percent complete claims threshold that it uses for analysis.

[End of figure]

The large relative size of the 10 participating physician groups (all had 200 or more physicians) compared with most U.S. physician practices (less than 1 percent had more than 150 physicians) gave the participants certain size-related advantages that may make broadening the payment approach used in the demonstration to other physician groups and non-group practices challenging. Their larger size provided the participants with three unique size-related advantages: institutional affiliations that allowed greater access to financial capital, access to and experience with using electronic health records systems, and prior experience with pay-for-performance programs.

What GAO Recommends:

CMS should provide participating physician groups with interim summary reports that estimate participants' progress in achieving cost-savings and quality-of-care targets. CMS agreed with the intent of GAO's recommendation. To view the full product, including the scope and methodology, click on [hyperlink: GAO-08-65]. For more information, contact Kathleen M. King at (202) 512-7114 or kingk@gao.gov.
[End of section] Contents: Letter: Results in Brief: Background: Participating Physician Groups Implemented Care Coordination Programs Designed to Achieve Cost Savings and Management Processes to Meet CMS- Set Diabetes Quality-of-Care Targets in PY1: CMS's PGP Demonstration Design Was Generally a Reasonable Approach, but Created Challenges: Participating Physician Groups Had Several Size-Related Advantages, Which May Pose Challenges in Broadening the Payment Approach Used in the Demonstration to More Participants: Conclusions: Recommendation for Executive Action: Agency Comments and Our Evaluation: Appendix I: Objectives, Scope, and Methodology: Appendix II: New and Expanded Programs Implemented by the 10 Participating Physician Groups for the PGP Demonstration: Appendix III: Reported PGP Demonstration-Related Start-up and Operating Costs for New and Expanded Programs: Appendix IV: Comments from the Centers for Medicare & Medicaid Services: Appendix V: GAO Contact and Staff Acknowledgments: Tables: Table 1: Description of the Participating Physician Groups in the Physician Group Practice Demonstration: Table 2: CMS's Demonstrations Related to Physician Pay-for-Performance: Table 3: Percentage of FTEs Devoted to Largest New and Expanded Care Coordination Programs Implemented by PGP Demonstration Participants, PY1: Table 4: Management Processes Developed or Enhanced by Participating Physician Groups to Meet Diabetes Quality-of-Care Targets, PY1: Table 5: Range and Average of Initial Start-up and PY1 Operating Expenditures Reported by Participating Physician Groups, by Program Type and Order of Average Amount Spent, 2005: Figures: Figure 1: Illustration of CMS Bonus Payment Methodology for PGP Demonstration, Performance Year 1: Figure 2: Medicare Spending Growth Rate of Participating Physician Groups Relative to Their Comparison Group and the 2 Percent Threshold in PY1: Figure 3: Percentage of New and Expanded Programs Implemented by Participating Physician Groups: Figure 4: Gap between Completion of First Performance Year and Performance Feedback and Bonus Payments: Figure 5: Relative Proportion of Physician Practices in the United States, by Practice Size, 2005: Abbreviations: ASC: ambulatory surgical center: BIPA: Medicare, Medicaid, and SCHIP Benefits Improvement and Protection Act of 2000: CAD: coronary artery disease: CAH: critical access hospital: CHF: congestive heart failure: CMS: Centers for Medicare & Medicaid Services: COPD: chronic obstructive pulmonary disease: E&M: evaluation and management: EHR: electronic health record: FFS: fee-for-service: FTE: full-time equivalent: HCC: hierarchical condition category: HEDIS®: health plan employer data and information set: IRMA: Integrated Resources for the Middlesex Area: IT: information technology: IVR: interactive voice response: MedPAC: Medicare Payment Advisory Commission: MGMA: Medical Group Management Association: NCQA: National Committee for Quality Assurance: PGP: physician group practice: PY1: performance year one: PY2: performance year two: PY3: performance year three: [End of section] United States Government Accountability Office: Washington, DC 20548: February 15, 2008: The Honorable Max Baucus: Chairman: The Honorable Charles E. Grassley: Ranking Member: Committee on Finance: United States Senate: The Honorable John D. Dingell: Chairman: The Honorable Joe Barton: Ranking Member: Committee on Energy and Commerce: House of Representatives: The Honorable Charles B. 
Rangel: Chairman: The Honorable Jim McCrery: Ranking Member: Committee on Ways and Means: House of Representatives: Medicare spending for physician services paid through Medicare's Part B fee-for-service (FFS) program grew rapidly from 2000 to 2006, increasing from $37 billion in 2000 to $58 billion in 2006, at an average annual growth rate of almost 8 percent, outpacing the national economy's annual average growth rate for that period of 5.2 percent.[Footnote 1] Growth in Medicare spending for physician services has heightened congressional concerns about how physicians are reimbursed under the Medicare FFS payment system as well as the long- term fiscal sustainability of the Medicare program. The Medicare Payment Advisory Commission (MedPAC) and other experts believe that the method for paying physicians under the current Medicare FFS payment system is contributing to the rapid growth in Medicare expenditures,[Footnote 2] because the FFS payment system generally does not encourage physicians to make efficient use of resources, encourage coordination of physician services with services paid under the Medicare Part A program,[Footnote 3] or encourage improvements in quality of care. Some private sector insurers that pay physicians on an FFS basis have recently implemented payment systems, referred to as pay- for-performance, in which a portion of a physician's or hospital's payment is based on meeting performance measures designed to improve the quality of care. Congress mandated in the Medicare, Medicaid, and SCHIP Benefits Improvement and Protection Act of 2000 (BIPA)[Footnote 4] that the Centers for Medicare & Medicaid Services (CMS), the agency that oversees the Medicare program, conduct demonstrations to test incentive-based alternative payment methods for physicians reimbursed under Medicare FFS. The Physician Group Practice (PGP) Demonstration was the first of several physician pay-for-performance demonstrations CMS has implemented. In line with BIPA's mandate and the ongoing concerns about growth in Medicare spending for physician services, CMS's PGP Demonstration aims to encourage the coordination of Part A and Part B services, promote efficiency through investment in administrative processes, and reward physicians for improving health outcomes.[Footnote 5] CMS's PGP Demonstration tests a hybrid payment methodology that combines Medicare FFS payments with a bonus payment that participating physician groups can earn by demonstrating savings through better management of patient care and services and meeting quality-of-care performance targets. Performance for the demonstration is based on the success of the entire physician group practice, and not on the individual performance of any one physician. CMS stated that it chose to focus on large physician group practices because these organizations influence a significant amount of Medicare expenditures and have sufficient Medicare beneficiary volume to calculate Medicare savings or losses under the demonstration. CMS initially designed the PGP Demonstration for a 3-year period and recently, December 2007, continued the demonstration for a fourth performance year. 
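As a quick arithmetic check on the Part B spending figures cited at the opening of this letter, the short sketch below computes the implied average annual growth rate. It treats "average annual growth rate" as a compound rate, which is an assumption about how the figure was derived, and the variable names are illustrative only.

```python
# Rough check of the Part B physician spending growth cited above:
# spending grew from $37 billion in 2000 to $58 billion in 2006.
spending_2000 = 37.0  # billions of dollars
spending_2006 = 58.0  # billions of dollars
years = 2006 - 2000

# Compound average annual growth rate over the period.
growth_rate = (spending_2006 / spending_2000) ** (1 / years) - 1
print(f"Average annual growth rate: {growth_rate:.1%}")  # about 7.8%, i.e., "almost 8 percent"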
CMS solicited participation from physician practices across the United States, and selected 10 physician group practices with at least 200 or more physicians that were multispecialty physician groups, which had the capacity to provide a variety of types of clinical services.[Footnote 6] The first year of the demonstration, referred to as performance year one (PY1), ran from April 1, 2005, through March 31, 2006, and the fourth performance year will end on March 31, 2009.[Footnote 7] In July 2007, CMS reported the results of PY1. As a result of their efforts, 2 of the 10 participating physician groups participating in the PGP Demonstration earned a performance bonus payment. Specifically, the Marshfield Clinic and the University of Michigan Faculty Group Practice earned bonuses of approximately $4.6 million and $2.8 million, respectively. As a part of the overall design of the demonstration, to obtain bonus payments, each participant had to generate cost savings to Medicare of more than 2 percent relative to a unique comparison group of Medicare beneficiaries intended to be similar to Medicare beneficiaries treated by that participant.[Footnote 8] Only participants that earned this cost-savings portion of the bonus also were eligible to increase their bonus payments if they met certain disease-specific, quality-of-care performance targets. CMS selected 10 diabetes management measures as the quality-of-care performance targets, and in PY1, all 10 participants achieved 7 or more of CMS's 10 quality-of-care targets. Performance year two (PY2) and performance year three (PY3) used the same bonus payment methodology, with two exceptions: (1) other disease-specific quality measures are phased in over time and (2) the share of any bonus earned by meeting quality targets increases each year. While the specific quality-of-care measures for each year were selected by CMS, each participating physician group had the latitude to determine what programs--such as patient care coordination and administrative processes--it would implement to both generate cost savings and meet the quality measures. No one savings or quality management program was required to be implemented uniformly. BIPA required us to report on the progress of the PGP Demonstration. As discussed with the committees of jurisdiction, in this report we examined for the first performance year of the demonstration: (1) what actions the participating physician groups took to achieve cost savings and meet the CMS-set, diabetes quality-of-care targets, (2) the extent to which the PGP Demonstration design was a reasonable approach to rewarding participating physician groups for cost-savings and quality performance, and (3) potential challenges involved in broadening the payment approach used in the demonstration from the 10 large participating physician groups to other physician groups and nongroup practices. To determine the programs used by the participating physician groups for the PGP Demonstration, we analyzed data we collected through written questionnaires and interviews. We supplemented this information by conducting site visits to 5 of the 10 locations.[Footnote 9] We included in our analysis new programs or expansions of existing programs created in response to the demonstration. 
To determine the extent to which the PGP Demonstration design was reasonable, we analyzed documents on the overall design and bonus payment methodology we obtained from CMS, analyzed data collected through the questionnaire, and used interviews we conducted with representatives of the participating physician groups and CMS. We also reviewed and analyzed CMS documents on the design of the PGP Demonstration. To determine the potential challenges involved in broadening the payment approach used in the demonstration to other physician practices, we compared selected characteristics of the 10 participating physician groups with all physician practices in the United States, and we also relied on interviews with the participating physician group practices, CMS officials, and relevant industry experts. In doing our work, we tested the reliability of the data and determined they were adequate for our purposes. We conducted this performance audit from May 2006 through December 2007 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. For additional details of our scope and methodology, see appendix I. Results in Brief: To achieve cost savings in the PGP demonstration, all 10 participating physician groups implemented care coordination programs that focused on specific patient populations, such as those with congestive heart failure (CHF), that the participants believed were the most likely to generate cost savings. However, only 2 of the 10 participants earned a bonus payment in PY1 for achieving cost savings and meeting quality-of- care targets. The remaining 8 participants met 7 or more of CMS's 10 quality-of-care targets in PY1, but were not eligible for bonus payments because they did not achieve the required level of cost savings. To meet the quality-of-care targets set by CMS for PY1 on 10 diabetes measures, such as whether a beneficiary's blood pressure was at the recommended level, participants generally initiated processes to better identify patients with diabetes, improve documentation of diabetes-management-related exams and tests completed, and provide feedback to physicians on achievement of the targets. All 10 participating physician groups reported that their care coordination programs were making progress in both achieving cost savings and providing broader benefits to their programs and communities because, for example, demonstration program initiatives were typically implemented for all patients in a physician's care regardless of whether they were part of the demonstration. Despite early positive indicators of cost savings, the full impact of programs implemented for the PGP Demonstration, particularly in care coordination, is largely unknown for a variety of reasons, including that many programs were not in place for all 12 months of the first performance year. CMS's design for the PGP Demonstration was generally a reasonable approach for rewarding physician groups in the demonstration for cost savings and quality performance, but created challenges. 
CMS's decision to use comparison groups and adjust for differences in health status among Medicare beneficiaries helped ensure that bonus payments reflected programs and incentives attributable to the demonstration. In addition, having a quality component of the design helped ensure that participating physician groups did not achieve cost savings at the expense of quality. However, the demonstration design created a particular challenge for CMS in providing timely performance feedback and bonus payments to the participating physician groups, which, if received, may have enabled them to improve their programs. Specifically, neither bonuses nor performance feedback for PY1 were given to participants until after PY3 began. While CMS has begun to provide each participant with a quarterly patient claims data set related to beneficiaries it served, most participants reported they did not have the necessary resources to analyze and use these data sets to determine their progress and areas for potential improvements. Participants also raised additional concerns about the demonstration's design, including the use of a uniform 2 percent savings threshold for all participants, which may have made earning a bonus more challenging for particular providers. CMS officials indicated that they planned to examine these issues as part of their evaluation at the conclusion of the PGP Demonstration. The large relative size of the 10 participating physician groups compared with most U.S. physician practices gave the participants certain size-related advantages that may make broadening the payment approach used in the demonstration to other physician groups and nongroup practices challenging. Whereas all physician groups participating in the demonstration had 200 or more physicians, less than 1 percent of all physician practices nationwide had more than 150 physicians. In addition to the number of physicians, participants were larger in terms of their annual medical revenues and staff size. Their larger size provided the participating physician groups with three unique size-related advantages: institutional affiliations that allowed greater access to financial capital, access to and experience with using electronic health records (EHR) systems, and experience prior to the PGP Demonstration with pay-for-performance programs. For example, 8 of the 10 participants had an EHR, which was essential to participants' ability to gather data and track progress in meeting quality-of-care targets; only about 24 percent of physician practices in the U.S. had a full or partial EHR in 2005. Most participating groups believed these three advantages were critical to achieving cost savings and improving quality. We recommend that the Administrator of CMS provide participating physician groups with interim summary reports that estimate their progress in achieving cost-savings and quality-of-care targets. In commenting on a draft of this report, CMS agreed with the intent of our recommendation. CMS stated that it was developing a new quarterly report and refined data set to aid physician groups in monitoring their performance, coordinating care, and improving quality. Background: Physician groups with at least 200 physicians were eligible to apply for the PGP Demonstration and 10 were selected by CMS. (See table 1.) 
CMS's technical review panel evaluated each applicant based on its organizational structure, operational feasibility, geographic location, and demonstration implementation strategy.[Footnote 10] Collectively, the 10 participating physician groups are all multispecialty practices comprising more than 6,000 physicians who provide care for more than 220,000 Medicare FFS beneficiaries. While all the participants have at least 200 physicians, group practice size varies widely, ranging from 232 to 1,291 physicians. Except for the Marshfield Clinic, all participants identified themselves as integrated delivery systems that include, in addition to their group practice, other health care entities such as hospitals, surgical centers, or laboratories.[Footnote 11] Nearly all of the participants have nonprofit tax status, except for the Everett Clinic and the Integrated Resources for the Middlesex Area (IRMA), which are for profit. Overall, a majority of the 10 participants are located in small cities and serve either predominantly rural or suburban areas. These participants provide care over wide geographic areas by using satellite physician group office locations, ranging from 10 to 65 physician group office locations. Table 1: Description of the Participating Physician Groups in the Physician Group Practice Demonstration: Participating physician group: Billings Clinic; Description of participants' affiliations and the geographic area served: A physician group practice that is part of an integrated delivery system, which includes a general hospital, a skilled nursing facility, and other facilities. The system also operates a private health insurance plan. Its geographic service area includes the city of Billings, Montana, south-central Montana, and northwestern Wyoming, a predominantly rural area; Number of physician group office locations: 10; Number of physicians in the physician group[A]: 232. Participating physician group: Dartmouth-Hitchcock Clinic; Description of participants' affiliations and the geographic area served: A faculty/community group practice that is a part of an integrated delivery system, which includes an academic medical center, an ambulatory surgical center (ASC), and a laboratory. Its geographic service area includes New Hampshire and eastern Vermont, a predominantly rural area with small cities; Number of physician group office locations: 35; Number of physicians in the physician group[A]: 907. Participating physician group: The Everett Clinic; Description of participants' affiliations and the geographic area served: A physician group practice that is a part of an integrated delivery system, which includes two ASCs, a radiology center, a laboratory, but no general hospital. Its geographic service area includes the city of Everett, Washington, and west-central Washington, a predominantly suburban area; Number of physician group office locations: 10; Number of physicians in the physician group[A]: 250. Participating physician group: Geisinger Health System; Description of participants' affiliations and the geographic area served: A physician group practice that is part of an integrated delivery system, which includes three general hospitals, a home health agency, two specialty hospitals, and other facilities. The system also operates a private health insurance plan. 
Its geographic service area includes central-northeast Pennsylvania, a predominantly rural area with small cities; Number of physician group office locations: 55; Number of physicians in the physician group[A]: 833. Participating physician group: Integrated Resources for the Middlesex Area; Description of participants' affiliations and the geographic area served: An independent practice association that provides management services to a network of physicians and physician groups that are part of an integrated delivery system. The system includes a general hospital, a home health agency, an ASC, and other facilities. Its geographic service area includes south-central Connecticut, a predominantly suburban area with small cities; Number of physician group office locations: 57; Number of physicians in the physician group[A]: 293. Participating physician group: Marshfield Clinic; Description of participants' affiliations and the geographic area served: A physician group practice without affiliations with other facilities, it operates a private health insurance plan. Its geographic service area includes the city of Marshfield, Wisconsin, and north- central Wisconsin, a predominantly rural area; Number of physician group office locations: 41; Number of physicians in the physician group[A]: 1,039. Participating physician group: Novant Medical Group; Description of participants' affiliations and the geographic area served: A physician group practice that is part of an integrated delivery system, which includes two general hospitals, two skilled nursing facilities, and a laboratory. Its geographic service area includes the small urban area of Winston-Salem, North Carolina, and other parts of northwestern North Carolina, a predominantly rural area; Number of physician group office locations: 44; Number of physicians in the physician group[A]: 250. Participating physician group: Park Nicollet Health Services; Description of participants' affiliations and the geographic area served: A physician group practice that is part of an integrated delivery system, which includes two general hospitals and a home health agency.[B] Its geographic service area includes suburban areas adjacent to Minneapolis-St. Paul, Minnesota, and south-central Minnesota, a predominantly suburban area; Number of physician group office locations: 29; Number of physicians in the physician group[A]: 648. Participating physician group: St. John's Health System; Description of participants' affiliations and the geographic area served: A physician group practice that is part of an integrated delivery system, which includes six general hospitals, six home health agencies, an ASC, and other facilities. The system also operates a private health insurance plan. Its geographic service area includes the city of Springfield, Missouri, as well as south-central Missouri and northwest Arkansas, predominantly rural areas; Number of physician group office locations: 65; Number of physicians in the physician group[A]: 522. Participating physician group: University of Michigan Faculty Group Practice; Description of participants' affiliations and the geographic area served: A faculty group practice that is a part of an integrated delivery system, which includes an academic medical center, two other general hospitals, a home health agency, two ASCs, and other facilities. The system also operates a private health insurance plan. 
Its geographic service area includes the city of Ann Arbor, Michigan, and southeastern Michigan, a predominantly suburban area with small cities; Number of physician group office locations: 28; Number of physicians in the physician group[A]: 1,291. Source: GAO. [A] The number of physicians includes physicians and physician extenders--those who can bill Medicare as physicians, such as physician assistants. [B] Park Nicollet Health Services has full ownership of one general hospital and partial ownership of a second. [End of table] Demonstration Design, Including Bonus Payment Methodology: Under the PGP Demonstration's design, participating physician groups are eligible to earn annual cost-savings bonuses for generating Medicare program savings. Participants that received cost-savings bonuses were also eligible to receive additional bonuses for meeting certain quality targets. Both the cost-savings and quality bonuses are in addition to payments physicians receive under Medicare FFS. There are three main steps in CMS's bonus payment methodology to determine which participants are awarded bonus payments and the amount of these bonuses: (1) determination of eligibility for performance bonus payments, (2) determination of the size of the bonus pool, and (3) determination of actual bonus payments earned. (See fig. 1.) Figure 1: Illustration of CMS Bonus Payment Methodology for PGP Demonstration, Performance Year 1: [See PDF for image] This figure includes both data describing the bonus payment methodology as well as graphical illustrations of the same. The following data is depicted: Step 1, Determination of eligibility for bonus payment: CMS determines whether a participating physician group is eligible for a demonstration bonus based on whether the participant generated annual Medicare savings greater than 2% of its target expenditures. -Medicare program savings generated by the participating physician group in excess of 2% of its target expenditure amount. Step 2, Determination of size of bonus pool: 80% is available as a bonus pool to the participating physician group; 20% is considered Medicare program savings. Step 3, Determination of actual bonus payment earned: Of the bonus pool of savings available: 70% is awarded as a cost-savings bonus; Up to 30% is awarded (for those who received the cost-savings bonus) to those who also meet diabetes quality targets for PY1. Any unearned portion is considered Medicare program savings. Total actual bonus payment = cost- savings bonus (70% of bonus pool) received + quality bonus received (up to 30% of bonus pool). Source: GAO analysis of CMS data. Notes: CMS retained a share (25 percent) of the actual bonus earned by participating physician groups in Performance Year 1 to protect against any potential losses in future performance years. Spending increases in excess of 2 percent, relative to a comparison group of beneficiaries intended to have similar characteristics, of the spending target are carried forward as losses and deducted from any bonus earned in future years. The total actual bonus earned cannot be greater than 5 percent of the participating physician group's original spending target. If it is higher, it will be reduced to the 5 percent level. 
[End of figure] For the first step of the bonus payment methodology, to determine eligibility for receiving bonus payments, participating physician groups had to generate savings greater than 2 percent of their target expenditure amounts, relative to a comparison group of beneficiaries intended to have similar characteristics. CMS stated that the purpose of the 2 percent savings threshold was to further account for the possibility of random fluctuations in expenditures rather than actual savings. CMS also stated that it used separate comparison groups for each of the participants to distinguish the effect of the demonstration's incentive payments from trends among Medicare beneficiaries unrelated to the demonstration. Operationally, Medicare beneficiaries were assigned to the comparison groups or to the participating physician groups retrospectively at the conclusion of each performance year, using Medicare claims data sent to CMS by providers following the delivery of care. As a part of the process of selecting beneficiaries for each comparison group that were similar to those served by the participating physician group they were being compared with, CMS ensured that beneficiaries (1) resided in the same geographic service areas as the beneficiaries assigned to the corresponding physician group;[Footnote 12] (2) had received at least one office or outpatient service, referred to as an evaluation and management (E&M) service, in that performance year;[Footnote 13] and (3) had not received any E&M services from the corresponding physician group that year or had been assigned to the participant's group of beneficiaries in any previous performance year. For step two, determining the size of the bonus pools, participating physician groups that generated savings beyond the 2 percent threshold were eligible to receive up to 80 percent of those savings as potential bonuses. The remaining 20 percent, and all other savings not awarded to the participants,[Footnote 14] were retained by the Medicare program. In the third step, the determination of actual bonus amounts earned, eligible participating physician groups could receive up to the full amount available in their bonus pools as cost-savings bonuses and quality-of-care bonuses. Specifically, for PY1, 70 percent of the bonus pool was awarded as a cost-savings bonus to participants who met the 2 percent cost-savings threshold, and up to 30 percent of the bonus pool could have been awarded as a quality-of-care bonus, for those who met the cost-savings threshold. The quality-of-care bonus was awarded to participants that met or exceeded various quality-of-care targets within an area of clinical focus selected by CMS, in collaboration with other organizations and the participating physician groups.[Footnote 15] In PY1, CMS focused on diabetes management, and required participants to meet targets on a set of 10 diabetes measures, including whether a beneficiary received an eye exam or foot exam.[Footnote 16] To meet the quality-of-care target for each of the diabetes measures, a participant had to either improve its performance by a certain amount relative to its baseline performance or meet a national set of performance measures, referred to as HEDIS® measures, established by the National Committee for Quality Assurance (NCQA).[Footnote 17] Participants could also receive a prorated share of the quality-of-care bonus, based on success meeting some, but not all, of the quality-of-care targets. 
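To make the three-step bonus methodology concrete, the sketch below restates the PY1 splits described above in Python. It is a minimal illustration rather than CMS's actual computation: the function and parameter names are invented for this example, and the quality bonus is prorated here as a simple fraction of targets met, whereas CMS scored the 10 diabetes measures individually. The 5 percent cap and 25 percent withhold follow the notes to figure 1.

```python
def py1_bonus(actual_spending, target_spending, quality_targets_met,
              quality_targets_total=10, savings_threshold=0.02,
              pool_share=0.80, cost_share=0.70, quality_share=0.30,
              bonus_cap=0.05, withhold=0.25):
    """Illustrative sketch of the PY1 bonus steps; not CMS's actual model."""
    savings = target_spending - actual_spending

    # Step 1: a group is eligible only if savings exceed 2 percent of its spending target.
    if savings <= savings_threshold * target_spending:
        return {"eligible": False, "bonus_earned": 0.0, "payment_after_withhold": 0.0}

    # Only savings in excess of the 2 percent threshold feed the bonus pool.
    excess_savings = savings - savings_threshold * target_spending

    # Step 2: 80 percent of the excess savings forms the bonus pool;
    # the remaining 20 percent is retained as Medicare program savings.
    bonus_pool = pool_share * excess_savings

    # Step 3: 70 percent of the pool is paid as the cost-savings bonus; up to
    # 30 percent is paid as a quality bonus, shown here as a simple proration
    # by the share of quality targets met (an assumption for illustration).
    cost_bonus = cost_share * bonus_pool
    quality_bonus = quality_share * bonus_pool * (quality_targets_met / quality_targets_total)

    # The total bonus cannot exceed 5 percent of the original spending target.
    bonus_earned = min(cost_bonus + quality_bonus, bonus_cap * target_spending)

    # CMS withheld 25 percent of the PY1 bonus against potential future losses.
    return {"eligible": True,
            "bonus_earned": bonus_earned,
            "payment_after_withhold": (1 - withhold) * bonus_earned}
```

Applying these splits to the PY1 results reported in the next section: the Marshfield Clinic's roughly $6 million in savings above the 2 percent threshold implies about $1.2 million retained by Medicare, a bonus pool of about $4.8 million, a cost-savings bonus of about $3.4 million, and a maximum quality bonus of about $1.4 million, of which the clinic earned about $1.2 million for meeting 9 of the 10 targets; these figures are consistent with the amounts CMS reported.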
While the bonus payment methodology will remain the same throughout the demonstration, CMS added other quality-of-care measures and increased the relative significance of the quality-of-care measures in PY2 and PY3. In PY2, quality-of-care measures pertaining to CHF and coronary artery disease (CAD) were added to the existing diabetes measures. In PY3, quality-of-care measures pertaining to the management of hypertension and screening for breast and colorectal cancer were added to the existing diabetes, CHF, and CAD measures. The proportion of the bonus pool dedicated to meeting the quality-of-care targets--30 percent in PY1--also increased in each performance year. For PY2, the potential quality-of-care bonus increased to 40 percent of the potential bonus pool, and the proportion of the bonus pool that will be paid as a cost-savings bonus decreased to 60 percent. For PY3, the cost-savings and quality-of-care bonuses each will constitute 50 percent of the total bonus paid.

Results from Performance Year 1:

In July 2007, CMS reported that in PY1, 2 of the 10 participating physician groups earned bonuses for achieving cost-savings and quality-of-care targets, while all 10 participants achieved 7 or more of the 10 quality-of-care targets. The Marshfield Clinic and the University of Michigan Faculty Group Practice received performance bonus payments of approximately $4.6 million and $2.8 million, respectively, in PY1. The Marshfield Clinic generated approximately $6 million in Medicare savings in PY1 above the 2 percent threshold established by CMS. Of this $6 million in savings, the Medicare program retained approximately $1.2 million, and the Marshfield Clinic earned $3.4 million for the cost-savings component of the bonus and $1.2 million for meeting 9 of the 10 quality-of-care targets.[Footnote 18] The University of Michigan Faculty Group Practice generated approximately $3.5 million in savings in PY1 above the 2 percent threshold. Medicare retained approximately $700,000, and the University of Michigan Faculty Group Practice earned nearly $2 million for the cost-savings component of the bonus and just over $800,000 for meeting 9 of the 10 quality-of-care targets.[Footnote 19]

Of the remaining eight participating physician groups that did not earn cost-savings bonuses in PY1, all performed well in meeting the quality-of-care targets. Specifically, all eight of these participants achieved 7 or more of the 10 quality-of-care targets, with two participants meeting all 10 quality-of-care targets and two others achieving 9 of the targets. In addition, six of the participants came close to achieving the 2 percent threshold for the cost-savings component of the performance bonus payment in PY1. These six groups reduced their Medicare spending growth rates compared to their comparison groups, but not beyond the 2 percent threshold.[Footnote 20] (See fig. 2.)

Figure 2: Medicare Spending Growth Rate of Participating Physician Groups Relative to Their Comparison Group and the 2 Percent Threshold in PY1:

[See PDF for image]

This figure is a bar graph depicting the Medicare spending growth rate of participating physician groups relative to their comparison group and the 2 percent threshold in PY1. The vertical axis represents the difference in Medicare spending growth rate, from -5 to +5, with -2 marking the 2 percent threshold. The horizontal axis represents physician group practices 1 through 10.
The following data is depicted:

Physician group practice: 1; Difference in Medicare spending growth rate: -4.3.
Physician group practice: 2; Difference in Medicare spending growth rate: -3.7.
Physician group practice: 3; Difference in Medicare spending growth rate: -2.0.
Physician group practice: 4; Difference in Medicare spending growth rate: -1.2.
Physician group practice: 5; Difference in Medicare spending growth rate: -0.7.
Physician group practice: 6; Difference in Medicare spending growth rate: -0.7.
Physician group practice: 7; Difference in Medicare spending growth rate: -0.5.
Physician group practice: 8; Difference in Medicare spending growth rate: -0.5.
Physician group practice: 9; Difference in Medicare spending growth rate: 2.2.
Physician group practice: 10; Difference in Medicare spending growth rate: 3.6.

Source: CMS and RTI International.

Note: Participating group 3 reduced its Medicare spending by almost 2 percent, but did not exceed the threshold.

[End of figure]

PGP Demonstration, Early Test of Public Sector Pay-for-Performance Models:

While the number of pay-for-performance programs--programs in which a portion of a provider's[Footnote 21] payment is based on the provider's performance against defined measures--has increased in recent years, this growth has occurred largely in the private sector, among commercial health plans. MedVantage reported that 107 pay-for-performance programs were in place as of November 2005, up from 84 the year before.[Footnote 22] Of these 107 pay-for-performance programs, 21 were public sector programs, of which 10 were Medicare programs. Currently, CMS has 5 programs, including the PGP Demonstration, that are demonstrations testing alternative physician payment methods. (See table 2.) Among these 5 physician pay-for-performance demonstrations, 4, including the PGP Demonstration, test physician pay-for-performance methods by offering incentives to physicians for meeting clinical performance standards, while 1 focuses on aligning financial incentives between hospitals and physicians. The PGP Demonstration was the first of CMS's Medicare demonstrations to test physician pay-for-performance. Participants in CMS's Medicare Health Care Quality Demonstration, projected to begin in 2008,[Footnote 23] may elect to use the overall design and bonus payment methodology from the PGP Demonstration. Among CMS's other pay-for-performance demonstrations that are not physician related is the Premier Hospital Quality Incentive Demonstration, a hospital-specific pay-for-performance demonstration for more than 260 hospitals in the Premier Inc. system. Under this demonstration, CMS provides bonus payments for hospitals with the highest levels of performance in five clinical conditions, including acute myocardial infarction. A recent study examining this demonstration concluded that among hospitals receiving performance bonuses, patients did not have a significant improvement in quality of care or outcomes for acute myocardial infarction.[Footnote 24]

Table 2: CMS's Demonstrations Related to Physician Pay-for-Performance:

CMS demonstration project: Physician Group Practice Demonstration; Targeted organizations: Physician groups with 200 or more physicians; Statutory authority: BIPA,[A] section 412; Description: Participating physician groups are rewarded for improving the quality and efficiency of health care services delivered to Medicare fee-for-service beneficiaries through a methodology that shares savings with the Medicare program.
The demonstration seeks to encourage coordination of inpatient and outpatient services, promote efficiency through investment in administrative structures and process, and reward physicians for improving health outcomes; Timing of demonstration: 3 years, CMS recently continued the demonstration for an additional year. CMS demonstration project: Medicare Health Care Quality Demonstration; Targeted organizations: Physician groups and integrated delivery systems; Statutory authority: MMA,[B] section 646; Description: Participating physician groups will test the effectiveness of different payment methodologies to improve quality and reduce costs. Participants will be rewarded using their choice of (1) a similar design and bonus payment methodology to that used in the PGP Demonstration or (2) a different payment methodology they elect; Timing of demonstration: 5 years, application period closed, projected to begin in 2008. CMS demonstration project: Care Management for High Cost Beneficiaries Demonstration; Targeted organizations: Physician groups, hospitals, and integrated delivery systems; Statutory authority: Section 402(a), Social Security Amendments of 1967[C]; Description: Six organizations will test the ability of direct-care provider models to coordinate care for high-cost, high-risk beneficiaries by providing clinical support beyond traditional settings to manage their conditions; Timing of demonstration: 3 years, began October 2005. CMS demonstration project: Medicare Care Management Performance Demonstration; Targeted organizations: Solo and small to medium-sized physician practices in California, Arkansas, Massachusetts, and Utah; Statutory authority: MMA,[B] section 649; Description: Participants will be rewarded for meeting clinical performance standards for (1) treating diabetes, congestive heart failure, and coronary artery disease; (2) providing preventive services provided to high-risk, chronically ill patients; and (3) implementing electronic health records systems; Timing of demonstration: 3 years, began July 1, 2007. CMS demonstration project: Medicare Hospital Gainsharing Demonstration; Targeted organizations: Hospitals; Statutory authority: DRA,[D] section 5007; Description: Participating hospitals will be allowed to provide incentive payments to physicians to reward improvements in quality of care and increased financial efficiency; Timing of demonstration: 3 years, expected to begin January 1, 2007 but start currently delayed. Source: CMS. [A] Medicare, Medicaid, and SCHIP Benefits Improvement and Protection Act of 2000, Pub. L. No. 106-554, App. F, § 412, 114 Stat. at 2763A- 509. [B] Medicare Prescription Drug, Improvement, and Modernization Act of 2003, Pub. L. No. 108-173, 117 Stat. 2066. [C] Pub. L. No. 90-248, 81 Stat. 821, 930-31 (1968), as amended by Pub. L. No. 92-603, § 222(b)(2), 86 Stat. at 1393. [D] Deficit Reduction Act of 2005, Pub. L. No. 109-171, 120 Stat. 4 (2006). [End of table] Participating Physician Groups Implemented Care Coordination Programs Designed to Achieve Cost Savings and Management Processes to Meet CMS- Set Diabetes Quality-of-Care Targets in PY1: The participating physician groups implemented care coordination programs to achieve cost savings and improved their management processes to meet quality improvement targets CMS set for particular diabetes measures in PY1. 
More specifically, management process improvements included enhancing information technology (IT) systems, incorporating more team-based approaches, and improving administrative processes. Despite early positive indicators in cost savings, the full impact of programs implemented for the PGP Demonstration, particularly in care coordination, is largely unknown because many programs were not in place for all 12 months of the first performance year. Participating Physician Groups Implemented 47 Programs for the Demonstration, Largely in Care Coordination: The participating physician groups implemented 47 programs, which were either new or expansions of existing programs, to achieve cost savings and meet the CMS-set diabetes quality-of-care targets, with each participant implementing from 2 to 9 programs.[Footnote 25] (See app. II for a complete list of new and expanded programs implemented for the PGP Demonstration.) More specifically, participants focused nearly three-quarters of their new and expanded programs on care coordination- -programs that manage the care of a small number of chronically ill and frail elderly patients who account for a disproportionately large share of overall costs. (See fig. 3.) The remaining one-quarter of programs focused on patient education, medication-related issues, improving administrative processes, and other initiatives. Figure 3: Percentage of New and Expanded Programs Implemented by Participating Physician Groups: [See PDF for image] This figure contains two pie-charts depicting the following data: All new and expanded programs: Care coordination: 72%; Administrative processes: 15%; Patient education: 6%; Medication related: 4%; Other: 2%. New and expanded care coordination programs: Of the 72% Care coordination: Disease management: 36%; Case management: 36%. Source: GAO. Note: There are two types of care coordination programs: (1) case- management programs that target high-cost, high-risk patients with multiple medical conditions, and (2) disease-management programs that treat patients with a specific disease, such as congestive heart failure. Information technology initiatives are not counted as discrete new or expanded programs because these initiatives were parts of broader participating physician group efforts. Administrative processes include such activities as physician and staff education programs, physician feedback systems, and data collection processes. Because of rounding, the pie chart does not total to 100 percent. [End of figure] Among the 47 programs, participating physician groups devoted the largest portion of their program resources to care coordination programs designed to reduce hospitalizations by improving post-acute care. Our analysis showed that for 9 of the 10 participants at least half of demonstration-specific, full-time equivalents (FTE) were devoted to care coordination programs. (See table 3.) Participants told us they selected care coordination programs that provided post-acute care because they believed these programs would reduce future hospitalizations and yield the most cost savings in the shortest amount of time. 
For example, both Billings Clinic and Park Nicollet Health Services used a telephonic interactive voice response (IVR) system to monitor patients' health status at home following a hospitalization or another significant health event.[Footnote 26] Table 3: Percentage of FTEs Devoted to Largest New and Expanded Care Coordination Programs Implemented by PGP Demonstration Participants, PY1: Participating physician group: Billings Clinic; Largest care coordination program: Cancer treatment center (disease management); Description of care managers' responsibilities following a hospitalization or significant medical event: Provided outpatient treatment, prevention, and education programs to decrease patients' risk for infections or other potentially harmful exposures to decrease the number and length of hospital stays; Percentage of demonstration-specific FTEs: 27. Participating physician group: Dartmouth-Hitchcock Clinic; Largest care coordination program: Health coaching (case management); Description of care managers' responsibilities following a hospitalization or significant medical event: Helped patients follow hospital post-discharge instructions, make follow-up appointments with physicians, and take the correct medications and dosages; Percentage of demonstration-specific FTEs: 55. Participating physician group: The Everett Clinic; Largest care coordination program: Palliative care program (case management); Description of care managers' responsibilities following a hospitalization or significant medical event: Educated patients about end-of-life planning, and provided information on community support agencies, alternative living options, and in-home support; Percentage of demonstration-specific FTEs: 67. Participating physician group: Geisinger Health System; Largest care coordination program: Post-acute Case Management (case management); Description of care managers' responsibilities following a hospitalization or significant medical event: Contacted patients to ensure home health services were received and correct medications were being taken, and assisted with coordinating community service programs; Percentage of demonstration-specific FTEs: 60. Participating physician group: Integrated Resources for the Middlesex Area; Largest care coordination program: Heart smart program (disease management); Description of care managers' responsibilities following a hospitalization or significant medical event: Provided case-management services to cardiac patients enrolled in home-care services; Percentage of demonstration-specific FTEs: 50. Participating physician group: Marshfield Clinic; Largest care coordination program: Anticoagulation Program (case management); Description of care managers' responsibilities following a hospitalization or significant medical event: Worked with patients taking the anti-clotting drug Warfarin to ensure dosages were adjusted properly, and recognize other factors affecting coagulation, such as diet, activity, other medications, and other illnesses; Percentage of demonstration-specific FTEs: 70. Participating physician group: Novant Medical Group; Largest care coordination program: Outpatient case management (case management); Description of care managers' responsibilities following a hospitalization or significant medical event: Ensured that high-risk, high-cost patients scheduled follow-up visits with physicians and informed patients of available resources; Percentage of demonstration-specific FTEs: 100. 
Participating physician group: Park Nicollet Health Services; Largest care coordination program: Heart failure care coordination (disease management); Description of care managers' responsibilities following a hospitalization or significant medical event: Monitored patients' medication usage and dietary regimens through an IVR system and initiated medical care when patients' health status worsened; Percentage of demonstration-specific FTEs: 100. Participating physician group: St. John's Health System; Largest care coordination program: Case management systems (case management); Description of care managers' responsibilities following a hospitalization or significant medical event: Coordinated inpatient and outpatient care for high-risk patients and helped patients follow physician treatment plans; Percentage of demonstration-specific FTEs: 79. Participating physician group: University of Michigan Faculty Group Practice; Largest care coordination program: Post-discharge transitional care (case management); Description of care managers' responsibilities following a hospitalization or significant medical event: Provided medication counseling, guidance on post-acute care treatment, assistance making post-discharge appointments, and assistance with nonclinical services, such as arranging transportation; Percentage of demonstration-specific FTEs: 63. Source: GAO analysis of survey data. Note: The largest care coordination programs were identified based on the resources, as measured in FTEs, devoted to each program. [End of table] Approximately half of the care coordination programs were case-management programs that targeted high-cost, high-risk patients with multiple medical conditions, while the other half were disease-management programs that treated patients with a specific disease, such as CHF. Seven participants focused on case-management programs, using care managers to reduce hospitalizations among patients with multiple medical conditions. For example, an official from the Dartmouth-Hitchcock Clinic stated that the clinic's primary strategy for the PGP Demonstration was to reduce hospitalizations and readmissions through more effective discharge planning, such as calling patients at home following their hospital discharge and encouraging them to schedule follow-up appointments with their physicians. Three participants committed the majority of their resources to disease-management programs; two of the three told us they focused on CHF because it is a costly disease to treat and would therefore generate savings within the first performance year, whereas other diseases, such as diabetes, could take several years to generate cost savings. CHF and diabetes are two of the most common chronic diseases among Medicare beneficiaries, according to recent health policy research.[Footnote 27] All 10 participating physician groups reported that their care coordination programs were making progress in both achieving cost savings and providing broader benefits to their organizations and communities. In particular, four participants reported declines in hospitalizations for patients enrolled in their CHF care coordination programs. For example, Park Nicollet Health Services reported a 61 percent reduction in hospitalizations for patients enrolled in its CHF care-management program, which used an IVR system to interact with patients on a daily basis. Park Nicollet representatives estimated this program saved $4,680 yearly, on average, for each patient enrolled in the program.
Because the program also enrolled other Medicare and non-Medicare patients, its benefits extended beyond the patients assigned to Park Nicollet for the demonstration. Further, several participants stated that collaboration and information sharing among the 10 participants on designing and implementing programs and analyzing data resulted in improvements to their demonstration programs, which broadly benefited their organizations. Representatives from St. John's Health System stated that creating a care coordination program had additional benefits, including the adoption of such programs by other health systems and physician groups throughout the community. Despite early positive indicators of cost savings, the full impact of programs implemented for the PGP Demonstration, particularly in care coordination, is largely unknown for a variety of reasons, including that many programs were not in place for all 12 months of the first performance year. Only 1 of the 10 participants had all of its programs in place for all 12 months of PY1. For example, the Marshfield Clinic had a case-management program operational for all 12 months of PY1 but a disease-management program operational for only 4 months. By the beginning of PY2, only 6 of the 10 participants had all of their care coordination programs operational. Officials from participating physician groups stated that program implementation delays were caused by program complexity, the process of gaining management approval for significant program start-up costs, and the need to educate physicians about the programs. In addition, two participants stated that because their care coordination programs were phased in throughout the first two performance years, PY3 may be the first year in which the full impact of these programs is realized. Participating Physician Groups Improved Management Processes to Meet the Diabetes Quality-of-Care Targets in First Performance Year: To meet the diabetes-management quality-of-care targets set by CMS for PY1, participating physician groups improved their management processes by investing in IT, creating team-based approaches, and improving administrative processes. To earn the maximum bonus, participants that met the 2 percent cost-savings target had to also meet a quality-of-care improvement target in a particular clinical area. The measures selected by CMS for each performance year of the demonstration focused on chronic conditions prevalent in the Medicare population that are treated in primary care.[Footnote 28] In PY1, CMS selected diabetes management as the focus for quality improvement for the demonstration participants. See table 4 for a categorization of how the participants worked to improve quality, specifically for diabetes, by using physician feedback, patient registries, team-based approaches, and improved documentation.
Table 4: Management Processes Developed or Enhanced by Participating Physician Groups to Meet Diabetes Quality-of-Care Targets, PY1: Participating physician group: Billings Clinic; Physician feedback: [Check]; Patient registry: [Check]; Team-based approaches: [Check]; Improved documentation: [Empty]; Description of unique program components[A]: * Developed an electronic database that allowed physicians to identify patients with diabetes; * Utilized electronic database to generate reports for physicians on meeting diabetes quality-of-care measures; * Allowed care managers to manage certain aspects of a patient's care such as adjusting diuretic medications for heart failure patients. Participating physician group: Dartmouth-Hitchcock Clinic; Physician feedback: [Check]; Patient registry: [Empty]; Team-based approaches: [Check]; Improved documentation: [Check]; Description of unique program components[A]: * Provided reports to physicians on meeting diabetes quality measures through intranet; * Improved physician coordination with nurses, health coaches, and case managers; * Used flowcharts to ensure that diabetes patients receive the appropriate tests and treatments. Participating physician group: The Everett Clinic; Physician feedback: [Check]; Patient registry: [Empty]; Team-based approaches: [Check]; Improved documentation: [Empty]; Description of unique program components[A]: * Provided reports to physicians online, enabling them to view the percentage of diabetes quality-of-care measures they had completed for each patient; * Instructed medical assistants to begin performing more initial screenings on patients. Participating physician group: Geisinger Health System; Physician feedback: [Check]; Patient registry: [Empty]; Team-based approaches: [Check]; Improved documentation: [Empty]; Description of unique program components[A]: * Implemented an electronic system to track physicians' compliance with diabetes quality-of-care measures; * Implemented standing orders for nurses to test diabetic patients' urine for protein before each visit. Participating physician group: Integrated Resources for the Middlesex Area; Physician feedback: [Check]; Patient registry: [Empty]; Team-based approaches: [Empty]; Improved documentation: [Check]; Description of unique program components[A]: * Issued aggregated report cards for each physician location that measured performance in meeting the diabetes quality-of-care measures; * Distributed flow sheets to physicians at the point of care to help monitor care for diabetes patients. Participating physician group: Marshfield Clinic; Physician feedback: [Check]; Patient registry: [Empty]; Team-based approaches: [Check]; Improved documentation: [Check]; Description of unique program components[A]: * Provided reports to physicians on meeting diabetes quality-of-care measures through intranet; * Implemented standing orders for medical assistants to order tests and allowed care managers to adjust patients' Warfarin dosages based on protocols; * Created paper forms to help ensure that foot exams are completed and documented for diabetes patients. 
Participating physician group: Novant Medical Group; Physician feedback: [Check]; Patient registry: [Empty]; Team-based approaches: [Empty]; Improved documentation: [Check]; Description of unique program components[A]: * Provided reports to physicians on meeting diabetes quality-of-care measures, based on paper charts; * Used paper checklists placed in patients' medical records to record data on eye and foot exams for diabetes patients. Participating physician group: Park Nicollet Health Services; Physician feedback: [Check]; Patient registry: [Empty]; Team-based approaches: [Check]; Improved documentation: [Empty]; Description of unique program components[A]: * Developed electronic alerts and reminders that inform physicians of patients' immediate and future clinical needs, including the diabetes- related measures; * Implemented standing orders for care managers to administer various treatments. Participating physician group: St. John's Health System; Physician feedback: [Check]; Patient registry: [Check]; Team-based approaches: [Empty]; Improved documentation: [Empty]; Description of unique program components[A]: * Used electronic database to check progress on meeting diabetes quality-of-care measures; * Developed an electronic database that allows physicians to identify diabetes patients. Participating physician group: University of Michigan Faculty Group Practice; Physician feedback: [Check]; Patient registry: [Empty]; Team-based approaches: [Empty]; Improved documentation: [Empty]; Description of unique program components[A]: * Provided reports to physicians at the point of care detailing each diabetes patient's test results, appointments, and medications. Sources: GAO and CMS. [A] Participating physician groups may have implemented more than one program to meet quality-of-care targets, with multiple components, and only a sample of selected components are included. [End of table] All participating physician groups made new investments in IT, by adding features to existing EHR systems or using technology to track physicians' performance on the quality-of-care measures set by CMS. For example, Marshfield Clinic implemented electronic alerts in its EHR system to remind clinical staff to provide care, such as immunizations. Participants primarily used electronic methods for physician feedback as a tool for physicians to track their performance and that of their peers to improve their internal operations and patient care. For example, Geisinger Health System's physician feedback system provided physicians with access to monthly reports for each physician, which compared each physician's performance in meeting the quality-of-care measures. According to administrators from Geisinger, this transparent approach fostered positive competition among its physicians to improve quality of care. Participants also invested in IT by creating electronic patient or disease-specific databases or lists referred to as patient registries to better identify patients eligible for enrollment in diabetes programs. The St. John's Health System, which did not have an EHR system, created an electronic patient registry to track patients with diabetes and to alert physicians to provide certain tests.[Footnote 29] Six of the participating physician groups relied to a greater extent on a team-based approach to improve care processes. 
Using a team-based approach, participants expanded the roles and responsibilities of nonphysician staff, such as nurses, medical assistants, and care managers, so that they worked more effectively with physicians to deliver quality care. Although the demonstration required additional quality reporting, officials from two of the participating physician groups stated that they were able to treat the same number of patients in a day. For example, Dartmouth-Hitchcock Clinic used care managers who were nurses to maximize the effectiveness of patients' office visits. These staff scheduled lab tests in advance of patients' office visits when appropriate, developed patient action plans, and communicated with physicians before and after patients' arrivals. Physicians from Dartmouth-Hitchcock told us that the time they spent with patients had become more effective because of this new approach. Four participating physician groups improved their administrative processes by creating better documentation methods for diabetes-related tests and exams. They created worksheets, derived from patients' medical records, to ensure that patients received diabetes tests, such as foot and eye exams. In addition to improving documentation, these initiatives served as reminders to physicians to complete diabetes-related tests and exams and also reduced the burden of data collection for reporting purposes. For example, Integrated Resources for the Middlesex Area (IRMA) created paper forms that were added to patient records to collect data on tests as they were conducted. These forms were also intended to relieve some of the burden of collecting data for smaller practices within the organization. IRMA physicians also received paper worksheets at the point of care to help monitor and track care provided to their diabetes patients. CMS's PGP Demonstration Design Was Generally a Reasonable Approach, but Created Challenges: CMS's design for the PGP Demonstration was generally a reasonable approach for rewarding participating physician groups for cost-savings and quality performance. However, the demonstration design created a particular challenge for CMS in providing timely performance feedback and bonus payments to the participants, which, if received more quickly, might have enabled them to improve their programs. CMS's PGP Demonstration Design Was Generally a Reasonable Approach for Rewarding Participating Physician Groups: CMS's design for the PGP Demonstration was generally a reasonable methodological approach for determining whether the actions taken by the participants resulted in cost savings and improvements in quality, and for rewarding participants as appropriate. In particular, three aspects of the PGP Demonstration design were consistent with established methodological practices considered effective: a rigorous research study design to isolate the effects of the demonstration's incentives, a risk-adjustment approach to account for changes in patient health status, and a quality component to help ensure that participating physician groups did not achieve cost savings at the expense of quality. CMS used a rigorous research design to enable it to isolate the effectiveness of the actions taken by each of the participants in the demonstration.
Specifically, CMS used a modified "pre-test/post-test" control group design that is generally viewed by experts as an effective way to control for some of the most common threats to internal validity--in this case, the ability of the study design to measure the true effects of CMS's incentive payments.[Footnote 30] The features of CMS's study design included a separate comparison group for each participant to distinguish the effects of the demonstration's incentives from unrelated spending trends in the participants' service areas. Comparison group beneficiaries are from the participants' geographic service areas and, as such, are affected by the same local market trends as the participants. In addition, the study design included a baseline period, before the demonstration began, that helped to control for trends that may have occurred without demonstration-related interventions. A standard pre-test/post-test control group design would have randomly assigned beneficiaries to either a comparison group or a participant group. To avoid having to restrict or control beneficiaries' choice of providers and health care services, and to continue to operate within the Medicare FFS system while the demonstration was in place, CMS modified this standard approach. Rather than assigning beneficiaries randomly at the start of the demonstration to participant or comparison groups, the agency retrospectively assigned beneficiaries at the end of each year based on the beneficiaries' natural use of outpatient E&M services. CMS also used a rigorous risk-adjustment approach to adjust for changes in patients' health status. Without these adjustments, CMS could not have been reasonably assured that changes in spending growth were not attributable to changes in patients' health status or in the severity and complexity of their diagnoses. For the PGP Demonstration, CMS tailored the CMS-Hierarchical Condition Category (HCC) model, the risk-adjustment model that it currently uses to make capitation payments to Medicare managed care plans.[Footnote 31] This model accounts for changes in the health status of beneficiaries. Furthermore, CMS incorporated a quality component into the research design, which helped ensure that participants would not achieve cost savings at the expense of quality. The quality-of-care measures CMS selected were based on a consensus of experts and were developed in collaboration with the American Medical Association and quality assurance organizations and with input from the participants. In addition, CMS has placed an increased emphasis on quality in its bonus payment methodology for future years. By PY3, half of the available bonus pool will be awarded based on each participant's success in meeting quality-of-care metrics in six clinical areas: diabetes, CHF, CAD, management of hypertension, and screening for breast cancer and colorectal cancer. PGP Demonstration Design Created Several Challenges: While CMS's research design for the PGP Demonstration was generally a reasonable approach, it also created some challenges for the participating physician groups. Challenges resulting from the demonstration design included the difficulty of providing timely performance feedback and bonus payments and the use of a uniform 2 percent savings threshold that may have disadvantaged certain participants. Participants also raised other design-related concerns specific to their local markets.
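To illustrate how the comparison group, the risk-adjusted spending comparison, and the uniform 2 percent savings threshold fit together, the following minimal sketch works through a simplified, hypothetical version of the cost-savings test in Python. All dollar figures are invented for illustration, and the sketch omits the CMS-HCC risk-adjustment calculations, retrospective beneficiary assignment, and the detailed settlement rules CMS actually applies:

# Minimal, hypothetical sketch of the demonstration's cost-savings test.
# Figures are invented; the actual CMS methodology applies the CMS-HCC
# risk-adjustment model, retrospective beneficiary assignment, and
# settlement rules that are not reproduced here.

def spending_growth(baseline_per_beneficiary, performance_year_per_beneficiary):
    # Growth in risk-adjusted per-beneficiary Medicare spending.
    return (performance_year_per_beneficiary - baseline_per_beneficiary) / baseline_per_beneficiary

# Hypothetical risk-adjusted per-beneficiary expenditures (dollars).
participant_growth = spending_growth(8_000.00, 8_400.00)   # 5.0 percent
comparison_growth = spending_growth(8_100.00, 8_748.00)    # 8.0 percent

savings = comparison_growth - participant_growth            # 3.0 percentage points
THRESHOLD = 0.02  # uniform 2 percent savings threshold used in the demonstration

if savings > THRESHOLD:
    print(f"Savings of {savings:.1%} exceed the 2 percent threshold; "
          "the participant is eligible for a cost-savings bonus.")
else:
    print(f"Savings of {savings:.1%} do not exceed the threshold; no bonus is earned.")

In this hypothetical example, the participant's risk-adjusted spending grew 3 percentage points more slowly than its comparison group's, exceeding the 2 percent threshold; under the demonstration's design, any resulting bonus would then be divided between cost-savings and quality-of-care components.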
Certain Aspects of the Demonstration Design Made Providing Timely Feedback and Bonus Payments Challenging: Overall, participants did not receive performance feedback or bonus payments for their PY1 efforts until after the beginning of the third performance year. Specifically, CMS provided participants with performance feedback and bonus payments regarding their efforts in PY1 in three phases beginning 12 months after the end of PY1. In April 2007, CMS provided each participant with a cost-savings summary report displaying its success in controlling Medicare expenditures for PY1 and the size of its cost-savings bonus pool. (See fig. 4.) A little over 2 months later, CMS provided each participant with a detailed settlement sheet displaying its individual cost-savings and quality-of-care bonuses for PY1. It was not until July 2007, 15 months after the end of PY1, that the two participants that earned a demonstration bonus for PY1--the Marshfield Clinic and the University of Michigan Faculty Group Practice--received their bonus payments of $3.4 million and $2.1 million, respectively. Figure 4: Gap between Completion of First Performance Year and Performance Feedback and Bonus Payments: [See PDF for image] This figure illustrates the gap between the completion of the first performance year and the associated performance feedback and bonus payments. The following data are depicted: Performance Year 1: April 1, 2005 - April 1, 2006; number of months: 12. Performance Year 2: April 1, 2006 - April 1, 2007; number of months: 12. Gap between the end of PY1 and PY1 bonus payments: 15 months, through July 1, 2007; April 2007: cost-savings feedback for PY1; mid-June 2007: cost-savings and quality-of-care bonuses for PY1; July 2007: bonus payments distributed for PY1. Performance Year 3: April 1, 2007 - April 1, 2008; number of months: 12. Source: GAO analysis of CMS data. Note: The 15-month period includes the typical 6-month period necessary for CMS to process a sufficient number of claims to meet the 98 percent complete claims threshold that it uses for analysis. [End of figure] CMS officials explained that generating feedback for the participating physician groups required 15 months because the demonstration design depended on the time-consuming process of retrospectively analyzing Medicare beneficiaries' claims and chart-based data. Specifically, CMS officials stated that the process of calculating participants' cost-savings bonuses required at least 12 months after the conclusion of the first performance year--6 months to accrue claims data that were sufficiently complete and a second 6 months to analyze and calculate the bonus amounts.[Footnote 32] In addition, they stated that the calculation of the quality-of-care bonus required an additional 3 months to audit and reconcile chart-based data with claims-based data pertaining to the 10 diabetes quality-of-care measures. CMS officials stated that to calculate the cost-savings bonus they chose to use a claims file that was 98 percent complete because they wanted to ensure that the feedback they provided to participants was accurate. CMS officials also stated that the time frames for providing performance feedback and bonus payments to participants in PY1 will be the same for PY2 and PY3. Officials from all 10 participating physician groups expressed concern about the length of time CMS took to provide them with performance feedback and bonus payments. Several participants stated that they had difficulty making adjustments to their programs and improving their overall performance because of delayed feedback and payments.
One official from the Novant Medical Group stated that the 15-month time lag in receiving bonus payments would prevent the organization from reinvesting these resources into demonstration-related programs and improving them for subsequent performance years. In addition, two of the participants told us that other pay-for-performance programs they had participated in used payment methodologies that yielded more timely performance feedback or bonus payments. For example, officials from the University of Michigan Faculty Group Practice indicated that a pay-for-performance program sponsored by Blue Cross Blue Shield of Michigan provided them with feedback twice a year on meeting certain quality-of-care targets. In response to these concerns, CMS has been working to provide each participating physician group with a quarterly Medicare patient claims data set related to the beneficiaries it served. Initially, data sets were provided quarterly and focused on identifying patients with chronic conditions who had a hospital admission or emergency room visit. In July 2007, CMS provided each participating group with a data set on hospital inpatient, outpatient, and physician information consisting of the Medicare claims of beneficiaries likely to be included in the PY2 cost-savings calculations. In September 2007, CMS responded to participants' requests for quarterly claims data that would allow them to assess their cost-savings performance during the performance year by providing them with a revised data set. CMS's most recent data set included Medicare inpatient, outpatient, and physician claims data for beneficiaries likely to be included in the year-end cost-savings calculation; it covers these beneficiaries for the first quarter of PY3. CMS noted that it will not provide equivalent information pertaining to comparison group beneficiaries because these data are too time-consuming to assemble. While CMS's provision of ongoing quarterly data sets to participants is more timely than the information provided previously, most participants told us they do not have the necessary resources to analyze these data sets in a timely manner. This lack of timely, actionable data could hinder participants' ability to adjust their programs on a more "real-time" basis. Officials from only 2 of the 10 participants told us they would be able to analyze and use the quarterly data sets CMS provided. Consequently, the data sets are not as useful as CMS-generated quarterly summary reports would be--reports similar to the final reports CMS provided on participants' progress in achieving cost-savings and quality-of-care targets. CMS may not be able to provide quarterly reports that include comparison group trends or quality-of-care data that rely on chart-based data because of complexity and cost. However, CMS could provide participants with quarterly estimates of their growth in per-beneficiary expenditures, as well as of changes in the profile of the beneficiaries likely to be assigned to them. CMS could also use more readily available claims data to provide quarterly estimates of participants' progress in meeting the quality-of-care targets. In PY1, for example, that would have included reporting progress on the 4 of the 10 diabetes quality targets that are claims based, such as whether a beneficiary received an eye exam.
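As a rough illustration of the kind of interim, claims-based estimate CMS could generate, the sketch below computes the share of a participant's assigned diabetes patients with an eye-exam claim to date in a performance year. The data structures, field names, and records are hypothetical and greatly simplified relative to actual Medicare claims files and measure specifications:

# Hypothetical sketch of an interim, claims-based quality estimate: the
# share of assigned diabetes patients with an eye-exam claim to date.
# Beneficiary IDs, service codes, and dates are invented for illustration.

from datetime import date

assigned_diabetes_beneficiaries = {"B001", "B002", "B003", "B004"}

# Simplified claims records: (beneficiary_id, service, service_date).
claims = [
    ("B001", "eye_exam", date(2007, 5, 2)),
    ("B002", "hba1c_test", date(2007, 6, 10)),
    ("B003", "eye_exam", date(2007, 4, 21)),
]

period_start, period_end = date(2007, 4, 1), date(2008, 3, 31)

beneficiaries_meeting_measure = {
    beneficiary for beneficiary, service, service_date in claims
    if beneficiary in assigned_diabetes_beneficiaries
    and service == "eye_exam"
    and period_start <= service_date <= period_end
}

rate = len(beneficiaries_meeting_measure) / len(assigned_diabetes_beneficiaries)
print(f"Eye-exam measure met for {len(beneficiaries_meeting_measure)} of "
      f"{len(assigned_diabetes_beneficiaries)} assigned beneficiaries ({rate:.0%}).")

An interim estimate of this kind would remain provisional--claims lag and final beneficiary assignment would change the denominator--but it could give participants an earlier signal than the year-end settlement reports.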
Use of a Uniform 2 Percent Savings Threshold for All Participants May Have Made Earning a Bonus More Challenging for Particular Providers: CMS officials said they adopted a uniform 2 percent threshold to ensure that the savings generated were actually due to demonstration-related programs. Just as CMS used individual comparison groups for each participant, CMS could have used separate savings thresholds that more closely reflected the market dynamics of each participant's area instead of a uniform savings threshold based on historical data averaged across the 10 participants. However, use of a different threshold for each participant, according to CMS officials, would have been complex and would have generated additional administrative burden in processing bonus payments. Nevertheless, the use of a uniform savings threshold--2 percent--that all participating physician groups had to achieve before becoming eligible for a bonus payment may have made earning bonus payments more challenging for particular providers, specifically those with already low Medicare spending growth rates or those whose comparison group providers had low spending growth rates. Some participants argued that groups with low relative spending may have had more difficulty generating annual Medicare savings of greater than 2 percent than groups with high spending growth rates before the demonstration began. Supporting this concern is the wide variation in the amount participants spent per beneficiary in the year prior to the demonstration, which ranged from $6,426 to $11,520 after adjusting for health status. In addition, participants whose comparison groups had relatively low spending growth may have faced more of a challenge in holding their spending growth more than 2 percent below their comparison groups' growth than participants whose comparison groups had higher relative spending growth. In fact, both participants that received a bonus, Marshfield Clinic and the University of Michigan Faculty Group Practice, were measured against comparison groups with high relative spending growth rates--the 2 highest among the 10 participants' comparison groups. While their success cannot necessarily be attributed to the high relative spending growth of their comparison groups, the high spending growth of the comparison groups against which they were measured may have had some effect. Additional Participant Concerns: In addition, several participating physician groups raised concerns particular to their local markets. For example, officials from one participating physician group expressed concern that CMS did not adequately adjust for the conversion of several hospitals in their market to critical access hospitals (CAH),[Footnote 33] which generally receive higher Medicare payments. These officials noted that their physician group treated more patients from these hospitals, which resulted in a higher spending trend and a lower likelihood of obtaining a cost-savings bonus. CMS stated that the agency will examine this issue as part of its evaluation at the conclusion of the PGP Demonstration. Several participating physician groups were also concerned that their groups had more beneficiaries with specialist visits relative to their comparison groups.[Footnote 34] As a result, participants providing more specialty care may have had less control over the health outcomes of these beneficiaries.
However, analyses conducted by CMS showed that these participants provided 80 percent or more--a predominant share--of the E&M services for most of the beneficiaries assigned to them in PY1, regardless of specialty, and thus had meaningful opportunities to influence beneficiary health care expenditures. CMS officials stated that they will continue examining this issue and other related issues brought to their attention by the participants as part of their evaluation of the demonstration. Participating Physician Groups Had Several Size-Related Advantages, Which May Pose Challenges in Broadening the Payment Approach Used in the Demonstration to More Participants: The large size of the 10 participating physician groups compared with the majority of physician practices operating in the U.S. gave the participants certain size-related advantages that might make broadening the payment approach used in the demonstration to more participants challenging. The 10 participating physician groups had significantly higher numbers of physicians, higher annual medical revenues, and higher numbers of supporting staff, and were more likely to be multispecialty practices than most practices in the U.S. Specifically, the participating physician groups generally had three unique size-derived advantages: institutional affiliations that allowed greater access to financial capital, access to and experience using EHR systems, and experience prior to the PGP Demonstration with pay-for-performance programs. While all the participating physician groups in the demonstration had 200 or more physicians in their practices,[Footnote 35] significantly less than 1 percent of the approximately 234,000 physician practices in the U.S. in 2005 had 151 or more physicians in their practice.[Footnote 36] (See fig. 5.) By contrast, practices with only 1 or 2 physicians comprised 83 percent of all practices. Furthermore, while all 10 participants were multispecialty practices, 68 percent of all practices in the U.S. were single-specialty practices, which are generally smaller organizations. The 10 participating physician groups were also large compared with other physician practices in terms of annual medical revenues and nonphysician staff. Participants generated an average of $413 million in annual medical revenues in 2005 from patients treated by their group practice,[Footnote 37] far greater than the revenues generated by single-specialty practices in the U.S.; only about 1 percent of single-specialty practices had revenues greater than $50 million. In addition, these physician groups and their affiliated entities, such as hospitals, employed approximately 3,500 nonphysician FTEs, over 100 times more than the average single-specialty practice.[Footnote 38] Figure 5: Relative Proportion of Physician Practices in the United States, by Practice Size, 2005: [See PDF for image] This figure depicts the proportion of the 234,222 physician practices in the United States in 2005 that fall into each practice-size category: 1 to 2 physicians: 83% (194,278). 3 to 10 physicians: 14% (33,660). 11 to 25 physicians: 2% (4,135).
26 or more physicians: 1% (2,149). [Among this group, 280 physician groups had 151 or more physicians, and all 10 of the physician groups participating in the PGP Demonstration had 200 or more physicians.] Source: Medical Group Management Association. Note: Physician groups are defined as a subset of physician practices, consisting of three or more physicians. [End of figure] Their larger relative size gave the 10 physician groups participating in the PGP Demonstration three size-related advantages over smaller physician practices, which may have better prepared them to participate in the demonstration's payment model and implement programs encouraged by the demonstration. First, participants typically had institutional affiliations with an integrated delivery system, a general hospital, or a health insurance entity. Specifically, 9 of the 10 participating physician groups were part of an integrated delivery system, 8 were affiliated with a general hospital, and 5 were affiliated with an entity that marketed a health insurance product. In contrast, a representative of the Medical Group Management Association estimated that approximately 15 percent of all physician practices in the U.S. have an affiliation with a general hospital. As a result of these affiliations, participating physician groups generally had greater access to the relatively large amounts of financial capital needed to initiate or expand programs. On average, each participating physician group invested $489,354 to initiate and expand its demonstration-related programs and $1,265,897 in operating expenses for these programs in PY1. (See app. III.) For individual programs, participants reported spending an average of $190,974 to initiate and $409,332 to operate case-management programs in PY1, almost twice the spending associated with any other type of program. (See table 5.) Several participants reported that the majority of their individual program expenditures were labor costs for care managers. Officials from several participating physician groups said that smaller practices might have difficulty implementing similar programs because they may not have the financial resources to do so. Table 5: Range and Average of Initial Start-up and PY1 Operating Expenditures Reported by Participating Physician Groups, by Program Type and Order of Average Amount Spent, 2005: Type of investment: Initial start-up expenditures; Type of program: Case management; Minimum amount spent: $75,000; Maximum amount spent: $891,499; Average amount spent: $190,974. Type of investment: Initial start-up expenditures; Type of program: Administrative processes[A]; Minimum amount spent: $2,350; Maximum amount spent: $411,000; Average amount spent: $107,595. Type of investment: Initial start-up expenditures; Type of program: Disease management; Minimum amount spent: $4,450; Maximum amount spent: $917,398; Average amount spent: $89,530. Type of investment: Initial start-up expenditures; Type of program: Other programs; Minimum amount spent: $93,200; Maximum amount spent: $93,200; Average amount spent: $93,200. Type of investment: Initial start-up expenditures; Type of program: Medication related; Minimum amount spent: $12,536; Maximum amount spent: $94,879; Average amount spent: $53,707.
Type of investment: Initial start-up expenditures; Type of program: Patient education; Minimum amount spent: $15,110; Maximum amount spent: $85,500; Average amount spent: $20,122. Type of investment: Annual operating expenditures for PY1; Type of program: Case management; Minimum amount spent: $55,404; Maximum amount spent: $2,005,422; Average amount spent: $409,332. Type of investment: Annual operating expenditures for PY1; Type of program: Patient education; Minimum amount spent: $1,897; Maximum amount spent: $947,245; Average amount spent: $214,479. Type of investment: Annual operating expenditures for PY1; Type of program: Administrative processes[A]; Minimum amount spent: $33,049; Maximum amount spent: $411,000; Average amount spent: $179,059. Type of investment: Annual operating expenditures for PY1; Type of program: Disease management; Minimum amount spent: $5,500; Maximum amount spent: $917,398; Average amount spent: $174,873. Type of investment: Annual operating expenditures for PY1; Type of program: Medication related; Minimum amount spent: $65,997; Maximum amount spent: $238,003; Average amount spent: $147,221. Type of investment: Annual operating expenditures for PY1; Type of program: Other programs; Minimum amount spent: $92,500; Maximum amount spent: $92,500; Average amount spent: $92,500. Source: GAO. Note: Start-up investment expenditures for Integrated Resources for the Middlesex Area were not available. [A] Administrative processes include such activities as physician and staff education programs, physician feedback systems, and data collection processes. [End of table] The second advantage the 10 large participating physician groups had over smaller physician practices was a greater likelihood of having or acquiring EHR systems, which were essential to participants' ability to gather data and track progress in meeting quality-of-care targets. Eight of the 10 participating physician groups had an EHR in place before the demonstration began, and the 2 other participants, out of necessity, developed alternative methods for gathering patient data electronically specifically for the demonstration, such as creating patient registries. In contrast, only an estimated 24 percent of all physician practices in the United States had either a full or partial EHR in 2005, and large practices were more likely to have EHRs than small practices.[Footnote 39] These systems enable physician group practices with multiple locations, such as the 10 participating physician groups, to share patient information and other administrative resources across a wide geographic area. Health care information technology experts believe that the primary reason smaller physician practices have not implemented EHRs is their cost, estimated at between $15,000 and $50,000 per physician. In addition, experts estimated that annual maintenance costs add between 15 and 25 percent of the initial per-physician investment.[Footnote 40] Furthermore, experts noted that small practices tend to pay more per physician for EHR systems than larger physician practices because larger physician practices are better able to spread the fixed costs of these systems across more physicians.
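To put these cost estimates in rough perspective, the following sketch combines the expert-cited ranges into a simple per-physician calculation over 5 years. The midpoints of the ranges are used, and the shared fixed-cost figure is a hypothetical assumption included only to illustrate why per-physician costs tend to fall as practice size grows:

# Rough per-physician EHR cost estimate over 5 years, using the ranges
# experts cited ($15,000-$50,000 initial cost per physician and annual
# maintenance of 15 to 25 percent of that investment). The shared fixed
# cost is a hypothetical figure used only to illustrate how larger
# practices can spread such costs across more physicians.

def five_year_cost_per_physician(initial_per_physician, maintenance_rate,
                                 shared_fixed_cost, number_of_physicians):
    per_physician = initial_per_physician + 5 * (maintenance_rate * initial_per_physician)
    return per_physician + shared_fixed_cost / number_of_physicians

midpoint_initial = (15_000 + 50_000) / 2   # $32,500 per physician
midpoint_maintenance = (0.15 + 0.25) / 2   # 20 percent per year
shared_fixed_cost = 200_000                # hypothetical shared infrastructure cost

for practice_size in (2, 25, 200):
    cost = five_year_cost_per_physician(midpoint_initial, midpoint_maintenance,
                                        shared_fixed_cost, practice_size)
    print(f"{practice_size:>3} physicians: about ${cost:,.0f} per physician over 5 years")

Under these assumptions, a 2-physician practice would bear roughly $165,000 per physician over 5 years, compared with about $66,000 per physician in a 200-physician group, consistent with the experts' observation about spreading fixed costs.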
Finally, the third size-related advantage that most of the 10 participating physician groups had over smaller physician practices was their experience with other pay-for-performance systems prior to participating in the PGP Demonstration. Overall, 8 of the 10 participants had previous experience with pay-for-performance programs initiated by private or public sector organizations. This experience may have eased their adjustment to the PGP Demonstration and afforded them greater initial and overall success. For example, the University of Michigan Faculty Group Practice's participation in a pay-for-performance system sponsored by Blue Cross Blue Shield of Michigan offered the physician group incentives to upgrade its chronic care infrastructure. In addition to general experience with pay-for-performance programs, the majority of the 10 participating physician groups had experience with specific elements of pay-for-performance, such as physician bonus compensation methods and physician feedback processes. Representatives from some of the participating physician groups stated that their exposure to one or more of these elements prior to the PGP Demonstration may have enabled their organizations to adjust to the demonstration more rapidly. Conclusions: The care coordination programs used by the participating physician groups show promise in achieving cost savings and improving patient outcomes for Medicare beneficiaries. As a result of the demonstration, participating physician groups generated several different approaches for coordinating patient care across inpatient and physician settings for high-risk and high-cost patients, such as those with CHF, and for better managing patients with diabetes, the quality-of-care target set by CMS for the demonstration. Additional years of the demonstration may be needed, however, for CMS to collect and analyze the information necessary to fully evaluate the effectiveness of these care coordination programs and their potential for cost savings in this demonstration. Only one participant had all of its care coordination programs operational for all 12 months of PY1, and participants did not receive feedback from CMS on their progress until PY3 had already begun. While CMS's demonstration design was generally reasonable, the lengthy time CMS took to provide participating physician groups with performance feedback and bonus payments may limit the payment approach's more widespread use in other demonstrations or as an alternative method for paying physician groups in Medicare FFS. The lack of timely and actionable performance feedback also hinders participants' ability to improve their programs in response to data. Providing performance feedback and bonus payments to participants more than 12 months after the end of the measurement period precludes physician groups from adjusting their program strategies on a more "real-time" basis. CMS has recently taken action to provide participants with quarterly claims data sets on their beneficiaries for PY3, but most participants indicated they would have difficulty analyzing such data to determine their progress in achieving cost-savings and quality-of-care targets. Measuring participants' performance against a comparison group whose beneficiaries are retrospectively assigned after the end of a performance year, as was done in this demonstration, may also be impractical for more widespread use because physician groups cannot accurately predict on an ongoing basis whether they will be able to generate cost savings and receive bonus payments.
In addition, the use of a uniform savings threshold, such as the 2 percent used in this demonstration, raises questions about whether this approach creates a disincentive for physician groups that already have lower spending. Physician groups with fewer than 200 physicians--the vast majority of practices in the United States--may also have more difficulty than larger practices, such as the participants in this demonstration, absorbing the start-up and annual operating costs of care coordination programs and implementing them without the EHR systems that many groups believed were necessary to achieve cost savings while maintaining and improving the quality of care. As the PGP Demonstration continues, data will become available to CMS to determine how much influence factors such as delays in the start-up of participating physician groups' care coordination programs, CMS's decision to use a uniform 2 percent threshold, and other factors may have had on participants' ability to earn bonus payments. Consequently, it is too early to determine the success of the PGP Demonstration, but the evidence so far indicates that the care coordination programs initiated by the participants show promise, while the wider applicability of the payment methodology used in the demonstration may be limited. Recommendation for Executive Action: We recommend that the Administrator of CMS provide participating physician groups with interim summary reports that estimate participants' progress in achieving cost-savings and quality-of-care targets. Agency Comments and Our Evaluation: CMS reviewed a draft of this report and provided comments, which appear in appendix IV. CMS stated that it appreciated our thoughtful analysis and that our report would provide additional insight into first performance year results and complement its ongoing evaluation efforts. CMS agreed with the intent of our recommendation. CMS stated that it was developing a new quarterly report and refined data set to aid the physician groups in monitoring their performance, coordinating care, and improving quality. CMS stated that these reports would address a key limitation of the existing quarterly data sets--that most physician groups do not have the necessary resources to analyze the data sets in a timely manner. We agree that this information would be helpful in improving performance feedback to physician groups, which would allow them to adjust their program strategies on a more "real-time" basis. As the demonstration continues, we encourage CMS to continue its efforts to improve performance feedback to the physician groups participating in the PGP Demonstration. We are sending copies of this report to the Administrator of CMS. We will provide copies to others on request. In addition, this report is available at no charge on the GAO web site at [hyperlink, http://www.gao.gov]. If you or your staff have questions about this report, please contact me at (202) 512-7114 or kingk@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff members who made key contributions to this report are listed in appendix V. Signed by: Kathleen M.
King: Director, Health Care: [End of section] Appendix I: Objectives, Scope, and Methodology: For the first performance year, we examined three objectives: (1) what actions the participating physician groups took to achieve cost savings and meet the diabetes quality-of-care targets selected by the Centers for Medicare & Medicaid Services (CMS), (2) the extent to which the demonstration design was a reasonable approach to rewarding participating physician groups for cost savings and quality performance, and (3) potential challenges involved in broadening the payment approach used in the demonstration from the 10 large participating physician groups to other physician groups and nongroup practices. For each of our reporting objectives, we analyzed data we collected, by written questionnaire, and supplemented this information with interviews in person and by telephone and site visits to 5 of the 10 locations. Questionnaire: We sent questionnaires to individuals CMS identified as points of contact at each of the participating physician groups. These individuals were often physicians or administrative staff tasked with overseeing their physician group's demonstration efforts. All 10 participants completed and returned our questionnaire. The questionnaire contained three sections. The first section gathered standardized information about the practice's general characteristics, including organizational structure, size, institutional affiliations, and the extent to which it used electronic health records systems. The second section gathered information about the programs participants used as a part of the Physician Group Practice (PGP) Demonstration. This section confirmed summary statements the individual practices described in their original applications to CMS or in other documents we obtained from CMS's contractor, Research Triangle Institute (RTI), and provided an opportunity for the group to add new programs, if needed. Summary statements detailed the purpose, type, and characteristics of each program. We also asked participants whether each of their programs was created specifically for the demonstration, was a preexisting program, or was an expansion of a preexisting program. In addition, we asked officials from these physician groups to identify the start-up costs and the annual operating costs of these programs. We also asked about the extent to which the physician groups believe smaller physician practices could implement similar programs. The third section of the questionnaire gathered information about how the participating physician groups compensated their physicians and how any demonstration bonus dollars they may earn would be distributed to individual physicians within the group. Site Visits and Interviews: We also conducted site visits or telephone interviews with staff of all 10 participating physician groups. Five of these interviews were site visits, which we chose to reflect geographic diversity (region of country and urban/rural), size, and ownership status, among other factors. We conducted site visits to Geisinger Health System in Pennsylvania, Park Nicollet Health Services in Minnesota, Marshfield Clinic in Wisconsin, Billings Clinic in Montana, and the Everett Clinic in Washington. We collected the same information by telephone from the other participants in the demonstration. For these in-person or telephone interviews, we interviewed the demonstration project managers, physicians, care managers, finance officials, and information technology staff. 
Analysis for Reporting Objectives: To identify programs used by the participating physician groups to achieve cost savings and meet the CMS-set diabetes quality-of-care targets, we analyzed data we collected by written questionnaires and interviews, supplemented with information we obtained at site visits to 5 of the 10 participants. We included in our analysis new programs or expansions of existing programs created in response to the demonstration. To determine the extent to which the demonstration's design was reasonable, we analyzed documents on the overall research design and bonus payment methodology obtained from CMS, analyzed data collected through the questionnaire, and used interviews we conducted with the participating physician groups, CMS, and CMS's contractor RTI. We also reviewed and analyzed CMS-contracted documents on the design of the PGP Demonstration. To determine the potential challenges involved in broadening the payment approach used in the demonstration to other physician groups, we compared selected characteristics of the 10 participating physician groups with those of physician practices in the United States, using data primarily from the Medical Group Management Association's (MGMA) annual survey. We also used data we collected from the questionnaire and from our interviews with officials from the physician group practices, and we interviewed experts on health information technology systems. Data Reliability: We assessed the reliability of the information we obtained about participating physician group practices and the data we used to compare them to other physician groups in the U.S. in several ways. First, we checked the information we obtained from the physician groups for consistency with information from RTI's PGP Demonstration site visit reports and CMS's 2006 Report to Congress on the PGP Demonstration. We verified the information we collected from the questionnaire with detailed follow-up interviews with officials from all 10 participants. Second, we spoke to the survey director for the 2005 MGMA survey to ensure that we used the information from that survey appropriately and that we understood any data limitations. In addition, we compared the data we used on U.S. group practices from the 2005 MGMA survey with data from the 2005 National Ambulatory Medical Care Survey and determined that the results were largely consistent and adequate for our purposes. Third, on the basis of this comparison and discussions with experts knowledgeable about the data, we used broad categories to describe the data. We determined that the data used in our analysis were sufficiently reliable for the purposes of this report. We conducted our work from May 2006 through December 2007 in accordance with generally accepted government auditing standards. [End of section] Appendix II: New and Expanded Programs Implemented by the 10 Participating Physician Groups for the PGP Demonstration: Participating physician group: Billings Clinic; Program name and program type: Cancer treatment center--care coordination (disease management); New or expanded program: Expansion; Description: Patients received coordinated cancer care including screening, prevention education, and infusion treatments.
Participating physician group: Billings Clinic; Program name and program type: Chronic obstructive pulmonary disease (COPD) management--care coordination (disease management); New or expanded program: Expansion; Description: Care managers worked with COPD patients to avoid functional decline, offer preventive services such as immunizations, and treat complications early. Participating physician group: Billings Clinic; Program name and program type: Community crisis center--care coordination (case management); New or expanded program: New; Description: High-risk patients with chronic psychiatric conditions were redirected from the emergency room to the psychiatric center for treatment. Participating physician group: Billings Clinic; Program name and program type: Diabetes program--administrative process; New or expanded program: Expansion; Description: A patient registry was used to identify diabetes patients and create patient report cards, which displayed treatment confirmation and treatment gaps. Participating physician group: Billings Clinic; Program name and program type: Heart failure clinic--care coordination (disease management); New or expanded program: Expansion; Description: Care managers monitored an automated system, which recorded patients' answers to health status questions, and intervened if necessary. Participating physician group: Billings Clinic; Program name and program type: Hospitalist program--care coordination (case management); New or expanded program: New; Description: Hospitalists worked with internists and family practitioners to improve the communication and care provided to patients at hospital discharge. Participating physician group: Billings Clinic; Program name and program type: Medication reconciliation--medication related; New or expanded program: New; Description: Electronic prescription system better reconciled patients' medications between inpatient and outpatient settings to reduce adverse events. Participating physician group: Billings Clinic; Program name and program type: Palliative care program, 5 wishes-- administrative process; New or expanded program: Expansion; Description: Nursing home staff educated on consulting a Billings geriatrician before admitting patient to the hospital. Participating physician group: Billings Clinic; Program name and program type: Physician assistants at nursing homes-- administrative process; New or expanded program: Expansion; Description: To coordinate nursing home and hospital care, physician assistants were assigned to patients entering the hospital's emergency room from local nursing homes. Participating physician group: Dartmouth-Hitchcock Clinic; Program name and program type: Cancer care/palliative care--care coordination (case management); New or expanded program: New; Description: Care managers assisted cancer patients and families in coordinating, planning end-of-life care. Participating physician group: Dartmouth-Hitchcock Clinic; Program name and program type: Health coaching--care coordination (case management); New or expanded program: New; Description: Care managers helped patients follow hospital post- discharge instructions, make physician appointments, and take the correct medications and dosages. Participating physician group: Dartmouth-Hitchcock Clinic; Program name and program type: Provider performance support and feedback--administrative process; New or expanded program: New; Description: Physicians received feedback through intranet, met with management on identified issues. 
Participating physician group: The Everett Clinic; Program name and program type: Coronary artery disease (CAD) management--administrative process; New or expanded program: New; Description: Forms were placed on the front of patients' medical charts to remind physicians of CAD quality-of-care measures.

Participating physician group: The Everett Clinic; Program name and program type: Director for PGP demonstration--other; New or expanded program: New; Description: A director position was created and charged with coordinating program interventions and with overseeing all care for Medicare patients.

Participating physician group: The Everett Clinic; Program name and program type: Hypertension management program--care coordination (disease management); New or expanded program: New; Description: A patient registry was used to identify patients with hypertension and remind physicians to measure and document patients' blood pressure.

Participating physician group: The Everett Clinic; Program name and program type: Palliative care program--care coordination (case management); New or expanded program: Expansion; Description: Care managers provided patients and families end-of-life planning information on quality-of-life issues, alternative living options, and in-home support.

Participating physician group: The Everett Clinic; Program name and program type: Patient care coordination--care coordination (case management); New or expanded program: New; Description: Care managers coordinated inpatient and outpatient care, helping to ensure proper discharge planning and schedule follow-up appointments with physicians.

Participating physician group: Geisinger Health System; Program name and program type: COPD management--care coordination (disease management); New or expanded program: Expansion; Description: Care managers worked with patients to monitor their health status and to encourage patients to visit their physicians when necessary.

Participating physician group: Geisinger Health System; Program name and program type: Congestive heart failure management--care coordination (disease management); New or expanded program: Expansion; Description: Care managers monitored patients' health status through a voice recognition system and contacted patients when their health status became problematic.

Participating physician group: Geisinger Health System; Program name and program type: Diabetes disease management--care coordination (disease management); New or expanded program: Expansion; Description: Care managers worked with patients to educate them on managing diabetes.

Participating physician group: Geisinger Health System; Program name and program type: Moderate risk case management--care coordination (case management); New or expanded program: New; Description: Care managers worked with patients to reduce risk factors associated with potential future hospitalizations.

Participating physician group: Geisinger Health System; Program name and program type: Postacute case management--care coordination (case management); New or expanded program: New; Description: Care managers contacted patients after a hospital discharge to ensure that home health services were received, correct medications taken, etc.
Participating physician group: Integrated Resources for the Middlesex Area; Program name and program type: Anticoagulation--care coordination (case management); New or expanded program: New; Description: Pharmacists and physicians worked with patients during a hospitalization to ensure prescriptions were correct and to assist in the transition from inpatient to outpatient care.

Participating physician group: Integrated Resources for the Middlesex Area; Program name and program type: Cancer care management--care coordination (disease management); New or expanded program: New; Description: Care managers worked with colon, breast, and lung cancer patients and their physicians to ensure that evidence-based treatment guidelines were followed for psychological, nutritional, and palliative care.

Participating physician group: Integrated Resources for the Middlesex Area; Program name and program type: Chronic care management--care coordination (case management); New or expanded program: New; Description: Care managers educated patients admitted to the hospital on disease self-management and proper medication use.

Participating physician group: Integrated Resources for the Middlesex Area; Program name and program type: Congestive heart failure (CHF)--care coordination (disease management); New or expanded program: New; Description: Care managers helped patients understand and follow their post-discharge instructions.

Participating physician group: Integrated Resources for the Middlesex Area; Program name and program type: Diabetes disease management--care coordination (disease management); New or expanded program: Expansion; Description: Care managers worked with diabetes patients to coordinate care across providers, provide in-person patient education, and remind patients of appointments.

Participating physician group: Integrated Resources for the Middlesex Area; Program name and program type: Diabetes education--patient education; New or expanded program: Expansion; Description: Certified diabetes educators assisted patients in understanding diabetes self-management tools.

Participating physician group: Integrated Resources for the Middlesex Area; Program name and program type: Heart smart program--care coordination (disease management); New or expanded program: Expansion; Description: Care managers provided case management services to cardiac patients enrolled in home care services.

Participating physician group: Integrated Resources for the Middlesex Area; Program name and program type: HomeMed program--care coordination (case management); New or expanded program: New; Description: Frail, elderly patients with multiple conditions received a telemedicine device in their home that monitored vital signs.

Participating physician group: Marshfield Clinic; Program name and program type: Anticoagulation--care coordination (case management); New or expanded program: Expansion; Description: Care managers worked with patients to ensure that dosages of the anticlotting drug Warfarin were adjusted properly. Care managers also educated patients on recognizing factors that can influence anticoagulation, such as diet, activity, other medications, and other illnesses.

Participating physician group: Marshfield Clinic; Program name and program type: CHF management--care coordination (disease management); New or expanded program: New; Description: Care managers called patients to check on health status, schedule physician visits, and answer questions.
Participating physician group: Novant Medical Group; Program name and program type: Disease management, compass--care coordination (disease management); New or expanded program: New; Description: Care managers assisted patients with medication management, appointments, and physician referral.

Participating physician group: Novant Medical Group; Program name and program type: Outpatient case management--care coordination (case management); New or expanded program: New; Description: Care managers helped high-risk, high-cost patients recently discharged from the hospital schedule follow-up physician visits and learn of available resources.

Participating physician group: Novant Medical Group; Program name and program type: Palliative care program--administrative process; New or expanded program: Expansion; Description: Physicians and staff were educated on how to talk to patients about palliative care.

Participating physician group: Novant Medical Group; Program name and program type: Physician and staff education--administrative process; New or expanded program: Expansion; Description: Physicians and clinical staff were educated on evidence-based guidelines for chronic disease management.

Participating physician group: Novant Medical Group; Program name and program type: Transition of care program--care coordination (case management); New or expanded program: New; Description: Nurses contacted patients after hospital discharges to ensure patients made follow-up physician appointments.

Participating physician group: Park Nicollet Health Services; Program name and program type: Diabetes care management--care coordination (disease management); New or expanded program: New; Description: Care managers educated newly diagnosed diabetes patients on diabetes self-management.

Participating physician group: Park Nicollet Health Services; Program name and program type: Health support model; New or expanded program: New; Description: During a 30-minute office visit, patients received an evaluation of needs, health education, diagnoses, prevention measures, and fitness counseling.

Participating physician group: Park Nicollet Health Services; Program name and program type: Heart failure care coordination--care coordination (disease management); New or expanded program: New; Description: Care managers monitored patients' medication usage and dietary regimens through an interactive voice response system.

Participating physician group: Park Nicollet Health Services; Program name and program type: 24/7 nurse triage/nurse on call--patient education; New or expanded program: Expansion; Description: Patients telephoned a call center where nurses directed them to care based on their symptoms.

Participating physician group: St. John's Health System; Program name and program type: Case management systems--care coordination (case management); New or expanded program: Expansion; Description: Care managers provided care to high-risk patients, including coordination of inpatient and outpatient care services and guidance on following treatment plans.

Participating physician group: St. John's Health System; Program name and program type: Disease management--care coordination (disease management); New or expanded program: Expansion; Description: Care managers managed, educated, and coached patients with chronic conditions.
Participating physician group: St. John's Health System; Program name and program type: CHF program (disease management); New or expanded program: New; Description: Nurses assessed the health status of heart failure patients.

Participating physician group: St. John's Health System; Program name and program type: Medication access--medication-related; New or expanded program: New; Description: Low-income patients were assisted in obtaining free medications from pharmaceutical companies.

Participating physician group: University of Michigan Faculty Group Practice; Program name and program type: Complex care coordination--care coordination (case management); New or expanded program: New; Description: Care managers monitored patients with multiple chronic diseases and educated them on self-management.

Participating physician group: University of Michigan Faculty Group Practice; Program name and program type: Post-discharge transitional care--care coordination (case management); New or expanded program: New; Description: Care managers provided education, medication counseling, guidance on post-acute care treatment, and assistance with making and getting to post-discharge appointments.

Sources: GAO and CMS.

[End of table]

[End of section]

Appendix III: Reported PGP Demonstration-Related Start-up and Operating Costs for New and Expanded Programs:

Physician group: Billings Clinic; Number of programs: 9; Start-up investment expenditures: $317,503; Total operating expenditures for performance year 1: $2,703,379.

Physician group: Dartmouth-Hitchcock Clinic; Number of programs: 3; Start-up investment expenditures: $878,031; Total operating expenditures for performance year 1: $1,344,749.

Physician group: The Everett Clinic; Number of programs: 5; Start-up investment expenditures: $365,750; Total operating expenditures for performance year 1: $617,500.

Physician group: Geisinger Health System; Number of programs: 5; Start-up investment expenditures: $82,573; Total operating expenditures for performance year 1: $929,888.

Physician group: Integrated Resources for the Middlesex Area; Number of programs: 8; Start-up investment expenditures: [A]; Total operating expenditures for performance year 1: $1,192,185.

Physician group: Marshfield Clinic; Number of programs: 2; Start-up investment expenditures: $917,398; Total operating expenditures for performance year 1: $2,922,820.

Physician group: Novant Medical Group; Number of programs: 5; Start-up investment expenditures: $916,499; Total operating expenditures for performance year 1: $917,500.

Physician group: Park Nicollet Health Services; Number of programs: 3; Start-up investment expenditures: $402,226; Total operating expenditures for performance year 1: $512,762.

Physician group: St. John's Health System; Number of programs: 5; Start-up investment expenditures: $96,354; Total operating expenditures for performance year 1: $1,081,801.

Physician group: University of Michigan Faculty Group Practice; Number of programs: 2; Start-up investment expenditures: $427,848; Total operating expenditures for performance year 1: $436,386.

Physician group: Average for physician group; Number of programs: 4.7; Start-up investment expenditures: $489,354; Total operating expenditures for performance year 1: $1,265,897.

Physician group: Total; Number of programs: 47; Start-up investment expenditures: $4,404,182; Total operating expenditures for performance year 1: $12,658,970.

Source: GAO.

[A] Start-up investment expenditures for Integrated Resources for the Middlesex Area were not available.

[End of table]
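The summary rows of the table follow from straightforward arithmetic: the start-up average ($489,354) is computed over the nine groups that reported a start-up figure, while the program and operating averages are computed over all 10 groups. The short Python sketch below is purely illustrative; the figures are transcribed from the table above, the script and its variable names are not part of the demonstration or the report's methodology, and it simply reproduces the totals and averages shown in the summary rows.

```python
# Illustrative check of the summary rows in appendix III.
# Figures are transcribed from the table above; None marks the start-up
# figure that was not available (Integrated Resources for the Middlesex Area).
groups = {
    "Billings Clinic":                               (9, 317_503, 2_703_379),
    "Dartmouth-Hitchcock Clinic":                    (3, 878_031, 1_344_749),
    "The Everett Clinic":                            (5, 365_750,   617_500),
    "Geisinger Health System":                       (5,  82_573,   929_888),
    "Integrated Resources for the Middlesex Area":   (8, None,    1_192_185),
    "Marshfield Clinic":                             (2, 917_398, 2_922_820),
    "Novant Medical Group":                          (5, 916_499,   917_500),
    "Park Nicollet Health Services":                 (3, 402_226,   512_762),
    "St. John's Health System":                      (5,  96_354, 1_081_801),
    "University of Michigan Faculty Group Practice": (2, 427_848,   436_386),
}

programs  = [p for p, _, _ in groups.values()]
startup   = [s for _, s, _ in groups.values() if s is not None]  # 9 groups reported
operating = [o for _, _, o in groups.values()]                   # all 10 groups

print(f"Total programs:         {sum(programs)}")                      # 47
print(f"Average programs:       {sum(programs) / len(groups):.1f}")    # 4.7
print(f"Total start-up:         ${sum(startup):,}")                    # $4,404,182
print(f"Average start-up:       ${round(sum(startup) / len(startup)):,}")      # $489,354
print(f"Total operating (PY1):  ${sum(operating):,}")                  # $12,658,970
print(f"Average operating:      ${round(sum(operating) / len(operating)):,}")  # $1,265,897
```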
[End of section]

Appendix IV: Comments from the Centers for Medicare & Medicaid Services:

Department of Health & Human Services:
Centers for Medicare & Medicaid Services:
Office of the Administrator:
Washington, DC 20201:

Date: January 22, 2008:
To: Kathleen King:
Director, Health Care:
Government Accountability Office:
From: [Signed by] Kerry Weems:
Acting Administrator:
Subject: Government Accountability Office (GAO) Draft Report: "Medicare Physician Payment: Care Coordination Programs Used in Demonstration Show Promise, But Wider Use of Payment Approach May Be Limited" (GAO-08-65):

The Centers for Medicare & Medicaid Services (CMS) appreciates the opportunity to respond to GAO's draft report entitled, "Medicare Physician Payment: Care Coordination Programs Used in Demonstration Show Promise, But Wider Use of Payment Approach May Be Limited." Like GAO, CMS is encouraged by the Physician Group Practice (PGP) Demonstration's first-year results. The results show that we are on the right track to meeting the demonstration's goals and objectives of fostering coordination of Part A and Part B services, promoting investment in care management and quality improvement processes, and rewarding providers for improving health outcomes.

At the end of the first year, we found that all groups improved the clinical management of diabetes patients. Two groups earned $7.3 million as their share of the total savings of $9.5 million generated for the Medicare Trust Funds, and six additional physician groups had lower Medicare spending growth rates than their local market areas, but not sufficiently lower to share in savings under the demonstration's performance payment methodology. We also observed the PGP participating physician groups investing in and implementing care management strategies designed to anticipate patient needs, prevent chronic disease complications and avoidable hospitalizations, and improve quality of care. Your report found 78 programs in place at the 10 physician practices, of which 47 programs were new or expanded as a result of the opportunity to share in savings under the demonstration. We are optimistic that as these care management programs mature, we will share additional savings and observe further quality improvements.

The PGP Demonstration provides critical learning opportunities as we focus on finding ways to control health care costs while also improving the overall quality of care. In addition, the demonstration complements many local pay-for-performance initiatives and serves as a catalyst in local markets to improve the quality and efficiency of care across all patient populations.

GAO Recommendation: The Administrator of CMS should provide PGP participating physician groups with interim summary reports that estimate participants' progress in achieving cost-savings and quality-of-care targets.

CMS Response: We have been working with the physician groups on how best to provide interim feedback within the demonstration's resources since the start of the demonstration. Initially, we provided quarterly data sets that identified patients with a chronic illness who had a visit at the group and who also had a hospital admission or emergency room visit during the quarter. However, most physician groups, as you note, do not have the necessary resources to analyze these data sets in a timely manner but are interested in expenditure data on patients likely to be assigned to them under the demonstration.
As a result, we began to explore with the physician groups how best to provide interim information on patients likely to be assigned to them at annual reconciliation and their utilization of inpatient services. The information about inpatient services utilization is designed to serve as a proxy for financial performance, since many of the physician groups' care management and quality improvement activities are designed to reduce avoidable inpatient admissions to generate shareable savings under the demonstration.

In November 2007, we held a conference call with the physician groups to: discuss how the quarterly data sets have been used; identify specific ideas or strategies for using the quarterly data sets; and solicit feedback on the information that should be contained in a new quarterly data set. As a result of the call, we are developing a new quarterly report and refined data set to aid physician groups in monitoring their performance, coordinating care, and improving quality. The new quarterly report and data set will include aggregated information and specific data elements in the following areas: assigned beneficiary demographics; hospital utilization; hierarchical condition category risk adjustment status; and quality of care gaps.

Our intention is that these new quarterly reports and refined data sets will be consistent with the reports and data sets provided to physician groups as part of the annual reconciliation process. They will address a key limitation of existing quarterly data sets identified in your report, that most physician groups do not have the necessary resources to analyze the data sets in a timely manner. They will give physician groups interim information on patients for whom they are most likely to be held accountable, as well as information on a key proxy for savings under the demonstration.

While we are working to improve the quarterly reports, we do not plan to provide interim expenditure results for several reasons. First, interim expenditure results may conflict with annual performance year results since they do not use final action claims and have not undergone the extensive review and validation process that the annual reconciliation results and reports undergo. Second, the production of interim expenditure results presents significant operational and resource issues. Finally, interim expenditure results may not be actionable by physician groups since only information on likely assigned patients is provided and comparison group expenditure data are required to calculate savings and performance under the demonstration.

In closing, your report provides additional insight into performance year one results and complements our ongoing evaluation activities. We appreciate your thoughtful analysis as we continue to learn from this and other pay-for-performance demonstrations on how best to incorporate value-based purchasing strategies into the Medicare program.

[End of section]

Appendix V: GAO Contact and Staff Acknowledgments:

GAO Contact: Kathleen M. King, (202) 512-7114 or kingk@gao.gov:

Acknowledgments: In addition to the contact named above, Thomas Walke, Assistant Director; Jennie Apter; Kelly Barar; Zachary Gaumer; and Jennifer Rellick made key contributions to this report.

[End of section]

Footnotes:

[1] Medicare Part B helps pay for doctors' services, outpatient hospital care, and durable medical equipment.

[2] MedPAC is an independent federal body established by the Balanced Budget Act of 1997, Pub. L. No. 105-33, §4022, 111 Stat.
251, 350-55, to advise Congress on issues affecting the Medicare program.

[3] Medicare Part A pays for inpatient hospital stays, care in skilled nursing facilities, hospice care, and some home health care.

[4] Pub. L. No. 106-554, App. F, § 412, 114 Stat. 2763, 2763A-509-2763A-515.

[5] 67 Fed. Reg. 61116 (Sept. 27, 2002).

[6] The number of physicians includes physicians and clinical professionals who can bill Medicare as physicians, such as physician assistants. Unless otherwise noted, the term physician includes all professionals paid under the Medicare physician fee schedule.

[7] Performance year two began April 1, 2006, and ended March 31, 2007, and performance year three began April 1, 2007, and ends March 31, 2008.

[8] In each performance year, the Medicare beneficiaries assigned to each of the 10 unique comparison groups are similar to the Medicare beneficiaries being served by the participating physician group they are compared against, in terms of their geographic location and the services they receive.

[9] We visited Geisinger Health System in Pennsylvania, Park Nicollet Health Services in Minnesota, Marshfield Clinic in Wisconsin, Billings Clinic in Montana, and The Everett Clinic in Washington.

[10] CMS received 26 applications from a variety of organizations in response to its Request for Proposal for the PGP Demonstration.

[11] We defined an integrated delivery system as a health care system that includes at least one hospital in addition to the physician group, and may include other health care providers, such as home health agencies or nursing homes. The Everett Clinic identified itself as an integrated delivery system that does not include a general hospital but does include two ambulatory surgical centers, a laboratory, and a radiation center.

[12] For the purposes of the PGP Demonstration, service areas consist of all counties in which a given participating physician group derives at least 1 percent of its assigned Medicare FFS beneficiaries. These counties are combined to form each participating physician group's service area.

[13] Under the Medicare physician payment system, E&M services refer to visits and consultations furnished to patients by physicians and include 10 current procedural terminology codes that represent the level of E&M service provided.

[14] Other savings generated by the participating physician groups but not awarded to them include the first 2 percent of savings generated as well as any unearned quality-of-care bonuses.

[15] The quality measures were developed by CMS in conjunction with professional associations such as the American Medical Association's Physician Consortium for Performance Improvement, the National Committee for Quality Assurance, and the National Quality Forum, and with the 10 participating physician groups.

[16] The other measures were influenza vaccination, pneumonia vaccination, low-density lipoprotein cholesterol test, urine protein test, and Hemoglobin A1c test (which measures blood sugar level) in the recommended time interval, and whether Hemoglobin A1c, blood pressure, and low-density lipoprotein cholesterol are at the recommended levels.

[17] NCQA created the health plan employer data and information set (HEDIS®) to measure performance on important dimensions of care and service, and it is now a tool used by more than 90 percent of health plans.
For each measure, participating physician groups must achieve at least one of three targets: (1) meet the higher of 75 percent compliance or the Medicare HEDIS® mean for the measure (for those measures where HEDIS® indicators are also available), (2) demonstrate a 10 percent or greater reduction in the gap between the administrative baseline and 100 percent compliance, or (3) achieve the 70th percentile Medicare HEDIS® level (for those measures where HEDIS® indicators are also available).

[18] The Marshfield Clinic was eligible to earn about $1.5 million ($1,448,613) in PY1 as a quality bonus if it met all 10 of the diabetes quality targets.

[19] The University of Michigan Faculty Group Practice was eligible to earn $838,951 in PY1 as a quality bonus if it met all 10 of the diabetes quality targets.

[20] Collectively, these theoretical savings amounted to about $8 million. However, CMS does not consider these savings to be actual savings because, based on the demonstration design, savings below the 2 percent threshold could be caused by random fluctuations in expenditures.

[21] A provider could be a physician, hospital, or other professional health care service organization.

[22] Congressional Research Service, The Library of Congress, CRS Report for Congress: Pay-for-Performance in Health Care (Washington, D.C.: updated Dec. 12, 2006). CRS's count of pay-for-performance programs includes those targeting compensation to physicians, physician groups, and hospitals.

[23] For this demonstration, participants can also choose a payment methodology different from the bonus payment methodology used in the PGP Demonstration, as long as it includes a hybrid shared savings approach incorporating a cost-savings and quality component.

[24] Seth W. Glickman et al., "Pay for Performance, Quality of Care, and Outcomes in Acute Myocardial Infarction," The Journal of the American Medical Association, vol. 297, no. 21 (2007).

[25] Participants in the PGP Demonstration reported using 78 total programs to achieve cost savings and quality targets: 28 were new programs created specifically for the demonstration, 19 were expansions of existing programs, and 31 were existing programs that did not change as a result of the demonstration.

[26] Interactive voice response systems are automated telephonic systems that patients call in to on a regular basis to answer a series of health-related questions.

[27] See Michael Trisolini et al., "Medicare Physician Group Practice: Innovations in Quality and Efficiency," Commonwealth Fund (December 2006), and Stuart Gutterman et al., "Enhancing Value in Medicare: Demonstrations and Other Initiatives to Improve the Program," Commonwealth Fund (January 2007).

[28] The quality measures selected by CMS for the PGP Demonstration are a subset of the measures CMS developed for its Doctors' Office Quality project. This project was designed to develop and test a comprehensive, integrated approach to measuring the quality of care for chronic disease and preventive services in doctors' offices. Participants for this demonstration are located in California, Iowa, and New York.

[29] Officials at the St. John's Health System regarded this as a precursor to a full-featured EHR system.

[30] For a more detailed discussion of the study design and its tradeoffs, see Donald T. Campbell and Julian C. Stanley, Experimental and Quasi-Experimental Designs for Research (Boston: Houghton Mifflin Company, 1963).
[31] To adapt the CMS-HCC model for the PGP Demonstration, CMS used current-year data to make projections, rather than using prior-year data, and in doing so increased the explanatory power of the model from 10 percent to 50 percent. In addition, CMS made a mathematical adjustment to the weights of the CMS-HCC model to accommodate the population of beneficiaries eligible for the demonstration, end-stage renal disease patients, organ transplant patients, and people who join Medicare in the middle of a performance year.

[32] CMS officials stated that in any given year, they considered a claims data file to be sufficiently complete if it contained 98 percent of claims for services provided to a given group of Medicare beneficiaries.

[33] CAHs are small rural hospitals that receive Medicare payments for their reasonable costs of providing inpatient and outpatient services, rather than being paid fixed amounts for services.

[34] Assignment may be based on providing less than half of a beneficiary's evaluation and management (E&M) services if no other physician practice has provided a larger portion of E&M services during the performance year.

[35] The 10 participating physician groups had an average of 627 providers in their practices (ranging from 232 to 1,291), including nonphysician providers who can bill Medicare as physicians.

[36] The majority of individual physicians in the U.S. practiced medicine in a group-practice setting in 2005. While 1 percent of physician practices consisted of more than 25 physicians, 36 percent of physicians practiced in physician groups with more than 25 physicians.

[37] Annual medical revenues were not available from IRMA.

[38] Ninety-nine percent of single-specialty practices in the United States had annual medical revenues that were $50 million or less, and the average single-specialty practice employed 29 FTEs in 2005.

[39] National Center for Health Statistics' National Ambulatory Medical Care Survey, 2005.

[40] An official from the Certification Commission for Healthcare Information Technology added that its estimate of initial acquisition costs of EHR systems includes hardware, software, training, and implementation costs, and that its estimate of annual maintenance costs does not include the costs physician groups will likely incur to hire the internal information technology management staff needed to operate EHR systems.

[End of section]

GAO's Mission:

The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each weekday, GAO posts newly released reports, testimony, and correspondence on its Web site. To have GAO e-mail you a list of newly posted products every afternoon, go to [hyperlink, http://www.gao.gov] and select "Subscribe to Updates."

Order by Mail or Phone:

The first copy of each printed report is free.
Additional copies are $2 each. A check or money order should be made out to the Superintendent of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more copies mailed to a single address are discounted 25 percent. Orders should be sent to:

U.S. Government Accountability Office:
441 G Street NW, Room LM:
Washington, D.C. 20548:

To order by Phone:
Voice: (202) 512-6000:
TDD: (202) 512-2537:
Fax: (202) 512-6061:

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:

Congressional Relations:

Ralph Dawn, Managing Director, dawnr@gao.gov: (202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, D.C. 20548:

Public Affairs:

Chuck Young, Managing Director, youngc1@gao.gov: (202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548:
