American Community Survey

Key Unresolved Issues: GAO ID: GAO-05-82, October 7, 2004


GAO-05-82, American Community Survey: Key Unresolved Issues:

This is the accessible text file for GAO report number GAO-05-82 entitled 'American Community Survey: Key Unresolved Issues' which was released on November 08, 2004.

This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version.

We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

Congressional Requesters:

United States Government Accountability Office:

GAO:

October 2004:

American Community Survey:

Key Unresolved Issues:

GAO-05-82:

GAO Highlights:

Highlights of GAO-05-82, a report to congressional requesters:

Why GAO Did This Study:

The Congress asked GAO to review operational and programmatic aspects of the Census Bureau's ACS that will affect the reliability of small geographic area data. The ACS will be a mail survey of about 3 million households annually, whose results will be cumulated over 5 years to produce estimates that will replace information previously provided by the Decennial Census long form. In addition, annual data will be published for geographic areas with 65,000+ populations and as 3-year averages for areas with populations of 20,000 to 65,000. Annual data will be published beginning in 2006 with data for 2005. The 5-year averages for 2008–12 will provide data for small geographic areas.

What GAO Found:

The Census Bureau's development of the American Community Survey goes back several decades and has included intensive research and field testing programs, as well as substantial outreach efforts, in particular through the reports and workshops at the National Academy of Sciences (NAS). However, if the ACS is to be an adequate replacement for the Decennial Census long form as the major source of data on small geographic areas and if it is to provide similar annual data for larger areas, the Census Bureau will need to

* incorporate in a timely manner the resolution of issues it has already identified in the ACS testing and 2000 Decennial Census evaluation programs, such as the residence concept, group quarters, and questions on disability;

* complete the ACS testing plan as originally planned, such as the comparison and evaluation of long form–ACS supplementary survey data at the state level, to identify other unresolved issues and to provide information for users of 2000 Decennial Census long-form data that will be necessary for the transition to the full ACS;

* evaluate and consult with stakeholders and users on the resolution of issues identified in this report, such as the methodology for deriving population and housing controls, guidance for users on the impact of the characteristics of multiyear averages for small geographic areas, and the presentation of dollar-denominated values;

* coordinate the results of the testing program for the 2010 Decennial Census short form with the ACS implementation schedule; and

* resolve all issues so that the ACS estimates beginning with 2008 are consistent with the ACS estimates for 2009–12 and with the 2010 Census short form.

Although the Census Bureau has solicited advice from external stakeholders and users and has supported research by its own staff on most of the issues identified in this report, there is no indication that the Census Bureau has yet followed this advice or implemented plans for consultation on resolving these issues. In addition, it has been more than a year since the Census Bureau announced that it was looking into establishing an ACS partnership program that would involve advisory groups and expert panels to improve the program, but no such program has been established.

Another issue related to the proposed ACS is how the Census Bureau might provide more timely and reliable small geographic area data. This goal could be accomplished, but it would require additional funding. The most direct approach would be to increase the sample size for 2009–11. This increase would enable the Bureau to provide small geographic area data that would be the replacement for the 2010 Census long form 1 year earlier.

What GAO Recommends:

The Secretary of Commerce should direct the Census Bureau to revise the ACS evaluation and testing plan, focusing on issues GAO identifies; give stakeholders meaningful input on related decisions; and make the underlying information public. The Secretary should direct the Bureau to set a schedule for incorporating operational and programmatic changes into the 5-year averages for 2008–12.

In commenting on a draft of this report, the Secretary stated that Commerce has already addressed most of the key issues we identified in this report. We believe, however, that the matters are not being fully addressed and need further attention by Commerce.
[End of section]

Contents:

Letter:

Results in Brief:
Background:
Outstanding Issues Jeopardize ACS's Replacement of the Long Form:
Improving Timeliness and Quality of Small Geographic Area Data Would Increase Costs:
Resolving Outstanding Issues Needs a Time Schedule:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:

Appendix I: Scope and Methodology:

Appendix II: Recent NAS Findings on Continuous Measurement and the ACS:
1998 NAS Workshop on the ACS:
2000 Interim Report:
2001 Letter Report:
2003 Interim Report:
The 2000 Census: Counting under Adversity:

Appendix III: The Decennial Census Long Form and the Evolution of the ACS Plan:
Decennial Census Long Form:
Evolution of ACS Plan:

Appendix IV: Continuous Measurement ACS Testing and Development Program:

Appendix V: Current Status of Unresolved Issues:
Independent Controls for Population and Housing Characteristics:
Operational Issues:
Valuation and Presentation of Dollar-Denominated Data Items:
Evaluations of ACS, Long-Form, and CPS Data Comparisons:
Information on Key Properties of Multiyear Averages:
External Consultation:

Appendix VI: Comments from the Department of Commerce:

Appendix VII: GAO Contacts and Staff Acknowledgments:
GAO Contacts:
Staff Acknowledgments:

Bibliography:
History of the Long Form and Mid-Decade Census:
Census Bureau ACS Reports:
Census Bureau Advisory Committee Presentations:
Census Bureau Continuous Measurement Series:
Census Bureau Internal Reports:
Census 2000 Evaluation Reports:
Census Bureau 2003 JSM Staff Papers:
Other Census Bureau Staff Research Papers:
Association of Public Data Users Papers:
Congressional Hearings and Testimony:
Other Reports and Papers:
Related GAO Products:

Tables:

Table 1: ACS Milestone Events and Unresolved Issues, 2004-13:
Table 2: Continuous Measurement and ACS Funding, Fiscal Years 1995-2005:
Table 3: The 2000 Census Long Form and ACS Use of Independent Controls for Population and Housing Characteristics:
Table 4: Population Comparison for Counties in 2000 from ICPE and 2000 Census by County Size:

Abbreviations:

ACS: American Community Survey:
ASA: American Statistical Association:
BEA: Bureau of Economic Analysis:
BLS: Bureau of Labor Statistics:
C2SS: Census 2000 Supplementary Survey:
CM: Continuous Measurement:
CPI: Consumer Price Index:
CPS: Current Population Survey:
HUD: Department of Housing and Urban Development:
ICPE: Intercensal Population Estimates:
NAS: National Academy of Sciences:
OMB: Office of Management and Budget:

United States Government Accountability Office:
Washington, DC 20548:

October 8, 2004:

The Honorable Tom Davis, Chairman:
The Honorable Henry Waxman, Ranking Minority Member:
Committee on Government Reform:
House of Representatives:

The Honorable William Lacy Clay, Ranking Minority Member:
Subcommittee on Technology, Information Policy, Intergovernmental Relations, and the Census:
Committee on Government Reform:
House of Representatives:

The Honorable Adam Putnam:
House of Representatives:

The Census Bureau has designed the 2010 Decennial Census around three new operations. One will replace the Census long-form questionnaire with the American Community Survey (ACS).[Footnote 1] Testing the ACS began in 1996 and full implementation will begin in 2005 and continue as long as the program receives annual funding.
A separate long-form questionnaire has been mailed to a sample of households once a decade to collect detailed information on demographic, housing, social, and economic characteristics since the 1960 Decennial Census. This information has been the main source of information for small geographic areas, including tracts and block groups; it has been used extensively by federal agencies for program implementation and by state and local governments for programmatic and planning purposes. In the 2000 Decennial Census, the long form was mailed to a sample of about 19 million housing units. The ACS will contain the same questions as the long form but will be mailed monthly to an annual sample of 3 million housing units. With the smaller sample, the ACS is designed to provide the same information at the same level of geographic detail as the long form by means of a continuous measurement methodology in which survey responses will be accumulated over time. The Census Bureau has determined that in order to produce reliable estimates at the same geographic level of detail as the long form, ACS results will be cumulated over 5 years. It also has determined that the ACS will provide reliable estimates for geographic areas with populations of 20,000 to 65,000 by cumulating ACS responses over 3 years and for geographic areas with populations of more than 65,000 by cumulating ACS results for 1 year but that these estimates will be less reliable than the corresponding long-form estimates. According to the plan the Congress approved, the first annual ACS data for geographic areas with populations larger than 65,000 will be published beginning in 2006 with data for 2005; 3-year averages for geographic areas with populations between 20,000 and 65,000 will begin in 2008; and 5-year averages for geographic areas with populations smaller than 20,000, including tracts and block groups, will begin in 2010. The 5-year averages for 2008-12 to be published in 2013 will replace the 2010 Decennial Census long form for small geographic areas, as they will be centered on 2010 and closely reflect the population and housing characteristics data from the 2010 Decennial Census short form. In replacing the long form, the ACS will provide the same long-form data items at the same level of geographic area detail but in a more timely way. Whereas the long form provided small geographic detail once a decade, the ACS will provide annual estimates for large geographic areas and estimates for smaller areas in terms of 3-year or 5-year averages. You asked us to examine issues about replacing the long form with the ACS related to the reliability of data for small geographic areas. As agreed with your offices, our objectives for this report were to (1) review the Census Bureau's testing program on operational and programmatic aspects that will affect the reliability of small geographic area data and (2) determine whether alternatives to the proposed ACS would provide more frequent and more reliable data for small geographic areas. To address these topics, we reviewed ACS-related Census Bureau documents, congressional testimony, National Academy of Sciences (NAS) reports, and consultants' reports prepared for the Census Bureau, Bureau of Labor Statistics (BLS), and Department of Housing and Urban Development (HUD). We also interviewed small-area data experts on the latest NAS report on the ACS and reviewed the Census Bureau's responses to recommendations on the ACS in our earlier reports. 
We conducted our work between April 2003 and August 2004 in accordance with generally accepted government auditing standards. We describe our scope and methodology in more detail in appendix I.

Results in Brief:

If the ACS is to be an adequate replacement for the Decennial Census long form as the major source of data on small geographic areas and if it is to provide similar annual data for larger areas, we believe that the Census Bureau will need to (1) incorporate in a timely manner the resolution of issues it has already identified in testing the ACS, (2) complete the ACS evaluation and testing plan to identify other issues and provide information for users that will be necessary for the transition to the full ACS, (3) evaluate issues identified in this report and consult with stakeholders and users on their resolution, (4) coordinate the results of the testing program for the 2010 Decennial Census short form with the ACS implementation schedule, and (5) resolve all issues so that the ACS estimates beginning with 2008 are consistent with the ACS estimates for 2009-12 and with the 2010 Census short form.

Key Unresolved Issues:

Unresolved issues that might affect the reliability of ACS small geographic area data include (1) the introduction of a new concept of residence, (2) the uncertainty about the new methodology for deriving independent controls for population and housing characteristics, (3) the lack of guidance for users from the Census Bureau on the characteristics of multiyear averages for small geographic areas, and (4) operational procedures, such as questionnaire design, the adjustment of dollar-denominated values, and the consistency between ACS and 2000 Census long-form data.

The Census Bureau has announced that it will adopt a concept of "current residence" for determining the geographic location of seasonal residents for the full ACS. The concept will differ from "usual residence," used for decennial censuses and the ACS testing programs. Under the usual residence concept, people who spend their winter in Florida and the rest of the year in New Hampshire, for example, are recorded as residents of New Hampshire; college students living away from home in dormitories are recorded as residents of the college. Under the current residence concept, people have only one residence at any point in time, but their place of residence does not have to be the same throughout the year. Although the Census Bureau plans to change this concept for the ACS, it has reported that sufficient research has not been conducted to make a final set of rules for determining current residence. In addition, it found problems with the residence questions used in 2000 but does not plan to incorporate improved questions until 2010.

To determine independent controls for population and housing characteristics, which will be used to adjust ACS sample results, the ACS will use the characteristics derived from decennial censuses for the census year and, for other years, from the Census Bureau's Intercensal Population Estimates (ICPE) program. The Census Bureau has not developed a methodology for using ICPE for the full ACS to derive controls consistent with the ACS residence concept and ACS reference period or at the same level of geography used for the 2000 Census long form. Before data for 2005 on places with populations of 65,000 or more can be released in 2006, a methodology is needed to provide controls that reflect changes in the residence concept and reference period.
A methodology for controls for places with populations of more than 20,000 that incorporates ICPE revisions is needed before the first multiyear averages are released in 2008. In addition, if the averages for 2008-12 are to replace the 2010 Census long form, the methodology for incorporating 2010 Census data and the related revisions to ICPE data will be needed in 2009. User Guidance on Multiyear Averages: ACS data for geographic areas with populations smaller than 65,000 will be presented only in terms of multiyear averages. Because of the statistical properties of these averages and users' unfamiliarity with them, we found that it is critical for the Census Bureau to provide users with guidance on topics such as the reliability of multiyear averages for areas with rapidly changing populations, the reliability of trends calculated from annual changes in multiyear averages, and the use of multiple estimates from the ACS data for geographic areas with populations larger than 20,000. Census Bureau officials told us that they agreed with the need for such guidance but had no plans for its contents. ACS Implementation Schedule: We found that the latest schedule for the 2010 Decennial Census does not provide adequate time for the Census Bureau to incorporate into the full ACS program changes necessary for the ACS data for 2008-12 to be reliable enough to replace the 2010 Census long form. We identified issues that need to be resolved before the 2006 release of the 2005 ACS and other issues that need to be resolved before the release of the first 3-year averages in 2008. The most important issues to be resolved are those that need to be in place by 2008, when the collection of data for calculating the 5-year averages (for 2008-12) that will replace the 2010 Census long form will begin. Prompt resolution of the other issues would improve consistency between the 2005-07 ACS data and the ACS data beginning with 2008. Alternatives to Improve Small Geographic Area Data: Besides the key unresolved issues discussed above, we also identified an alternative to the proposed ACS that would provide more timely and reliable small geographic area data. This alternative would require additional funding to support a larger sample. Under an alternative, patterned after the Census Bureau's initial plan to replace the 2000 Census long form, the sample size for 2009-11 would be increased to 4.8 million housing units and then reduced to 3.0 million housing units for subsequent years. The larger sample would provide small geographic area data that would be the replacement for the 2010 Census long form from 3-year averages (for 2009-11). These averages would be as reliable as the proposed 5-year averages (for 2008-12) and would provide the replacement for the long form data 1 year earlier. The larger sample could also be used permanently after 2011 and would provide continuous 3-year averages for small geographic areas. In written comments on a draft of this report, the Secretary of Commerce addressed three of the four recommendations we addressed to him. Regarding the first recommendation, the Secretary stated that the current ACS testing and evaluation plan already included the issues we have identified in the report. In following up to the Secretary's response, we learned that there is not yet a written plan, but only a rough outline of the types of work planned. Therefore, we believe our recommendation remains valid. 
Regarding the second recommendation, suggesting that the Census Bureau provide key stakeholders more direct and timely input into decisions on these issues, the Secretary stated that he believes that the present consultation process is adequate. We disagree, because as noted in appendix II of our report, the Census Bureau has not been responsive to recommendations from several National Academy of Sciences reports relating to the ACS. The Secretary agreed with our third recommendation that the Census Bureau provide public documentation for key decisions on issues we have identified in this report. The Secretary did not respond directly to our recommendation that he direct the Census Bureau to prepare a schedule for the 2010 Census that ensures that all necessary changes are made in time for the 2008 ACS so the 5-year ACS averages for 2008-2012 will be an adequate replacement for the 2010 long form for small geographic areas. The comments from the Secretary also include a list of detailed technical comments from the Census Bureau. We reviewed each of these comments and revised the report where appropriate. Background: Now that the Census Bureau has congressional approval to begin the full ACS, data collection will begin in November 2004. The ACS test survey of a sample of 800,000 housing units, which has been conducted since 2000, will end in December 2004. The Bureau has been using this survey, known as the ACS Supplementary Survey, to test procedures and to produce annual data for geographic areas with populations of 250,000 or more. As one part of the test program, the supplementary survey data for 2000 have been compared with corresponding data from the 2000 Census long form to evaluate the quality of the ACS data and to provide users with information to make the transition from the long-form data to the full ACS data. According to the plan the Congress approved, the first annual ACS data for geographic areas with populations larger than 65,000 will be published beginning in 2006 with data for 2005; 3-year averages for geographic areas with populations between 20,000 and 65,000 will begin in 2008; and 5-year averages for geographic areas with populations smaller than 20,000, including Census tracts and block groups, will begin in 2010. The 5-year averages for 2008-12 will replace the 2010 Decennial Census long form for small geographic areas; they will be published in 2013 and will incorporate population and housing characteristics data from the 2010 Decennial Census short form. In replacing the long form, the ACS will provide the same long-form data items at the same level of geographic area detail but in a more timely way. Whereas the long form provided small geographic detail once a decade, the ACS will provide annual estimates for large geographic areas and estimates for smaller areas in terms of 3-year or 5-year averages; the 5-year averages will provide data at the same geographic area level as the long form. According to the Census Bureau, these 5- year averages will be about as accurate as the long-form data; the annual and 3-year averages will be significantly less reliable than the long-form data but more reliable than existing annual household surveys the Census Bureau conducts.[Footnote 2] In the remainder of the Background section of this report, we briefly describe the major differences between the ACS and the Decennial Census long form. We also discuss the Census Bureau's outreach program, designed to involve stakeholders and users in shaping the ACS. 
Appendix III provides additional background information on the evolution of the ACS plan, appendix IV on the ACS testing and measurement program. Appendix II describes recent NAS findings on Continuous Measurement (CM) and the ACS.

Multiyear Averages:

The 2000 Census long form used a decennial sample of about 19 million housing units; the full ACS will use an annual sample of 3 million housing units. In order to provide reliable estimates for geographic areas with populations of 65,000 or less, monthly ACS responses will be cumulated over several years--3 years for places with populations of 20,000 to 65,000 and 5 years for places with populations smaller than 20,000. Because of the statistical properties of these averages and users' unfamiliarity with them, the Census Bureau has long recognized the need to provide guidance on such topics as the reliability of the averages for areas with rapidly changing population and the use of multiple estimates for states and other, larger geographic areas.

The Concept of Residence:

For the 2000 Decennial Census, the ACS test programs, and federal household surveys, including the Current Population Survey (CPS), seasonal residents are recorded in a geographic area according to a concept of usual residence. As we noted above, under this concept, people who spend their winter in Florida and the rest of the year in New Hampshire, for example, are recorded as residents of New Hampshire; college students living away from home in dormitories are recorded as residents of the college. For the full ACS, the Census Bureau has announced its decision to change the concept to current residence. According to the Census Bureau, although each concept requires that a person have only one residence at any point in time, current residence recognizes that the place of residence does not have to be the same throughout a year, allowing ACS data to more closely reflect the actual characteristics of each area. The Census Bureau plans to use current residence because the ACS is conducted every month and produces annual averages rather than point-in-time estimates, unlike the Decennial Census. Current residence is uniquely suited to the ACS, because it continuously collects information from independent monthly samples throughout all months of all years. Because the ACS is designed to produce a continuous measure of the characteristics of states, counties, and other places every year, the new residence rules were needed for seasonal and migratory individuals.

Reference Period:

The underlying population and housing characteristics data for the 2000 Census long form were for April 1, 2000. For the ACS test program, the underlying population and housing characteristics varied. For all years except 2000, they were for July 1; for 2000, they were for April 1. For the full ACS, because the data are collected monthly, the reference period will be the average for the year, and the Census Bureau will assume this average is equivalent to data for July 1.

Independent Controls for Population and Housing Characteristics:

The ACS will use population characteristics (age, sex, race, and ethnicity) and housing characteristics (occupied and vacant units) derived from an independent source and not from the results collected in the survey. Using independent controls for these characteristics is standard practice to correct sample survey results for the effects of nonresponse and undercoverage.
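To illustrate the mechanics of such controls, the following Python sketch shows a simple ratio (post-stratification) adjustment in which sample weights in each control cell are scaled so that weighted totals match the independent population totals. This is an illustration only, with invented cell names and figures; it is not the Census Bureau's actual weighting procedure, which is not described here.

    # Simplified ratio (post-stratification) adjustment: scale sample weights
    # so that weighted totals in each control cell match independent population
    # controls. Cell names and figures are illustrative only.

    def ratio_adjust(records, controls):
        """records: list of (cell, weight) pairs from the sample.
        controls: dict mapping each cell to its independent population total.
        Returns records with weights scaled so each cell's weighted total
        equals its control, offsetting nonresponse and undercoverage."""
        totals = {}
        for cell, weight in records:
            totals[cell] = totals.get(cell, 0.0) + weight
        factors = {cell: controls[cell] / totals[cell] for cell in totals}
        return [(cell, weight * factors[cell]) for cell, weight in records]

    # Example: the sample underrepresents one group, so its weights are inflated.
    sample = [("male 18-64", 95.0), ("male 18-64", 105.0), ("female 18-64", 210.0)]
    independent_controls = {"male 18-64": 240.0, "female 18-64": 220.0}
    print(ratio_adjust(sample, independent_controls))

Because the adjusted estimates inherit whatever the controls say about each area, the quality and geographic detail of the independent controls bear directly on the reliability of small geographic area estimates.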
Population and housing characteristics from the 2000 Census short form were used as independent controls for the 2000 Census long form, down to the tract level. For the ACS supplementary surveys, independent controls were from ICPE, which uses Decennial Census short-form data as benchmarks and administrative record data to interpolate between and extrapolate from the census benchmarks. ICPE develops and disseminates annual estimates of the total population and the distribution by age, sex, race, and Hispanic origin for the nation, states, counties, and functioning government units. ICPE provides annual estimates of population and housing characteristics at the county level, and for some subcounty levels, as of July 1, using the usual residence concept for seasonal residents.

Dollar-Denominated Data Items:

According to current Census Bureau plans, annual estimates of dollar-denominated data items, such as income, rent, and housing-related expenses, will be presented after adjustment for inflation in order to facilitate comparisons over time. As in the ACS test programs, only annual estimates with this adjustment will be presented. The Census Bureau also has decided to continue to adjust annual data collected each month in the ACS to a calendar year basis. It will be using the Consumer Price Index (CPI) for the annual and monthly adjustments for all geographic areas.[Footnote 3]

Operational Differences:

The long form and ACS will also differ in how operations are conducted, such as nonresponse follow-up and data capture. For the 2000 Census long form, nonresponse follow-up was conducted for all nonrespondents. For the ACS supplementary surveys and for the full ACS, nonresponse follow-up will be conducted for a sample of one-third of all nonrespondents. For the 2000 Census long form, all data items were entered using an automated optical character recognition procedure; data from the ACS will be manually keyed.

Group Quarters:

The ACS supplementary surveys excluded persons living in group quarters. Group quarters--which include nursing homes, prisons, college dormitories, military barracks, institutions for juveniles, and emergency and transitional shelters for the homeless--accounted for roughly 2.8 percent of the population in 2000. The Census Bureau decided not to cover these persons in the supplementary surveys, to avoid duplication with the 2000 Census, and because it lacked funding to cover them in subsequent years. Procedures for including in the ACS persons living in group quarters beginning with 2005 are discussed in the Census Bureau's ACS Operations Plan, issued in March 2003.[Footnote 4] In addition, it has announced that it intends to continue testing procedures to improve the mailing list for group quarters to be used for the 2010 Decennial Census.

Outreach:

The Census Bureau has long recognized the need to seek input from stakeholders and users in making decisions for all its programs. The Census Bureau sponsors technical reports that NAS prepares. (In appendix II, we summarize recent NAS reports on the ACS and related decennial censuses.) The Census Bureau has also held conferences on the ACS and has contracted with Westat Inc. to organize two conferences of experts on specific aspects of the ACS. Additionally, the Census Advisory Committees, which are Census Bureau-appointed advisory committees whose members represent professional associations such as the American Statistical Association (ASA) and the American Marketing Association, meet twice a year.
The Census Bureau and other federal statistical agencies also participate in the quarterly meetings of the Council of Professional Associations on Federal Statistics, whose members include professional associations, businesses, research institutes, and others interested in federal statistics.[Footnote 5] To obtain input from other federal agencies, the Office of Management and Budget (OMB) established an interagency advisory committee for the ACS in 2000. The committee's major purpose was to coordinate the review of questions to be included in the ACS. Because of the committee's limited focus, the Census Bureau established the ACS Federal Agency Information Program in 2003, responding to a recommendation we made.[Footnote 6] This program is designed to assist each federal agency that has a current or potential use for ACS data to achieve a smooth transition to using the ACS.

Outstanding Issues Jeopardize ACS's Replacement of the Long Form:

From its beginnings in the mid-1990s, the Census Bureau's development plan for the ACS was designed to ensure that the ACS would satisfactorily replace the Decennial Census long form as the major source of small geographic area data. In our review of the plan, we found that the Census Bureau, as well as key ACS stakeholders, had for many years identified the key issues that needed to be resolved if the ACS were to reach this goal. We have identified the following unresolved issues from our research (described in appendix I):

* the methodology to be used for deriving independent controls for population and housing characteristics with ACS definitions of place of residence and reference date,

* improvements needed to operational procedures,

* methods for valuation and presentation of dollar-denominated data items,

* comprehensive analysis of the comparability between new ACS data and corresponding data from the 2000 Census long-form and 2004 supplementary survey, and

* the provision of user guidance on multiyear averages.

Despite the Census Bureau's early identification of issues critical to the successful replacement of the 2010 Decennial Census long form as the new source of small geographic area data, we found that its plans to resolve these issues have been only partially completed. Furthermore, we found that despite recent changes to the ACS implementation schedule, it is not fully synchronized with the Census Bureau's time schedule for implementing the testing program for the 2010 Decennial Census. Consequently, if these issues are not resolved in a timely manner, the Census Bureau's plan to replace the 2010 Decennial Census long form with the 2008-12 ACS averages for detailed geographic areas will be jeopardized.

A Methodology for Independent Controls for Population and Housing Characteristics Is Lacking:

It is standard practice to use independent controls for population and housing characteristics to correct the results of sample surveys for the effects of nonresponse and undercoverage. For the 2000 Census long form, characteristics from the 2000 Census short form were used as independent controls down to the tract level. For the annual ACS supplementary surveys, characteristics from ICPE were used as the independent controls.[Footnote 7] Independent controls for the full ACS will require a new methodology. Short-form data are available only once every 10 years, and the annual ICPE estimates do not provide data for the detailed geographic areas needed to prepare long-form detail and do not use the ACS residence concept or reference period.
The new methodology is critical to the reliability of the ACS estimates of small geographic areas that ICPE does not provide and of areas that have large numbers of seasonal residents. Census Bureau staff have long recognized the need for the new methodology. For example, a 1995 paper by Love, Dalzell, and Alexander expressed concern about population controls and residence rules as well as the need for consultation with users on these topics.[Footnote 8] They reported that the Census Bureau was planning to conduct research using data from the 1996 test sites to produce controls at the census tract and block group levels. They also noted that the Census Bureau would need to conduct research on the residence rule. A 2000 paper by Alexander and Wetrogan also discussed the issue of population controls.[Footnote 9] They reviewed possible methods for using ICPE to develop controls for the ACS and noted the need to consult with users on how to present information on the differences in ACS controls and ICPE in ACS publications. Key stakeholders, including experts on the ACS we interviewed in August 2003 (listed in appendix I), expressed similar concerns about the methodology. It appears that no progress had been made on a new methodology until the Census Bureau reported in October 2003 to its advisory committees on the status of a new methodology to derive controls. It announced that when full ACS collection starts in November 2004, (1) interim procedures would be used and (2) a final methodology would not be determined until after the necessary research was completed. The Census Bureau did not provide a date when the methodology would be incorporated. In our review of Census Bureau presentations about the new methodology (described in detail in appendix V), we found that it had no plans to maintain time-series consistency of the population and housing controls by routinely incorporating the regular revisions to ICPE estimates into the ACS. Without such revisions, there could be a significant lack of comparability in the ACS data being averaged, and the reliability of multiyear estimates would be reduced. For example, without such revisions, the 2008-12 averages that are to replace the 2010 Decennial Census long form would be based on controls extrapolated from the 2000 Census for 2008-09 and controls from the 2010 Census for 2010-12. In addition, time-series consistency in the annual ACS data would be reduced, especially in the data for 2010 and previous years. Census Bureau officials told us that they were not planning any such revisions, unless the inconsistencies between 2010 ICPE and 2010 Census characteristics were significant, even though there were significant inconsistencies between the 2000 ICPE estimates and the 2000 Census data, especially for small geographic areas. We found that regularly incorporating all revisions to ICPE into the ACS would improve ACS reliability and that planning would give users advance notice on the Census Bureau's revision practice. The need for such planning is critical, as evidenced by the failure that occurred in January 2004, when a revised set of ICPE data was incorporated into the calculation of monthly CPS data on employment. Initially, the revised employment estimates were released without a revision of the pre-2004 data, resulting in a significant discontinuity between December 2003 and January 2004. 
As a result of users' dissatisfaction about the discontinuity, a consistent set of employment estimates was released.[Footnote 10] Finally, failure to adequately involve stakeholders in the decision process may contribute to significant misunderstanding about the use of the ACS estimates and corresponding estimates from the Decennial Census. In past decennial censuses, except for the very smallest geographic areas, the population and housing characteristics data published as part of the long-form detail were the same as the official data based on data collected on the short form. Because of differences in the residence and reference period concepts and the use of multiyear averages for small geographic areas, there will be less consistency between the ACS averages for 2008-12 and the 2010 Census data. Operational Issues Have Not Been Addressed: The Census Bureau has identified operational issues with the ACS test programs, primarily from its evaluation studies on the 2000 Decennial Census and Census Bureau staff research papers on comparisons between data collected in the ACS 2000 Supplementary Survey and the 2000 Decennial Census long form. These issues (described in detail in appendix V) include problems with questionnaire design, nonresponse followup, and data capture, as well as coverage of persons living in group quarters. For example, the Census Bureau conducted a study to evaluate the design of the ACS questions that are needed to implement the residence concept and reference period for the ACS.[Footnote 11] The study suggested that additional testing was needed for the questions about multiple residences and noted "that asking these questions on a person basis may produce different and probably better data than asking them on a household basis."[Footnote 12] Similarly, the authors found potential problems with the identification of seasonal residents. We were not able to identify in the Census Bureau's plans whether these issues would be addressed before implementation of the full ACS. We also found, for the implementation of the full ACS for 2005, that the Census Bureau had addressed only the inclusion of group quarters and that it may not resolve the issue of questionnaire design until 2010. In addition, even for group quarters, it is planning for improvements that may not be included until 2010. Furthermore, not all problems have been identified because of the delays in the Census Bureau's completing the evaluation studies of comparisons of long-form and ACS data items. Moreover, the Census Bureau's plans do not provide for external consultations on key decisions about resolving issues. Although the Census Bureau has acknowledged the importance of the timing of incorporating changes to resolve the various issues, any delay in implementing solutions to 2010 would not meet the needs of the ACS collection and production schedule. For example, in its March 2003 ACS operations plan, the Census Bureau recognized the need for maintaining questionnaire continuity to calculate consistent multiyear averages. It also has reported that it needs to incorporate changes in the ACS questionnaire no later than 2008 because changes introduced after 2008 and before 2013 would create inconsistencies in calculating the 5-year averages that are to replace the 2010 Decennial Census long form. Nevertheless, we found that the Census Bureau's current time schedule does not call for resolving issues such as questionnaire design before 2008. 
Incorporating changes into the ACS beginning with 2008 will help maintain the reliability of the 5-year averages for small geographic areas; failing to incorporate them beginning with 2005 will reduce the reliability of the annual changes in the ACS data. With regard to external consultation, we found that the Census Bureau's plans do not include time for consulting with stakeholders and users, despite NAS, BLS, and Census Advisory Committee suggestions and recommendations. For example, in a February 15, 2001, report to the Census Bureau, the NAS Panel on Research on Future Census Methods recommended that it conduct evaluation studies on "the effectiveness of operations used to designate special places and enumerate the group quarters and homeless populations."[Footnote 13] Members of the Census Advisory Committee had raised similar concerns. In a 2003 report prepared for BLS, their consultant had made a number of recommendations about the questions on employment. We found that the Census Bureau needs to develop a time schedule so that changes can be introduced to minimize inconsistencies between the 2005 and subsequent ACS data and to ensure that all necessary changes are made so that the ACS data for 2008-12 that will replace the 2010 Decennial Census long form will be collected consistently. In addition, the prompt completion of the ACS--long-form comparison studies and related evaluations will provide sufficient time for the Census Bureau to consult with stakeholders and to provide users with the information they need to understand the effect of making changes to the ACS questionnaires or procedures between 2005 and 2008. Plans for Valuation and Presentation of Dollar-Denominated Data Items Are Questionable: When the Census Bureau began releasing data from the ACS test programs, all dollar-denominated items such as incomes, housing values, rents, and housing-related expenditures were adjusted for inflation. As in the ACS test programs, only annual estimates with this adjustment will be presented, and when the Census Bureau releases ACS data for each new year, it revises all dollar-denominated data for prior years. It makes a similar inflation adjustment for the annual income data collected in the CPS, but it releases the unadjusted estimates.[Footnote 14] The Census Bureau also has decided to continue to adjust annual data collected each month in the ACS to a calendar year basis. It will be using the CPI for the annual and monthly adjustments for all geographic areas. The treatment of dollar-denominated data items is critical to all users of these data. It is particularly critical for federal agencies that will be using the ACS instead of the long form for many government programs to determine the allocation of funds or program eligibility. It is also critical to users of dollar-denominated items for small geographic areas because the inflation adjustments under the current procedure are based on a national average index. In our review of the development and implementation of the ACS, we identified questions on the appropriateness of the methodology for the adjustment and the suppression of the unadjusted annual values. A report prepared for HUD found problems with the calculation of the adjustment and the use of the adjustment for income measures used for HUD programs. The report also noted that the lack of the unadjusted annual data would severely limit HUD's use of calculations appropriate to its program needs. 
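To make the adjustment at issue concrete, the Python sketch below shows one simplified way a reported amount covering the 12 months before an interview could be restated in calendar-year dollars using a national CPI. The index values, reference period, and function here are illustrative assumptions, not the Census Bureau's published method.

    # Illustrative sketch of a CPI-based adjustment of a dollar-denominated
    # item. A respondent interviewed in June reports income for the preceding
    # 12 months; the amount is restated in calendar-year dollars by the ratio
    # of average CPI levels. All index values are made up for illustration.

    def cpi_adjust(amount, reference_period_cpi, calendar_year_cpi):
        avg_calendar = sum(calendar_year_cpi) / len(calendar_year_cpi)
        avg_reference = sum(reference_period_cpi) / len(reference_period_cpi)
        return amount * (avg_calendar / avg_reference)

    reference_cpi = [181.3, 181.8, 182.6, 183.0, 183.5, 183.9,
                     184.3, 184.8, 185.2, 185.5, 186.0, 186.3]  # prior July-June
    calendar_cpi = [184.3, 184.8, 185.2, 185.5, 186.0, 186.3,
                    186.9, 187.2, 187.6, 188.0, 188.4, 188.7]  # survey calendar year

    print(round(cpi_adjust(40000, reference_cpi, calendar_cpi)))

Because the index in this sketch is a single national series, the same factor would apply to amounts reported from a high-inflation metropolitan area and from a rural county alike, which is the geographic limitation discussed below.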
Research by Census Bureau staff questioned the adjustment for incomes when they found that it was a probable source of difference between income data from the supplementary survey and corresponding data from the CPS and the 2000 Census long form.[Footnote 15] (We discuss these findings in detail in appendix V.) Our statisticians reviewed these findings and found a similar problem with the calculation of the adjustment because of the lack of trending adjustment. We found that the Census Bureau could estimate calendar year values using a combination of past trends in related series, information from other ACS respondents, or known information such as changes in cost-of-living adjustments for various transfer payment programs and changes in wage rates. We also found that converting ACS data from monthly to calendar year data is similar to conversion issues faced by other agencies that collect annual statistics compiled on a fiscal-year basis and that the procedures these agencies use could be adapted for the ACS.[Footnote 16] With regard to the use of a national cost-of-living adjustment, we have previously reported that for purposes such as allocating federal funds to states using income and poverty data, the CPI, a national measure of inflation, does not reflect variations in geographic areas.[Footnote 17] Census Bureau staff have reported similar findings.[Footnote 18] The HUD and Census Bureau findings and our review raise serious questions about the inflation adjustments. We found no documentation explaining the rationale for the adjustment for either the ACS or the CPS, where its use is limited to income data. Bureau officials informed us that alternative procedures had not been examined and that stakeholders or users had not been consulted on the adjustment. Evaluations of Comparisons Are Incomplete and Users Lack Information on ACS Time-Series Consistency: We noted above that one of the Census Bureau's major justifications for the ACS test programs has been its comparing data collected in these programs, and corresponding data from the 2000 Decennial Census short and long forms, to identify operational problems. Another major justification for the ACS test programs has been the use of these comparisons, and comparisons with corresponding data from the CPS, to inform users in making the transition from the 2000 long form to the ACS. Census Bureau Director Kenneth Prewitt emphasized the importance of transition needs in testimony to the Congress in 2000 when he reported the following about the ACS test programs: "These data will also contribute to a comparison with data from Census 2000 that is necessary because there are differences in methods and definitions between the census and the ACS. Moreover, decision makers will want to compare an area's data to those from Census 2000. Comparisons using data from the operational test and from the 31 sites are essential to determine how much measured change between Census 2000 and future years of the ACS is real and how much is due to operational differences between the ACS and the census."[Footnote 19] Despite acknowledging the importance of these comparisons, the Census Bureau's publication of evaluations of the comparisons has been delayed, and their scope has been reduced in terms of levels, data items, and time period. 
The lack of information will create problems for ACS users who will be comparing the annual ACS data for 2005 (to be released in mid-2006) with 2000 Decennial Census data or comparing annual ACS supplementary survey data beginning with 2000. In addition to delaying the release of the evaluation studies, the Census Bureau has reduced their scope. For the evaluations of ACS test site data, local experts did not participate in the evaluation of the comparisons for 27 of the 31 test sites. For the 4 test sites that were studied by local experts, they did not cover subcounty local government units. For evaluations of ACS supplementary survey data, the Census Bureau has eliminated the analyses of comparisons between (1) the 2000 supplementary survey and the 2000 long form for geographic areas with populations of 250,000 or more and (2) the supplementary surveys for 2000-02 to corresponding data from the CPS. It has reduced the scope of its evaluation studies by also eliminating comparisons of single-year estimates for most subnational areas and comparisons of data items such as financial characteristics of housing.[Footnote 20] NAS found that the Census Bureau has not placed sufficient priority on completing the necessary evaluation studies.[Footnote 21] Furthermore, we found that the Census Bureau does not have a plan that includes the timely completion of all the studies. Once the studies are complete, it will need to incorporate the findings into ACS operations, consult with stakeholders, and provide users with the information they need to make the transition from the long form to the ACS. The plan will be needed to ensure that as many changes as possible can be introduced before the first annual ACS estimates are published in 2006 and that all necessary changes are implemented before 2008. We found that the delays in completing the evaluations and their reduction in scope are likely to affect the use of the ACS in improving the small geographic area estimates of unemployment and poverty. For example, Labor uses the unemployment data extensively to administer a variety of federal programs. Several other departments use the poverty rates for similar purposes.[Footnote 22] Users Are Not Informed on Key Properties of Multiyear Averages: One of the major differences between the ACS and the long form is that the ACS will provide data for geographic areas with populations smaller than 65,000 in terms of multiyear averages. Experts outside and inside the Census Bureau have identified serious issues regarding the statistical properties of multiyear averages and have recommended that the Census Bureau provide guidance to federal agencies and others on their use. We found that stakeholders have urged the Census Bureau for many years to provide guidance on the strengths and weaknesses of these averages. The most recent request for guidance on using multiyear averages came in the July 2003 report by the NAS Panel on Research on Future Census Methods: "The Census Bureau should issue a user's guide that details the statistical implications of the difference between point-in-time and moving average estimates for various uses."[Footnote 23] In the report's executive summary, the panel also stated that "The Census Bureau must do significant work in informing data users and stakeholders of the features and the problems of working with moving average-based estimates."[Footnote 24] It also expressed particular concern about the use of the multiyear (or moving) averages in fund allocation formulas. 
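The statistical behavior at issue can be seen in a small illustration. The annual values below are invented, and the simple average is only a stand-in for the Bureau's estimation procedure: a 5-year average lags the most recent year in a rapidly changing area, and successive 5-year averages share four of their five years of data, so the difference between them reflects only one-fifth new information.

    # Illustrative properties of 5-year moving averages for a small area whose
    # characteristic is growing rapidly. All values are invented.

    annual = [100, 120, 140, 160, 180, 200, 220]  # seven consecutive annual values

    def moving_average(values, span=5):
        return [sum(values[i:i + span]) / span for i in range(len(values) - span + 1)]

    five_year = moving_average(annual)
    print(five_year)                    # [140.0, 160.0, 180.0]; each lags the latest year
    print(five_year[1] - five_year[0])  # 20.0 = (annual[5] - annual[0]) / 5

Guidance of the kind stakeholders have requested would need to explain such properties to users accustomed to point-in-time long-form estimates.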
Stakeholders have requested guidance on topics such as (1) the reliability of multiyear averages for areas with rapidly changing populations, (2) the reliability of trends calculated from annual changes in multiyear averages, and (3) the selection of ACS data for geographic areas with populations larger than 20,000 for which there will be multiple estimates. The Census Bureau has recognized the need for such guidance but has not announced any information about its contents or when it might be available, even though the guidance is needed well in advance of the release of the first multiyear averages in 2008.

We also found that plans for research to evaluate the statistical properties of multiyear averages are limited. The contracts to evaluate 3-year averages for the ACS test sites cover only averages for 1999-2001, with no comparisons with averages for 2000-02, 2001-03, or 1999-2003. In addition, the evaluation studies discussed earlier lack any time-series dimension, such as comparisons of the supplementary surveys with annual data from the CPS. Thus, it appears that the Census Bureau has missed the opportunity to test (1) distortion and stability in multiyear averages, (2) differences between multiple estimates for the same geographic areas, and (3) the use of annual ACS data for small geographic areas.

Meaningful External Consultation on Key Issues Is Needed:

We found that in recent years, the Census Bureau has used its outreach efforts with stakeholders and users primarily to gain support for the ACS. Although it also has solicited advice from NAS panels, advisory committee members, and experts at workshops and conferences on some of the issues we have identified in this report, there is no indication that the Census Bureau will be following this advice. (For additional information, see appendix V.) Likewise, it has not yet followed similar advice from us, other government agencies, or even its own staff. It has been more than a year since the Census Bureau announced, in March 2003, that it was looking into establishing an ACS partnership program that would involve advisory groups and expert panels to help it improve the program. We found that no such program has been established yet. Given that many key issues remain unresolved and that the Census Bureau has no plans to seek advice on resolving them, key aspects of the ACS will receive little or no input unless the Census Bureau revises its plans.

Improving Timeliness and Quality of Small Geographic Area Data Would Increase Costs:

In 1994, the Congress began to fund testing of the survey to replace the 2000 Decennial Census long form, beginning with the 2000 Census. In reviewing the development of the ACS, we found that the Census Bureau was planning to replace the 2000 long form by starting the ACS program with an annual sample of 4.8 million housing units for 1999, 2000, and 2001 and reducing the sample for subsequent years to 3 million.[Footnote 25] The larger sample would have provided 3-year averages for all small geographic areas for 2000 and would have provided data for the smallest geographic areas of the same quality as the traditional long form. In fiscal year 1998, plans to introduce the ACS to replace the census long form were delayed until after the 2000 Census was completed. When the Census Bureau submitted its plans in 1998 to replace the long form for the 2010 Decennial Census, a similar increase in sample size for 2009-11 was not proposed.
Thus, compared with the plans for 2000, data for small geographic areas for 2010 would be delayed by a year and would be based on 5-year averages. When we reviewed the previous plan and other alternatives to the proposed ACS that would provide more timely and reliable data for small geographic areas, we determined that the only viable alternative to the current plans would be to expand the sample size for 2009-11, as proposed earlier. This expansion would allow the Census Bureau to publish data for geographic areas with populations smaller than 20,000 a year earlier, and it would provide more reliable small-area data than the currently planned 5-year averages. In addition, if the Congress were to provide the additional funds for this alternative, the Census Bureau would have an additional year to resolve the issues we have identified in this report, because it would have until the collection of data for 2009, rather than for 2008, to do so. According to Census Bureau estimates, increasing the sample size for the 3 years would add about $250 million to the estimated $500 million cost of conducting the survey for those 3 years with the smaller sample.

Resolving Outstanding Issues Needs a Time Schedule:

The most recent Census Bureau schedule for implementing the ACS over the complete cycle of the 2010 Decennial Census was prepared in December 2003. Except for the completion of the questionnaire for the 2008 ACS, the milestones do not cover the resolution of issues that the Census Bureau has already identified and issues we identify in this report. (See table 1.) Ideally, all these issues should be resolved before the first annual results of the full ACS sample are released. However, the Census Bureau has already announced that (1) final plans for calculating independent population and housing controls with ACS residence and reference concepts will not be available for several years, (2) the 2004 test plans for the 2010 Decennial Census will cover group quarters and residence rules, (3) reports from the 2004 tests will not be completed until 2005, and (4) the 2006 test plans for 2010 also cover group quarters.

Table 1: ACS Milestone Events and Unresolved Issues, 2004-13:

2004; Fiscal quarter: Q4; ACS milestone event: Expand the ACS sample to 250,000 addresses per month; Unresolved issue: (1) Residence rule to be used; (2) Changes to operational procedures, questionnaire design, etc., based on analyses of differences between earlier ACS estimates and other sources such as 2000 Census or on evaluation of 2000 Census data.

2005; Fiscal quarter: Q1; ACS milestone event: Submit proposed topics for 2008 ACS to Congress; Unresolved issue: (1) Changes based on analyses of differences between earlier ACS estimates and other sources such as 2000 Census or on evaluation of 2000 Census data; (2) Consultation with stakeholders and users.

2005; Fiscal quarter: Q4; ACS milestone event: Publish 2004 ACS single-year results for all states and most areas with population 250,000+; Unresolved issue: (1) Information on degree of stability of year-to-year changes in 2000-04 ACS based on comparisons with corresponding data from CPS and other surveys; (2) Release of dollar-denominated data items without adjustments for inflation and adjustment methodology.

2006; Fiscal quarter: Q1; ACS milestone event: Submit actual questions for 2008 ACS to Congress; Unresolved issue: Changes to questions to reflect results of analysis of differences between ACS test data and 2000 Census long-form data, evaluation of reporting in 2000 Census, and results of 2004 Census test.
2006; Fiscal quarter: Q4; ACS milestone event: Publish 2005 ACS single-year results for all geographic areas and population groups of 65,000+; Unresolved issue: (1) Methodology for calculating independent controls for population characteristics and housing units based on ACS definition of residence and reference period; (2) Source of independent controls for geographic areas not covered by ICPE; (3) Level of geographic detail to be released--for example, counties with population of less than 65,000 or incorporated places other than counties with population of 65,000 or more; (4) Information on consistency between 2004 and 2005 results.

2007; Fiscal quarter: Q1; ACS milestone event: Determine final content for the 2008 ACS; Unresolved issue: (1) Changes to questions to reflect results of analysis of differences between ACS test data and 2000 Census long-form data, evaluation of reporting in 2000 Census, and results of 2004 and 2006 Census tests; (2) Consultation with stakeholders and users.

2007; Fiscal quarter: Q4; ACS milestone event: Publish 2006 ACS single-year results for all geographic areas with population groups of 65,000+; Unresolved issue: (1) See 2006 Q4; (2) Guidance for users on statistical properties of multiyear averages to be released in 2008 and on use of single-year results and multiyear accumulations for same geographic area.

2008; Fiscal quarter: Q1; ACS milestone event: Implement content or methodology changes for 2008 ACS data collection (first year of 5-year ACS accumulation to replace 2010 long form); Unresolved issue: (1) Consultation with stakeholders and users; (2) Final decisions on 2010 Census short form.

2008; Fiscal quarter: Q4; ACS milestone event: Publish 2007 ACS single-year results for all geographic areas with population 65,000+, publish 3-year (2005-07) accumulation for all areas with population of 20,000+; Unresolved issue: (1) See 2006 Q4; (2) Plans and procedures for 3-year ACS accumulation--for example, revision to independent controls for previous years.

2009; Fiscal quarter: Q1; ACS milestone event: Complete 2008 ACS data collection; Unresolved issue: Changes to operational procedures, such as sampling rate for nonresponse followup.

2009; Fiscal quarter: Q4; ACS milestone event: Publish 2008 ACS single-year results for all geographic areas with population 65,000+, publish 3-year (2006-08) accumulation for areas with population 20,000+; Unresolved issue: Updated guidance for users on statistical properties of multiyear averages to be released in 2009 and on use of single-year results and multiyear accumulations for same geographic area.

2010; Fiscal quarter: Q4; ACS milestone event: Publish 2009 ACS single-year results for all geographic areas with population 65,000+, publish 3-year (2007-09) accumulation for areas with population 20,000+, publish 5-year (2005-09) accumulation for all areas; Unresolved issue: (1) See 2008 Q4; (2) Incorporation of revisions to independent controls for 2005-08.

2011; Fiscal quarter: Q4; ACS milestone event: Publish 2010 ACS single-year results for all geographic areas with population 65,000+, publish 3-year (2008-10) accumulation for areas with population 20,000+, publish 5-year (2006-10) accumulation for all areas; Unresolved issue: (1) Incorporate revisions to independent controls beginning with 2005 for benchmarking to 2010 Census; (2) Methodology for April 1 reference date for independent controls for 2010 Census.
2012; Fiscal quarter: Q4; ACS milestone event: Publish 2011 ACS single-year results for all geographic areas with population 65,000+, publish 3-year (2009-11) accumulation for areas with population 20,000+, publish 5-year (2007-11) accumulation for all areas; Unresolved issue: (1) See 2011 Q4; (2) Reconcile differences between ACS and 2010 Census short form.

2013; Fiscal quarter: Q4; ACS milestone event: Publish 2012 ACS single-year results for all geographic areas with population 65,000+, publish 3-year (2010-12) accumulation for areas with population 20,000+, publish 5-year (2008-12) accumulation for all areas; Unresolved issue: (1) See 2010 Q4; (2) Level of geographic detail from the 2010 Census to be used for independent controls similar to that used for the 2000 Decennial Census.

Source: U.S. Census Bureau and GAO analysis.

[End of table]

In addition, the Census Bureau has announced that comparisons of 2000 ACS and 2000 Census long-form data critical to the transition to the full ACS will be limited. Nevertheless, users who need the evaluation of these comparisons to compare data from the 2000 Decennial Census long form with data from the new ACS or from the ACS supplementary surveys would benefit from the early resolution of other issues. For example, resolving issues before the release of the first 3-year averages (2005-07) would improve the consistency between these averages and the subsequent ACS data. Resolving all issues for the 2008 ACS is critical if these data are to be fully consistent with the ACS data for 2009-12 and the 2008-12 averages are to be fully consistent with the 2010 Decennial Census short-form data. As we noted above, the Census Bureau's schedule does call for timely completion of the 2008 questionnaire. However, if questions to be included in the 2010 Census short form are changed during the congressional and OMB approval processes, currently scheduled for 2008 and 2009, data collected on the 2010 Census short form will be inconsistent with the ACS data.

Conclusions:

The Census Bureau's development of the ACS goes back several decades and has included intensive research and field testing programs, as well as substantial outreach efforts, in particular through the reports and workshops at NAS. However, its current plan to begin full implementation of the ACS for 2005 has several critical deficiencies. The Census Bureau has not completed its testing program, and it has not acted to resolve key issues already identified by the ACS test program, by evaluation studies of the 2000 Decennial Census, by Census Bureau research studies, and by stakeholders and users, including us, NAS, and other federal agencies. Furthermore, the ACS implementation plan and the 2010 Decennial Census test programs are not synchronized, and there is no comprehensive program for external consultation on the resolution of these issues. Without prompt resolution of issues such as those relating to the calculation of independent controls for small geographic areas and the consistency of data used to calculate multiyear averages, the ACS will not be an adequate replacement for the long form in the 2010 Decennial Census.
If the Census Bureau is not able to use the ACS to replace the long form, the Congress and other stakeholders need to be advised in 2005 to allow the Census Bureau time to reinstate the long form for the 2010 Census.[Footnote 26]

Recommendations for Executive Action:

To ensure that the ACS is an adequate replacement for the Decennial Census long form, we recommend that the Secretary of Commerce direct the Census Bureau to (1) revise the ACS evaluation and testing plan to focus on the issues we have identified in this report; (2) provide key stakeholders, such as the National Academy of Sciences, with meaningful and timely input on decisions relating to these issues; and (3) make public the information underlying the Census Bureau's decisions on these issues when it makes the decisions. We also recommend that the Secretary direct the Census Bureau to prepare a time schedule for the 2010 Decennial Census that provides for resolving these issues by incorporating all operational and programmatic changes into the 2008 ACS so that the 5-year averages for 2008-12 will adequately replace the 2010 Decennial Census long-form data for small geographic areas. These revisions should be reflected in the single, comprehensive project plan for the 2010 Census, as we have previously recommended.

Agency Comments and Our Evaluation:

In written comments on a draft of this report, the Secretary of Commerce provided comments on our recommendations. (The Secretary's comments are reprinted in appendix VI.) He disagreed with our recommendation that the ACS evaluation and testing plan needed to be revised to focus on issues we have identified in this report, stating that the current ACS testing and evaluation plan already included these issues. In following up on the Secretary's response, we learned that there is not yet a written plan, but only a rough outline of the types of work planned. Therefore, we believe our recommendation remains valid. The Secretary did not accept our recommendation to provide key stakeholders more direct and timely input into decisions on these issues because he believes that the present consultation process is adequate. We disagree because, as noted in appendix II of our report, the Census Bureau has not been responsive to recommendations from several National Academy of Sciences reports relating to the ACS. The Secretary agreed with the recommendation that the Census Bureau provide public documentation for key decisions on issues we have identified in this report. The Secretary did not respond directly to our recommendation that he direct the Census Bureau to prepare a schedule for the 2010 Census that ensures that all necessary changes are made in time for the 2008 ACS so that the 5-year ACS averages for 2008-2012 will be an adequate replacement for the 2010 long form for small geographic areas.

The Secretary provided comments on the five major outstanding issues that, in our view, jeopardize the ACS as a replacement for the long form: lack of methodology for independent controls, operational issues not addressed, questionable plans for dollar-denominated items, incomplete evaluations and lack of information on ACS time-series consistency, and lack of information about multiyear averages. The Secretary disagreed with our findings about the lack of a methodology for independent population and housing controls. He stated that a methodology for the ACS was already in place.
On the issue that changes to that methodology are needed to account for the difference in the ACS residence concept, the Secretary agreed that a change was needed but stated that it could be delayed for several more years. On the issue of independent controls for subcounty areas, he stated that the Census Bureau had no plans to develop such controls, which we found were used for the 2000 Census long form, but that it might develop such controls using data from the ACS or administrative records. However, he did not respond to our findings about the use of existing subcounty area data from the ICPE or from the 2010 Census short form. The Secretary stated that the Census Bureau also had no plans to revise the ICPE. On the issue of the ACS reference period, the Secretary reported that the Census Bureau had recently decided that July 1 would be used as the reference period. The Secretary did not comment on our findings about the lack of plans to incorporate 2010 Census data, and related revisions to the ICPE estimates for previous years, into the ACS.

We disagree with the Secretary's comments about the independent subcounty population and housing controls and believe that their use is needed if the ACS is to be an adequate replacement for the 2010 Census long form for small geographic areas. We found that independent controls from the 2000 Census short form were used for detailed geographic areas for the 2000 Census long form and that differences in counts of population and housing (occupied and vacant) between the long form and the short form were limited to the smallest geographic areas. The similar use of 2010 Census short-form counts in the ACS also would minimize differences in these counts between the ACS and the 2010 Census. Consequently, we disagree with the Bureau's decision not to commit to the development of subcounty controls and with its plans not to base these controls on the ICPE total population and housing estimates, which are prepared annually for all general government units, and on the more detailed and reliable data from the 2010 Census short form.

We also disagree with the Secretary that the implementation of a new methodology for independent controls with subcounty controls and the new residence concept can wait until 2008. As we noted in our report, we found that controls for subcounty areas with populations of more than 65,000 will be needed before the 2005 ACS estimates are released for these areas in 2006 and that controls for subcounty areas with populations of more than 20,000 will be needed before the first multiyear averages are released in 2008. (For the 2000 Census long form, controls for most areas of this size were from the 2000 Census short form.) With regard to the new residence concept, a decision to delay introducing a new methodology until 2008 would create time-series inconsistencies between the estimates for 2000-2007 and those for 2008 and subsequent years. These inconsistencies could be very significant for geographic areas with a large population of seasonal residents.

The Secretary did not comment on our findings about the need for a methodology for incorporating revisions relating to the ICPE into the ACS. We found that this methodology, which is important both to the time-series consistency of the annual ACS estimates and to the multiyear averages, is not covered by the current ACS methodology, but that it will be needed when the 2010 Decennial Census short-form data become available.
We found that it has been the Census Bureau's practice for the ICPE, whose estimates are used as the independent controls for the ACS, to be benchmarked to the decennial census short-form data and that it uses similar practices for many other Census Bureau programs. For the ICPE, the Bureau will replace the 2010 ICPE estimates with the 2010 Census data, and use the differences in these estimates to revise the ICPE estimates back to the previous benchmark year, which for 2010 will be 2001. (Table 4 of our report shows the impact of benchmarking on county population data for 2000.) It should be noted that we found that this practice is not followed in all Census Bureau programs. For example, for the Annual Economic and Social Supplement to the CPS, the Census Bureau introduced the benchmark information from the 2000 Decennial Census into the 2001 estimates and presented the data on both the old and the revised basis. This approach, to present estimates on an old and new basis for a single year, may be appropriate for an annual survey. However, GAO found that because of the use of multiyear averages in the ACS, it is imperative that the ACS estimates for all years beginning with 2001 be revised. Without such a revision program, ACS estimates for 2010, which we assume will not be released until the 2010 Census short-form data have been incorporated, will be inconsistent with the 2009 estimates. In addition, the ACS estimates for 2008 and 2009 used to calculate the 5-year averages that will replace the 2010 Census long form will be based on controls that are inconsistent with those for 2010-12. Based on the revisions for 2000 shown in our report, there could be many significant inconsistencies, especially for small geographic areas. Although the Secretary did not comment on the issue of revision, in its technical comments on our draft report, the Census Bureau reported (comment 22) that with regard to incorporating 2010 Census data, it has decided "to make appropriate changes to the [ACS] population controls when necessary, including the possibility of reweighting the data around the 2010 time period and for all multiyear estimates." We disagree with the Census Bureau's approach primarily because it is not consistent with the practices used by the Census Bureau to incorporate census data into surveys and programs such as the ICPE and monthly retail sales that are controlled or benchmarked to a census or similar data set. For these surveys, it revises all previously published data on a predetermined schedule using a transparent statistical procedure. Most important, these procedures do not depend on the size of revisions, which can only be determined after a benchmark is completed. Regardless of the benchmarking procedures adopted for the ACS, we believe that the Census Bureau needs to have extensive consultation with external stakeholders to make its decision. In addition, because of the complexity of most benchmarking procedures, the Census Bureau needs to begin this consultation as soon as possible. With regard to the recent Census Bureau decision about the reference period for the ACS, we are pleased that a decision has been made because any delay in this decision would have resulted in additional time-series inconsistencies in the ACS. We have changed our report to reflect this decision. Unfortunately, we have no documentation on the research underlying the decision and, as has been the case in other key decisions, we do not believe that there was any public discussion of this decision. 
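Returning to the benchmarking and revision issue discussed above, the sketch below revises a hypothetical county's intercensal estimates back to the previous benchmark year once a new census count is available. The function name, the county figures, and the use of a simple linear "wedge" to distribute the closure error are assumptions of ours for illustration only; the Census Bureau's actual revision procedure is not specified in this report and may differ.

```python
def wedge_revision(estimates, census_count, benchmark_year, census_year):
    """Illustrative revision of intercensal estimates back to a benchmark year.

    estimates: dict mapping year -> published intercensal estimate.
    census_count: the new decennial census count for census_year.

    Assumes a simple linear wedge: the closure error (census count minus the
    pre-census estimate) is phased in proportionally over the years since the
    previous benchmark. The Bureau's actual procedure may differ.
    """
    closure_error = census_count - estimates[census_year]
    span = census_year - benchmark_year
    revised = {}
    for year, value in estimates.items():
        if benchmark_year <= year <= census_year:
            revised[year] = value + closure_error * (year - benchmark_year) / span
        else:
            revised[year] = value
    return revised

# Hypothetical county: a 2001 benchmark of 50,000 drifting to a 2010 estimate
# of 56,000, against a 2010 census count of 54,200.
icpe = {year: 50_000 + (year - 2001) * 650 for year in range(2001, 2011)}
icpe[2010] = 56_000
print(wedge_revision(icpe, census_count=54_200,
                     benchmark_year=2001, census_year=2010))
```

Whatever procedure is adopted, the point illustrated here is the one made above: every year back to the previous benchmark is revised, which is why multiyear averages that straddle the benchmark must be recalculated if they are to remain internally consistent.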
The second issue identified in our report related to the operational aspects of the ACS, including questionnaire design and the collection of data for persons living in group quarters. On these issues, the Secretary limited his comments to the questionnaires and addressed our findings that improvements identified as part of the 2000 Census cognitive testing research and research based on comparisons of ACS and 2000 Census long-form data would not be completed until 2008. The Secretary noted that the Census Bureau has resolved the issue of finalizing the ACS questions, including the questions to be asked on the 2010 Census short form, before 2008. Although this recent decision appears to have resolved the scheduling issue, we believe that uncertainties remain as to whether this schedule can be met. For example, the ACS milestones in the latest available schedule call for final approval of the questions by the Congress and by OMB in 2008 and 2009, respectively, so any changes made as a result of these steps could not be incorporated into the 2008 questions. As the Census Bureau has recognized, failure to maintain consistency in the questions for the 2008-2012 ACS will result in inconsistencies in the 5-year averages centered on 2010, which are the averages designed to provide the small geographic area data that would have been collected on the 2010 Census long form. In addition, the recently released ACS evaluation reports identify issues on which new research is necessary, including the issues with the questions on disability identified in our report, but the Census Bureau has not indicated how it plans to complete this additional research or to consult with stakeholders about decisions related to the research. Although the Secretary did not comment on our findings with regard to group quarters, we remain concerned that the work on group quarters being conducted as part of the 2004, 2006, and 2008 tests for the 2010 Census will not be reflected in the ACS beginning with 2008.

Our report also identified as unresolved issues the two inflation adjustments that the Census Bureau is applying to all dollar-denominated ACS items. The first adjustment is used to convert annual data collected each month in the ACS to a calendar year basis. This adjustment recognizes that the annual data collected in the ACS are for different periods because the data are collected monthly and cover the previous 12 months. The second adjustment is used to present dollar-denominated items in dollars of the most recent calendar year. This adjustment eliminates the impact of inflation when comparing data across years. The index used for both adjustments is the national-level CPI. The Secretary correctly observed that the CPI is a generally accepted measure of inflation and that most federal programs that allocate funds do not use regional measures of inflation. However, these observations did not directly address our findings about the adjustments or the concerns raised by HUD in its report on future use of the ACS, which are discussed in appendix V of our report. For example, the Secretary did not address our finding about the lack of a rationale for adjusting items other than incomes for changes in overall inflation rather than adjusting with indexes, such as wage rates or rent, that are directly related to the item being adjusted. He did indicate that the Census Bureau would reconsider its present policy of showing only the inflation-adjusted annual estimates and multiyear averages.
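To make the mechanics of the two adjustments concrete, the sketch below applies them to a hypothetical income amount. The CPI values, the function names, and the exact form of the adjustment factors are illustrative assumptions on our part, not the Census Bureau's published formulas; the sketch reflects only the general CPI-ratio approach described above.

```python
# Illustrative sketch of the two CPI-based adjustments described above.
# The CPI values and the adjustment factors are assumptions for
# illustration, not the Census Bureau's published methodology.

# Hypothetical national monthly CPI, keyed by (year, month).
cpi = {(2004, m): 186.0 + 0.3 * m for m in range(1, 13)}
cpi.update({(2005, m): 190.0 + 0.3 * m for m in range(1, 13)})

def mean(values):
    return sum(values) / len(values)

def to_calendar_year(amount, interview_year, interview_month, data_year):
    # Adjustment 1: a response collected in interview_month covers the 12
    # months ending the month before the interview; scale it by the ratio of
    # the data_year average CPI to the average CPI over that reference window.
    window = [(interview_year, m) for m in range(1, interview_month)]
    window += [(interview_year - 1, m) for m in range(interview_month, 13)]
    return amount * mean([cpi[(data_year, m)] for m in range(1, 13)]) / \
        mean([cpi[ym] for ym in window])

def to_latest_dollars(amount, from_year, to_year):
    # Adjustment 2: restate an earlier year's dollars in the most recent
    # calendar year's dollars using the ratio of annual average CPIs.
    return amount * mean([cpi[(to_year, m)] for m in range(1, 13)]) / \
        mean([cpi[(from_year, m)] for m in range(1, 13)])

# A household interviewed in April 2005 reports $40,000 of income for the
# previous 12 months; restate it on a 2005 calendar-year basis, and restate
# a 2004 amount in 2005 dollars.
print(round(to_calendar_year(40_000, 2005, 4, data_year=2005)))
print(round(to_latest_dollars(38_500, from_year=2004, to_year=2005)))
```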
We believe our findings about the need for the Census Bureau to provide a comprehensive rationale for the two adjustments still apply.

The Secretary disagreed with the issue we identified on the completeness of the Census Bureau's comparison and evaluation reports. He noted that after our draft report was completed, the Census Bureau released seven additional comparison reports and that it planned to prepare additional reports to evaluate issues we identified on the time-series consistency of the annual ACS estimates. However, despite the Census Bureau's earlier statements that it would compare and evaluate differences between state-level estimates from the Census 2000 Supplementary Survey (C2SS) and the 2000 Census long form, these reports did not include any reference to the preparation of such comparisons, and the Secretary did not indicate they would be prepared. Because the focus of the long form and the ACS is on data for small geographic areas, we believe that reports on states and on other areas with populations of 250,000 or more should be prepared.

The last issue we identified was the need to provide users with guidance on the interpretation of key properties of multiyear averages. The Secretary agreed that such guidance is needed but noted that it is not needed in 2005. He reported on a newly created NAS panel that will be studying many of the key issues identified in our report. However, we believe that the Census Bureau should begin to release guidance on the averages before the first multiyear averages are released in 2008. One area in which such guidance will be needed is the interpretation and use of the multiple ACS estimates. When the 2005-07 averages are released in 2008, users will have annual estimates for some of these areas for 2006 as well as the 3-year averages, which will be centered on 2006. In 2010, when the first 5-year averages are released (2005-09), users will have three sets of ACS estimates for places with populations larger than 20,000. For example, for each state, there will be an annual estimate for 2007 as well as 3-year and 5-year averages centered on 2007.

The comments from the Secretary also include a list of detailed technical comments from the Census Bureau. We reviewed each of these comments and revised the report where appropriate.

As agreed with your offices, unless you release the report's contents earlier, we plan no further distribution of it until 30 days from its issue date. We will then send copies to the Secretary of Commerce, the Director of the U.S. Census Bureau, and others who are interested. Copies will be made available to others on request. This report will also be available at no charge on GAO's Web site at http://www.gao.gov. If you or your staff have any questions concerning this report, please contact me at (202) 512-9750. Other staff who made major contributions to this report are listed in appendix VII.

Signed by:

Robert P. Parker:
Chief Statistician:

[End of section]

Appendix I: Scope and Methodology:

We used a combination of approaches and methods to examine the Census Bureau's plans to develop, test, and implement the American Community Survey (ACS). We reviewed published and unpublished ACS-related Census Bureau reports, papers, presentations, budget documents, and congressional testimony; National Academy of Sciences (NAS) reports; congressional testimony delivered by outside experts; and consultants' reports prepared for the Census Bureau, the Bureau of Labor Statistics (BLS), and the Department of Housing and Urban Development (HUD).
We reviewed an extensive set of internal planning documents prepared between 1992 and 1995 that the Census Bureau provided, relevant papers Census Bureau staff presented at professional association meetings and similar symposiums from 1995 on, and evaluation reports based on the 2000 Census. We also reviewed official Census Bureau presentations in special reports, congressional testimony, and recent advisory committee meetings. We reviewed similar materials NAS and consultants prepared for the Census Bureau and other federal agencies, as well as materials we prepared. The most important documents we reviewed are listed in the bibliography, organized by document type, at the end of this report.

In addition, we conducted independent research and analysis. To assess the evaluations the Census Bureau conducted to assist users in making the transition from the 2000 Census long form to the ACS, we obtained data from the 2000 Census and 2000 ACS (the Census 2000 Supplementary Survey) and prepared comparisons of key detailed data items at the state level. To determine the potential effect of replacing independent population and housing characteristics controls from the 2000 Census with corresponding data from the 2010 Census, we compared county-level intercensal estimates for April 1, 2000, based on the 1990 Census, with 2000 Census counts. We also analyzed the Census Bureau's use of independent controls for estimates of population and housing characteristics for previous decennial censuses and its plans for the ACS.

To assess alternatives to the ACS, we spoke to current Census Bureau officials and individuals familiar with early efforts to provide more frequent long-form type data, including the Mid-Decade Census. We also reviewed congressional hearings on these developments and Census Bureau documents prepared in the 1990s on the Continuous Measurement program and on implementing the ACS to replace the 2000 Census long form. We did not independently verify the cost information the Census Bureau provided for the alternative we discuss. We also interviewed staff of the NAS Committee on National Statistics and outside small-area data experts. The outside experts we interviewed were:

Constance Citro, Committee on National Statistics
Michael Cohen, Committee on National Statistics
Linda Gage, California State Department of Finance
Edwin Goldfield, Committee on National Statistics
Ken Hodges, Claritas Inc.
Graham Kalton, Westat Inc.
Terri Ann Lowenthal, Consultant
Joseph Salvo, New York City Planning Department
Edward J. Spar, Council of Professional Associations for Federal Statistics
Paul Voss, University of Wisconsin

[End of section]

Appendix II: Recent NAS Findings on Continuous Measurement and the ACS:

In Modernizing the U.S. Census, a 1995 report, the NAS Panel on Census Requirements in the Year 2000 and Beyond stated that: "Although we believe that the proposed continuous measurement system deserves serious evaluation, we conclude that much work remains to develop credible estimates of its net costs and to answer many other fundamental questions about data quality, the use of small-area estimates based on cumulated data, how continuous measurement could be integrated with existing household surveys, and its advantages compared with other means of providing more frequent small-area estimates.
In our judgment, it will not be possible to complete this work in time to consider the use of continuous measurement in place of the long form for the 2000 census."[Footnote 27] The panel concluded that: "With regard to proposals to drop the long form in the next decennial census and substitute a continuous monthly survey to obtain relevant data, substantial further research and preparatory work are required to thoroughly evaluate the likely effect and costs of these proposals. Continuous measurement deserves serious consideration as a means of providing more frequent small-area data; however, the necessary research and evaluation cannot be completed in time for the 2000 census."[Footnote 28]

Although 1994 saw the first proposals to implement the continuous measurement methodology as a replacement for the 2000 Census long form, the Census Bureau changed its plans in 1998, shifting implementation to replace the long form in the 2010 Census. Since 1995, NAS has produced several reports that relate totally or in part to the ACS, including a summary of a September 13, 1998, Committee on National Statistics workshop at NAS; two interim reports, a letter report, and a final report by the Panel on Research on Future Census Methods; and a report released in early 2004 by the Panel to Review the 2000 Census.[Footnote 29] (In this appendix, we do not discuss NAS reports after 1995 in which the ACS was discussed as a potential data source for federal programs.[Footnote 30])

With few exceptions, the members of these two NAS panels and the workshop participants reported findings that cover most of what we have identified as unresolved issues, which we summarize in this appendix.[Footnote 31] The NAS reports and ours differ somewhat in emphasis. We have focused on the production and use of ACS data, whereas NAS focused more on data collection and processing methodologies. These differences may reflect the fact that NAS panel members are very sophisticated users who are more likely to use ACS microdata files and make their own adjustments for methodological issues; they make little use of the regular ACS publications.

1998 NAS Workshop on the ACS:

NAS sponsored a 1-day workshop in September 1998 to discuss methodological issues related to the ACS. Experts prepared "thought pieces" on issues NAS staff selected, with input from Census Bureau staff. The workshop's specific discussion topics were combining information across areas and across time, funding formulas, weighting and imputation, sample and questionnaire design, and calibration of the output from this survey with that from the long form. The thought pieces and comments on them prepared Census Bureau staff for the discussions at the workshop.[Footnote 32] NAS noted in the report on the workshop that its six focus issues reflected only a partial list of key ACS topics; the report's conclusions identified other key issues.[Footnote 33] Stating that the workshop's purpose was "to assist the Census Bureau in developing a research agenda to address these and other methodological issues," the report pointed out that the Census Bureau's past focus on the ACS: "has been on refining data collection, leaving the final answers to the difficult analysis questions for later. Thus, procedures for nonresponse and undercoverage adjustment were modeled, to the extent possible, after current procedures used for the census long form.
Now that data collection has matured as the ACS demonstration phase is well under way, the Census Bureau is developing a research plan and initiating research to address all issues relating to ACS methodology. Fall 1998 therefore seemed an opportune moment for a workshop to assist the Census Bureau in developing a research agenda to deal with many of these challenging issues."[Footnote 34]

The report contained no specific recommendations but identified areas where additional research was needed, including issues we have expressed concern about, such as the availability of multiple ACS estimates for geographic areas with populations larger than 20,000 and the likelihood of differences between ACS estimates and estimates from a Decennial Census short form. From our perspective, the most relevant of the workshop's specific issues were (1) combining information across time, (2) weighting and imputation, and (3) calibrating the output from this survey with that from the long form.

Technical papers in the workshop's agenda book contained considerable discussion of time-series issues. The discussion in this section of the workshop focused on replacing moving averages with time-series modeling and using current household survey data to develop models. Speaking for the Census Bureau, Alexander stated that "Our current plan is to release annual data for even very small areas and let users perform their own time series analyses. We welcome ideas about what the Bureau's role should be . . ."[Footnote 35]

On the evaluation of comparisons between the ACS test data and the 2000 Census long form, the workshop report noted that the objective of the comparison using the national sample data was "to make comparisons between the long form and ACS for all states, large metropolitan areas, large substate areas, and population groups."

"The objective of the 1999-2001 comparison is to understand the factors associated with the differences between the 1999-2001 ACS and the 2000 long form in the 31 areas, using the second comparison study to develop a calibration model to adjust the 2000 long-form estimates to roughly represent what the full ACS would have yielded in 2000."[Footnote 36]

Chapter 7 of the report was devoted to a discussion of calibration. The report stated that the model would "determine the effects that would be expected when switching from the long-form estimates to those from the ACS on various applications of long-form data." Once adjusted, the calibrated long-form data for 2000 could be compared with ACS data collected following full field implementation in 2003, "in order to understand the dynamics over time of such characteristics as poverty and employment."[Footnote 37]

The technical papers in the workshop's agenda book also noted other comparisons of ACS data. For example, Alexander discussed comparisons with CPS data, reporting that: "We very much like the idea of viewing information from an ongoing comparison of ACS to CPS and other surveys as a way to help understand how the ACS 'error profile' might be changing over time and using this to help interpret ACS data in the context of the long-term time series of census estimates."[Footnote 38]

The use of independent controls for population and housing characteristics was also discussed at the workshop, but very generally, because the Census Bureau had not yet developed proposals for the controls. For example, the report's chapter 5 discussed improving the existing population controls.
The Census Bureau reported discomfort with the quality of the existing county-level controls (from ICPE) and agreed that the ACS could be used to improve these estimates.[Footnote 39] The Census Bureau also acknowledged that differences in residence rules and reference period would complicate the calculation of population weights. However, no discussion was reported of how the population counts from the 2010 Census would be used. The report referred to moving averages in the conclusions chapter as one of the methodological problems noted at the workshop: "the development of estimates that (a) sum to estimates at higher levels of geographic aggregation and (b) more closely approximate direct estimates at higher levels of aggregation . . . in the event that aggregate estimates are not constrained to (approximately) equal direct estimates (and also the release of direct estimates at lower levels of aggregation for analysis purposes) . . . ."[Footnote 40]

2000 Interim Report:

The Panel on Research on Future Census Methods, sponsored by the Census Bureau, was formed to examine alternative designs for the 2010 Census and to assist the Census Bureau in planning tests and analyses to help assess and compare their advantages and disadvantages. In addition to the first interim report, Designing the 2010 Census, released in 2000, a letter report was issued in 2001, and a second interim report was issued in 2003 (both discussed below). The panel issued a final report in 2004. The panel's first interim report identified information from 2000 Census data useful in assessing designs for the 2010 Census. In the executive summary, the panel made four specific recommendations and proposed other changes. One of the recommendations, relating to evaluation studies, is directly relevant to our report: "The Census Bureau should develop a detailed plan for each evaluation study on how to analyze the data collected and how to use the results in decision making concerning 2010 census design. The Census Bureau should then use these plans to identify the benefits and resources required for each evaluation study, set priorities among them, and allocate sufficient resources for the careful completion of all or, at least, the highest priority evaluations."[Footnote 41]

In addition, the report proposed three changes for the 2010 Census and ACS. The first proposed change, to the direction and nature of the evaluation program, was that the Census Bureau use the "ACS as a census testing platform": "The American Community Survey is a proposed national, continuous, mailout-mailback survey of 250,000 households per month, with field follow-up that makes use of techniques closely related to those used in the census. Therefore, rather than rely exclusively on the two or three large-scale census tests, which are always at least slightly limited in their generalizability by the specific locations selected, the Census Bureau could use the ACS as a platform for testing possible changes in the census. This work could serve as preliminary testing to the larger mid-decade tests for the census design."[Footnote 42]

The second proposed change called for "a match study of the census short form and the ACS." This proposal, which could provide information on the effect of a change in the residence rule in the 2010 census, stated: "The decennial census makes use of one residence rule definition, the ACS uses a second, and a third approach is being tested in the alternative questionnaire study.
As the Census Bureau is well aware (based on the allocation of an experiment to this issue), confusion over residence rules is a source of possibly substantial error in the census. . . . The Census Bureau needs to find the residence rule (within the set of rules satisfying legal and other restrictions) that results in the most accurate estimates. To learn more about this issue, the panel proposes an ACS-short-form match study in 2000 to examine this and other short-form measurement error issues."[Footnote 43]

The third proposed change was the recommendation that the Census Bureau "form an ACS advisory group" to improve its efforts to consult with stakeholders. The panel stated: "The development of the ACS raises a number of issues related to the quality of and planning for the 2010 census. There are also many other important technical issues raised by the introduction of the ACS into the federal statistical system. Formation of a technical working group could help to address many of these issues."[Footnote 44]

2001 Letter Report:

The 2001 letter report--addressed from Benjamin King, Chair of the Panel on Research on Future Census Methods, to William Barron Jr., Acting Director of the Census Bureau--was prepared in response to a December 7, 2000, presentation by Census Bureau staff on the major elements of the Census Bureau's strategy for the 2010 Census. The panel recommended that the Census Bureau produce a "business plan" for the 2010 Census that would provide an overall framework for development. It recommended that this plan include (1) a statement of objectives, (2) a timeline for completing tasks, (3) a cost-benefit analysis, and (4) more complete information on coordinating tasks within the Census Bureau.[Footnote 45] The panel also recommended the preparation of specific types of evaluation studies. On the evaluation studies, the panel reported, "The Bureau is currently conducting a wide array of evaluation studies and experiments designed to assess the quality of the 2000 census and inform approaches to the 2010 census. As noted above, the panel applauds the scope of these evaluation studies. However, the panel is concerned that the Bureau has not sufficiently focused its evaluation program and has instead labeled most of its evaluation categories as high priority."[Footnote 46]

In the letter report's conclusions, the panel recommended that the Census Bureau give the highest priority to studies and data analysis in seven specific areas, most of which related to the ACS. The panel's list of studies and analyses included the following recommendations, which we have also discussed: "comparison of estimates from the ACS and 2000 census long-form data, in sites where both are available; coverage of the population, disaggregated by demographic and geographic subgroups; the effectiveness of major automated systems for data collection, capture, and processing; the quality and completeness of long-form data collection; and the effectiveness of operations used to designate special places and enumerate the group quarters and homeless populations."[Footnote 47]

In making these recommendations, the panel noted the need for the Census Bureau to maintain a strategy that would provide for "a smooth transition" from the long form to the ACS. The panel urged the Census Bureau: "to broaden its justification for the ACS, detailing the need for and use of long-form data and how those data needs will be addressed through the ACS, perhaps in conjunction with the CPS and other demographic surveys.
Accordingly, the Bureau should expedite ongoing evaluations that assess the quality of ACS data relative to the quality associated with the traditional census long form."[Footnote 48] 2003 Interim Report: In the second interim report, Planning the 2010 Census, issued in 2003, the panel identified four areas of primary interest: reengineering the census, geographic coding, the ACS, and testing for the 2010 Census. With regard to the ACS, the panel reported in the executive summary that: "The most basic question the panel faces regarding the ACS is whether it is a satisfactory replacement for the census long form. We recognize that significant estimation and weighting challenges must be addressed and that more research is needed on the relative quality of ACS and long-form estimates."[Footnote 49] The panel found that the Census Bureau needed to complete evaluations of differences between 2000 Census long-form data and data from the ACS test sites and from the 2000-02 supplementary surveys. It also found that the Census Bureau needed to undertake a major effort to inform data users and stakeholders of the results of these evaluations and the features and problems of working with multiyear averages. One of the four main topics of this report was a separate chapter on the ACS, in which the panel discussed the following recommendations: "The Census Bureau should carry out more research to understand the differences between and relative quality of ACS estimates and long-form estimates, with particular attention to measurement error and error from nonresponse and imputation. The Census Bureau must work on ways to effectively communicate and articulate those findings to interested stakeholders, particularly potential end users of the data. The Census Bureau should make ACS data available (protecting confidentiality) to analysts in the 31 ACS test sites to facilitate the comparison of ACS and census long-form estimates as a means of assessing the quality of ACS data as a replacement for census long-form data. Again, with appropriate safeguards, the Census Bureau should release ACS data to the broader research community for evaluation purposes. The Census Bureau should issue a user's guide that details the statistical implications of the difference between point-in-time and moving average estimates for various uses. The Census Bureau should identify the costs and benefits of various approaches to collecting characteristics information should support for the full ACS not be forthcoming. These costs and benefits should be presented for review so that decisions on the ACS and its alternatives can be fully informed."[Footnote 50] With regard to the first recommendation, the panel stated that: "The fact that the Census Bureau has not done more in comparing the data collected from the 31 test sites, the C2SS, and the 2001 and 2002 Supplementary Surveys with the data collected by the 2000 census long form is disappointing. Such analyses would help assess the quality of ACS data and would be helpful in making the argument for transition from the long form to the ACS. This deficiency is probably due to limited analytic resources at the Census Bureau and creates an argument for 'farming out' this analysis to outside researchers."[Footnote 51] On the recommendation about the need for more information on multiyear or moving averages, the panel discussed several technical issues. 
The panel commented that: "The ramifications of this basic concept emerge when moving average estimates are entered into sensitive allocation formulas or compared against strict eligibility cutoffs. A smoothed estimate may mask or smooth over an individual year drop in level of need, thus keeping the locality eligible for benefits; conversely, it may also mask individual-year spikes in activity and thus disqualify an area from benefits. It is clear that the use of smoothed estimates is neither uniformly advantageous nor disadvantageous to a locality; what is not clear is how often major discrepancies may occur in practice."[Footnote 52] On using moving-average data to measure year-to-year changes, the panel commented: "It is incorrect to use annual estimates based on moving averages over several years when assessing change since some of the data are from overlapping time periods and hence identical. At the least, the results will yield incorrect estimates of the variance of the estimates of change. Therefore, users should be cautioned about this aspect of the use of moving averages."[Footnote 53] In both recommendations on evaluations and moving averages, the panel called for the Census Bureau to engage in a greatly expanded effort to inform users and stakeholders. It also suggested that the Census Bureau farm out some of the research efforts. In summarizing the results of its efforts, the panel noted the 1995 NAS report, as follows: "Eight years later, faced with the task of offering advice on making the vision of continuous measurement a reality in the 2010 census, the similarity between the arguments then and now is uncanny. Similar, too, are the points of concern; the current panel is hard-pressed to improve upon the basic summary of concerns outlined by our predecessors. We are, however, much more sanguine that a compelling case can be made for the ACS and that it is a viable long-form replacement in the 2010 census."[Footnote 54] However, while the panel was identifying its concerns, it also supported full funding of the ACS, believing that existing "flaws" in the plan could be resolved. The 2000 Census: Counting under Adversity: In 2004, the Panel to Review the 2000 Census, sponsored by the Census Bureau, issued a report entitled The 2000 Census: Counting under Adversity. The findings were based primarily on the panel's review of information from the 2000 Census. The panel's charge had been to "review the statistical methods of the 2000 Census, particularly the use of the Accuracy and Coverage Evaluation Program and dual-systems estimation, and other census procedures that may affect the completeness and quality of the data."[Footnote 55] Thus, although the report focused on the 2000 Census, it made some recommendations for improving the 2010 Census. Its major recommendation was that: "the Census Bureau, the administration, and Congress agree on the basic design for the 2010 census no later than 2006 in order to permit an appropriate, well-planned dress rehearsal in 2008. In particular, this agreement should specify the role of the new American Community Survey (ACS). Further delay will undercut the ability of the ACS to provide, by 2010, small-area data of the type traditionally collected on the census long-form sample and will jeopardize 2010 planning, which currently assumes a short-form-only census."[Footnote 56] In its discussion of the 2010 Census, the report included several recommendations on ACS operations and evaluations. 
The panel recommended that the Census Bureau develop estimates of the effects of imputation, as well as of sampling variability and nonresponse, on the estimates from the 2000 long form. More specifically, "the Bureau should also study the effects of imputation on the distributions of characteristics and the relationships among them and conduct research on improved imputation methods for use in the American Community Survey (or the 2010 census if it includes a long-form sample)."[Footnote 57] Finally, the panel recommended that the Census Bureau's plans for the 2010 Census "include research on the trade-offs in costs and accuracy between imputation and additional fieldwork for missing data (Recommendation 4.2)."[Footnote 58]

The panel also recommended that the Census Bureau: "publish distributions of characteristics and item imputation rates, for the 2010 census and the American Community Survey (when it includes group quarters residents), that distinguish household residents from the group quarters population (at least the institutionalized component). Such separation would make it easier for data users to compare census and ACS estimates with household surveys and would facilitate comparative assessments of data quality for these two populations by the Census Bureau and others."[Footnote 59]

The panel's findings were similar to our findings, with one major difference. The panel's findings imply that some research on the ACS can be conducted after the results of the 2010 Census short form become available. In contrast, we believe that such research is needed in order to improve the ACS by 2008, the first year in which ACS data will enter into the calculation of the 5-year average estimates (2008-12) that will replace the long form.

[End of section]

Appendix III: The Decennial Census Long Form and the Evolution of the ACS Plan:

Decennial Census Long Form:

In the decennial censuses of 1940 and 1950, the Census Bureau used a single form to collect, from all households, population counts and key characteristics such as age and gender and, from a sample of households, detailed demographic, economic, and housing items. In the 1940 Census, the Census Bureau used a sample of 5 percent of the population to collect data on questions on income, internal migration, and Social Security status, as well as on more refined questions on unemployment. In addition, the Congress authorized a new set of questions about the types of plumbing, heating, and appliances in dwellings. Beginning with the 1960 Census, the first conducted by mail, it became necessary to use separate forms--a short form to collect population data from all households and a long form to collect the detailed items from a sample of households. In the 2000 Census, for example, the Census Bureau sampled 17 percent of the population and asked 45 questions on the long form.

Since 1960, the long form has evolved into a cost-efficient way to collect the data federal agencies need while minimizing respondent burden. For 2000, the 45 long-form questions were developed by the Census Bureau working through OMB and with the consent of the Congress.[Footnote 60] Each question provided information required by statute.
Thus, the 2000 long form provided all federal departments and agencies with critical data, and it was estimated that these data were used to allocate more than $200 billion in federal funds.[Footnote 61]

Evolution of ACS Plan:

In the 1950s, Census Bureau officials and users of Decennial Census data began to develop a program to provide intercensal data on population characteristics. The first major proposal to provide intercensal data called for a mid-decade census that would provide information every 5 years. In 1976, the Congress enacted legislation to require a mid-decade census beginning with 1985, but did not fully fund the program. In the late 1980s, the Census Bureau shifted its efforts to provide intercensal estimates to a program based on Continuous Measurement (CM) methodology. This approach would provide more timely population data as well as the detailed demographic, economic, and housing data collected every 10 years by the Decennial Census long form.[Footnote 62] The program would integrate a new sample survey, existing surveys, administrative records, and statistical modeling. After a thorough analysis of alternatives based on this methodology, the Census Bureau developed a plan similar to the current ACS to replace the 2000 Census long form.

Initial $2.6 million funding for the CM program was included in the 2000 Decennial Census budget for fiscal year 1995. These funds were to be used to develop, test, and evaluate a CM program to replace the Decennial Census long form and to provide more timely long-form type data. In the program description in the budget documents, the Census Bureau reported that it planned to develop a new program that would integrate a new sample survey, existing surveys, administrative records, and statistical modeling. Table 2 shows that about $330 million has been provided to fund the CM program since 1995, with funding provided separately until 2003 and additional funding from both the 2000 and 2010 Decennial Census programs. Beginning with 2003, all funding has been provided as part of the 2010 Census program. The Census Bureau requested $165 million for fiscal year 2005.

Table 2: Continuous Measurement and ACS Funding, Fiscal Years 1995-2005:

Fiscal year: 1995; Budget[A]: $2.6; Activity: Develop and test a continuous measurement (CM) system to replace 2000 Census long form; study integration of administrative records, existing current surveys, and statistical modeling with a new survey for CM programs; Other information: None.

Fiscal year: 1996; Budget[A]: $10.0; Activity: Develop and test a CM system to replace 2000 Census long form and provide annual data; Other information: Continue evaluating and developing an integrated CM program.

Fiscal year: 1997; Budget[A]: $16.6; Activity: Rename test survey American Community Survey (ACS); complete data collection and processing for 1996 survey at 4 test sites; develop list to cover group quarters; Other information: Develop methods for integrating administrative records and information from household survey data into CM program.

Fiscal year: 1998; Budget[A]: $16.6; Activity: Continue testing and processing 1997 test site survey data; Other information: Make long-run plans for replacing 2010 long form with full implementation of ACS in 2003.
Fiscal year: 1999; Budget[A]: $20.0; Activity: Publish 1998 test site data and expand test site surveys to 31 sites; prepare for supplementary surveys from national sample to begin with 2000 to compare with 2000 Census long-form data; Other information: Develop statistical models to evaluate comparisons of long form to supplementary surveys; continue testing to integrate information from administrative records.

Fiscal year: 2000; Budget[A]: $47.0; Activity: Provide for 2000 supplementary survey; continue comparison studies of differences for test site areas; develop plans for comparison studies with supplementary survey and using multiyear averages; Other information: None.

Fiscal year: 2001; Budget[A]: $45.2; Activity: Provide for continuation of supplementary surveys; continue testing and comparison studies with data from test sites and national survey; begin testing in Puerto Rico; Other information: Continue processing data from test sites for comparison studies.

Fiscal year: 2002; Budget[A]: $56.1; Activity: Prepare for full implementation in 2003; Other information: Continue evaluating comparison studies for test-site and supplementary survey data.

Fiscal year: 2003; Budget[A]: $57.1; Activity: Prepare to begin full implementation for 2005; Other information: Complete testing for group quarters and Puerto Rico.

Fiscal year: 2004; Budget[A]: $64.8; Activity: Prepare to begin full implementation for 2005; Other information: None.

Fiscal year: 2005; Budget[A]: $165.0; Activity: Full implementation; Other information: None.

Sources: Budget of the United States, House of Representatives Report 108-401, and Census Bureau budget documents.

[A] Dollars in millions. Fiscal year 1995 funding provided as part of the 2000 Decennial Census program. Fiscal year 1996-99 funding provided by CM program. Fiscal year 2000 and 2001 funding provided as both the CM program and part of the 2000 Census program. Fiscal year 2002 funding provided as both the CM program and part of the 2010 Census program. Beginning with fiscal year 2003, funding provided as part of the 2010 Census program. Fiscal year 2005 figure is the budget request.

[End of table]

In 1996 and 1997, funding was provided to field-test what became the ACS, to replace the 2000 Census long form. The ACS was to begin in 1999 with an annual sample of 4.8 million housing units for 1999, 2000, and 2001 and 3 million housing units for subsequent years. Under this plan, a 3-year average of ACS data for 1999-2001 was to replace the 2000 Census long form.[Footnote 63] It would provide the same detailed items and same level of geographic detail as the traditional long form with about the same quality. Annual ACS data would subsequently be provided for geographic areas with populations of 65,000 or more, 3-year averages would provide ACS data for geographic areas with populations larger than 20,000, and 5-year averages would provide ACS data for small geographic areas, such as census tracts, small towns, and rural areas. The 5-year averages for 2010, 2020, and beyond would replace future Decennial Census long forms.

In the 1998 budget request, the Census Bureau shifted the timing for replacing the long form from the 2000 Census to the 2010 Census. As a result, it was funded to conduct annual supplementary surveys of 750,000 households beginning with 2000, in addition to the ACS testing at four test sites (or counties).
The Census 2000 Supplementary Survey, known as C2SS, and the surveys for subsequent years were to be used to test the feasibility of collecting long-form data at the same time as, but in a separate process from, the Decennial Census. Data from C2SS and the supplementary surveys were also to be used to test ACS data usability and reliability and to evaluate operational and programmatic issues associated with implementing the ACS. Also, the number of test sites was increased to 31 by 1999. Funding to compare and evaluate differences between data collected from the 2000 Census long form and the ACS testing programs began in 1999, and funding to develop data to expand coverage to group quarters and Puerto Rico began in 2001. Plans to integrate existing surveys, administrative records, and statistical modeling into the new program were dropped in 2001.

The 1998 budget request also reported that the Census Bureau would proceed with plans to replace the 2010 Census long form with an ACS based on an annual sample of 3 million housing units, as with the previous plan. Unlike that plan, the sample size for 2009-11 would not increase to provide 3-year averages for 2010. This revised plan called for full implementation of the ACS in 2003. Full ACS data for 2003 to 2007 would have made 5-year averages available in 2008, 4 years before the long-form sample statistics from the 2010 Census would become available. However, budget decisions by the Congress delayed full implementation until the fourth quarter of fiscal year 2004.

[End of section]

Appendix IV: Continuous Measurement ACS Testing and Development Program:

The Congress initially provided funds for testing the CM methodology in 1994. As we have noted, the Census Bureau began formal testing of the CM program in 1996 with an operational test of the ACS in four counties; this test was expanded to 31 test sites by 1999. A second testing program, the Supplementary Survey program, began in 2000 as a part of the 2000 Decennial Census. The Census Bureau designed C2SS to test the feasibility of collecting long-form data at the same time as, but in a separate process from, the 2000 Decennial Census. Data from C2SS and the subsequent supplementary surveys, beginning with 2001, were also to be used to test ACS data usability and reliability. According to the Census Bureau, these surveys were to be used to examine technical, statistical, and operational issues associated with implementing the ACS and to document the key results in a series of reports.[Footnote 64]

Before field testing began, the Census Bureau had conducted an extensive research program to identify the issues related to using the CM methodology and to replacing the long form. The research program resulted in a series of 20 reports, known as the Continuous Measurement Series, between 1992 and 1995.[Footnote 65] These reports, most of which were prepared by Charles Alexander, addressed a wide range of topics such as replacing the 2000 Census long form, collecting intercensal population data, and integrating the ACS with existing household surveys. The reports on replacing the long form identified the key issues that needed testing, and they served as the primary input to the Census Bureau's ACS test program. These issues included those subsequently tested by the Census Bureau as well as the unresolved issues we identify in this report.
Following the CM reports, Census Bureau staff presented papers from 1995 through 2001 on ACS testing at various professional association and similar meetings, as well as at a 1998 symposium on the ACS sponsored by the Census Bureau.[Footnote 66] For example, the 1995 paper by Love, Dalzell, and Alexander discussed issues related to the evaluation of the 1996 test site results, expressing concern about population controls and residence rules as well as the need for consultation with users.[Footnote 67] They reported that the Census Bureau was planning to conduct research using data from the 1996 test sites to produce controls at the census tract and block group level. They also noted that the Census Bureau would need to conduct research on the residence rule. Alexander and Wetrogan also discussed the issue of population controls in their 2000 paper.[Footnote 68] They reviewed possible methods for using ICPE to develop controls for the ACS and discussed using ACS estimates on the foreign-born U.S. population to improve the Census Bureau's foreign-migration component of the intercensal estimates. (They reported that this effort would be part of what the Census Bureau had previously referred to as the Program of Integrated Estimates.) They also noted the need to consult with users on how to present information on the differences in ACS controls and ICPE in ACS publications. Several papers have focused on the key role of evaluating differences among the ACS test data, census long-form data, and CPS data. Alexander, Dahl, and Weidman reported in 1997 that during the demonstration period, they would be working closely with experts familiar with specific test sites to learn about the quality of the ACS estimates.[Footnote 69] For example, they reported that the Census Bureau would be looking into sources of differences between the 1999- 2001 ACS test-site average estimates and the 2000 Census long-form results and using the results of differences between the 2000-02 national sample and the 2000 long form to generate model-based estimates for small geographic areas. The authors noted that these model-based estimates, based largely on information from test sites, would be used to interpret changes between 2000 and future ACS estimates. In another 1997 paper, Davis and Alexander reported the Census Bureau's action plan for evaluation studies.[Footnote 70] They called for evaluating the results of all test sites and releasing the expert review of the analyses of the differences between the 1999-2001 ACS and the 2000 Census long form. The schedule called for releasing this information before beginning the implementation of the full ACS. Alexander's 1998 paper on completed research, research in progress, and planned research included among the four items for planned research a "close study of differences between 1999-2001 ACS and 2000 long form in comparison areas."[Footnote 71] The quality of the ACS measures of income was the subject of the paper Posey and Welniak presented at the Census Bureau's 1998 symposium on the ACS.[Footnote 72] They compared income reported in the 1996 ACS and 1990 Decennial Census in an effort to evaluate the quality of the ACS income data. One of the adjustments they made to compare the two series was for the effect of inflation between 1990 and 1996. They noted that the results of the comparisons indicated a potential problem that may relate to the ACS inflation adjustment. 
(They described the calculation of the adjustment, which is based on the CPI, but did not provide a rationale for using the adjustment in the ongoing ACS data.) Alexander and two BLS staff reported in 1999 on the potential for using the ACS to improve labor force data from the CPS for state and smaller geographic levels.[Footnote 73] They stressed that to develop procedures for making these improvements, much research would be needed to evaluate differences between the ACS and CPS.[Footnote 74]

The last research paper in this period was Alexander's 2001 paper focusing on the origins of the CM methodology and its developers.[Footnote 75] He discussed the ACS in the context of the methodology, noting several important differences related to the nature of the ACS. He included a review of the Census Bureau's testing and evaluation program, noting that the ACS test-site program had been expanded and that national sample supplementary surveys had been added. He said that these test data would be compared with the 2000 Census long-form data and that in 2001 and 2002, the Supplementary Survey would be used as part of the transition to the ACS. He also pointed to unresolved issues relating to the residence rule and the multiyear averages, because the averages would provide users with multiple estimates for geographic areas with populations larger than 20,000.

Between 2001 and 2003, the Census Bureau issued three official reports and one internal report on the status of the ACS testing and development program. In Demonstrating Operational Feasibility, published in July 2001, the Census Bureau gave a brief history of the ACS development program, which by 2001 was focused not on preparing for full implementation in 2003 (a date the Census Bureau later revised to 2004) but on demonstrating the program's operational feasibility, using data from C2SS.[Footnote 76] On the basis of its analysis of the results of its tests of operational feasibility, the Census Bureau reported that the tests were a success. However, it recognized that more evaluation was necessary on measures of data quality, as well as on differences between ACS and 2000 Census long-form data. The Census Bureau announced that over the next 2 years it would issue reports comparing data from the 2000 Census long form at the national, state, and smaller geographic levels with data from the C2SS and the ACS development program.

Demonstrating Survey Quality, published in May 2002, focused on measures of C2SS survey quality, summarizing sampling and nonsampling error levels in both C2SS and the 31 ACS test sites.[Footnote 77] The Census Bureau used available, generally accepted measures of quality.[Footnote 78] On the basis of its analysis of the results of these quality tests, the Census Bureau reported that the tests were a success. This conclusion rested on test results that showed the C2SS program capable of providing reliable long-form data. As in the July 2001 report, the Census Bureau recognized that more evaluation was necessary on measures of data quality as well as on differences between ACS and 2000 Census long-form data and the detailed estimates produced from C2SS.
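The unit- and item-level quality measures summarized in these reports can be illustrated with a brief sketch. The records, field names, and rates below are hypothetical, and the calculation is a generic one rather than the Census Bureau's exact definitions.

```python
# A minimal sketch of two survey quality measures discussed in the
# evaluation reports: the unit response rate and the item nonresponse
# rate. The records and field names are hypothetical.

records = [
    {"responded": True,  "income_reported": True},
    {"responded": True,  "income_reported": False},   # income item missing (would be imputed)
    {"responded": False, "income_reported": False},   # unit nonresponse
    {"responded": True,  "income_reported": True},
]

eligible_units = len(records)
responding_units = sum(r["responded"] for r in records)
unit_response_rate = responding_units / eligible_units

# Item nonresponse is measured only among units that responded.
items_missing = sum(r["responded"] and not r["income_reported"] for r in records)
item_nonresponse_rate = items_missing / responding_units

print(f"Unit response rate:           {unit_response_rate:.1%}")
print(f"Income item nonresponse rate: {item_nonresponse_rate:.1%}")
```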
The Census Bureau repeated its commitment that over the next year and a half, it would release other reports to (1) analyze in detail basic demographic characteristics (relationship, race, tenure) produced from the C2SS at the national and state levels, including comparisons between C2SS and Census 2000; (2) describe the data release plan and products for the ACS and the usability and accessibility of estimates resulting from ACS methods; and (3) give several detailed analyses of selected social, economic, and housing characteristics (education, income, commuting patterns), including comparisons between C2SS and Census 2000 at the national and some subnational levels. In June 2002, shortly after Demonstrating Survey Quality was released, a team of Census Bureau specialists who had been working on the ACS for several years prepared an internal report on testing. They presented a revised program development plan and identified key questions to be answered in testing the adequacy of the ACS in replacing the Decennial Census long form. Their plan included the preparation of a series of nine evaluation reports over 2 years.[Footnote 79] The reports that evaluated differences between the 2000 Census short-form data (100 percent reported) and corresponding C2SS items were included in Demonstrating Survey Quality. Three reports to be completed between October 2002 and January 2003 would evaluate differences between the detailed housing, social, and economic characteristics between C2SS and the 2000 Census long form, as described in Demonstrating Survey Quality. (Although this schedule was later extended to the end of 2003, these three reports still had not been released when we prepared our final draft of this report.) Finally, the team's plan included a report that would focus on the comparisons of 3-year averages for the basic demographic, housing, social, and economic characteristics from the C2SS and ACS test sites and comparable estimates in the 2000 Census long form. The last report in the plan would compare data for 2001 and 2002 with measures shown in Demonstrating Operational Feasibility. The plan did not provide completion dates for these reports. American Community Survey Operations Plan, Release 1, published in March 2003, identified research projects to be completed in preparation for full implementation of the ACS.[Footnote 80] Two projects were on "weighting and estimation," which covered the methodology for using independent population and housing controls, and on "program of integrated estimates," which covered the calculation of these controls from the Census Bureau's intercensal population estimates program. The operations plan also reported on the schedule for completing several comparison and evaluation projects with ACS and 2000 Census long-form data discussed in Demonstrating Survey Quality. It discussed the need to evaluate multiyear estimates from the supplementary surveys to demonstrate the usability, reliability, and stability of ACS estimates over time, and it stated that a report comparing 3-year ACS data with data from the 2000 Census long form would be released in mid- 2003.[Footnote 81] The Census Bureau reported that the results of these research projects would not be available in 2004. 
Instead, it said, it would use interim procedures, taking "extensive long-term investigation and experimentation" to develop final procedures.[Footnote 82] For the ACS weighting and estimation project, the Census Bureau reported that it would be using an interim adjustment to adjust the intercensal population and housing characteristics estimates to the ACS residence concept. The Census Bureau reported that ACS estimates of occupied housing units, households, and householders should agree at all geographic levels. For the program of integrated estimates project, the operations plan discussed the need for more research to introduce improvements to the estimates from ICPE. (The ACS estimates are weighted to a population benchmark, either the most recent Decennial Census results or the most recent ICPE estimates.) The Census Bureau reported that because the accuracy of the intercensal estimates is important to overall ACS accuracy, it is important to use ACS data wherever appropriate to improve the intercensal estimates. The plan for the program on integrated estimates will use information from the 2000 Census, more current ACS distributions of population characteristics, and administrative records to produce improved population and housing unit estimates for all areas, including small areas. The plan also discussed improving housing characteristics by incorporating ACS distributions of local area vacancy rates and household characteristics into statistical models to better estimate subcounty populations. No time schedule for completing the research was provided. Finally, the March 2003 American Community Survey Operations Plan, Release 1 discussed a plan in the ACS to cover group quarters. Persons living in group quarters live in places that the Census Bureau does not classify as housing units--for example, nursing homes, prisons, college dormitories, military barracks, institutions for juveniles, and emergency and transitional shelters for the homeless. Such residences accounted for roughly 2.8 percent of the population in 2000. Although data on group quarters were collected at the test sites beginning with 1999, data were not collected in C2SS or subsequent supplementary surveys. The operations plan discussed the use of an updated Census 2000 Special Places file for the sampling frame for the full ACS. In this case, the plan noted, training field representatives on collecting data from this population is to begin in October 2004, so that full data collection production can begin in January 2005. Census Bureau staff made a presentation on comparison and evaluation reports at the April 2003 meetings of the Census Advisory Committee. The paper's author reported that work was under way on the comparison reports noted in the March 2003 operations plan, and she described the methodology to be used to evaluate differences between the 2000 long form and C2SS. She also reported that the results of the comparisons would be used to identify how the ACS should be improved but that additional research would be needed to address consistency over time between the 2000 Census and the full ACS. 
She stressed the importance of evaluating consistency "in educating users on the transition from the decennial census sample estimates to the ACS estimates."[Footnote 83] With regard to the comparison report of selected demographic, housing, social, and economic characteristics of 3-year estimates from the ACS test sites to the 2000 Census, the Census Bureau let four contracts with local experts to conduct comparisons of 3-year averages of ACS data for 1999-2001 for selected test sites with selected 2000 Census long-form data as well as 2000 Census population and housing unit characteristics. The comparisons, prepared at the county and census tract levels, would be made for measures of data quality (self-response rates, sample unit nonresponse rates, item nonresponse rates, and sample completeness ratios), as well as for data levels (counts, percentages, means, and medians) for demographic, social, economic, and housing characteristics. In summer 2003, Census Bureau staff presented a number of research papers on the ACS at the annual Joint Statistical Meetings. Papers evaluated differences between long-form and C2SS data items, such as persons with disabilities, educational attainments, and income. Most of the papers that provided comparisons with long-form data indicated whether differences were statistically significant for every comparison. Comparisons were presented at a variety of geographic levels (national, state, and test site levels). Some papers cited operational differences as possible explanatory factors, but information was not presented using a standard set of factors. The Census Bureau published ACS-2010 Census Consistency Review Plan, an internal document, at the beginning of October 2003. Its purpose was to identify methods for major operations used in the ACS and for the 2010 Census that were likely to lead to inconsistent results and to recommend ways to address these inconsistencies.[Footnote 84] Papers prepared on these operations were to discuss how an issue might result in inconsistencies between the ACS and 2010 Census results and to set forth options for dealing with consistency issues, including a research process. The plan identified residence rules and group quarters as two topics. It did not discuss completing the work in time to incorporate it into the full ACS in the next several years. Also in October 2003, the Census Bureau made two public announcements related to the ACS development plan at the Census Advisory Committee meetings. Two papers related directly to projects described in American Community Survey Operations Plan, Release 1. In "Enhancing the Intercensal Population Estimates Program with ACS Data: Summary of Research Projects," Weidman and Wetrogan reported on research to improve the intercensal estimates by using ACS data for two "high priority" areas--international migration and internal migration. This work was being conducted within the Program of Integrated Estimates.[Footnote 85] The second paper described options for determining population control weights for ACS implementation in fall 2004 but did not indicate that research was under way to determine the effect of the options.[Footnote 86] Another source of information related to ACS development was the various reports prepared as part of the Census 2000 Testing, Experimentation, and Evaluation Program. 
Schneider's January 2004 report compared employment, income, and poverty estimates from the 2000 Census long form and the CPS.[Footnote 87] From this comparison, the author concluded that this work should be continued in an effort to use the results of the comparisons to improve consistency between data collected in the CPS and data in the ACS; the ACS uses the same questions as the 2000 long form. The author also identified for additional research long-form questions that, based on a reinterview survey, performed badly.

From May to July 2004, the Census Bureau released seven ACS evaluation reports. Four reports compared data from the 2000 Census long form and the C2SS at the national level. Two reports compared these long-form data with 1999-2001 data from the ACS test sites for selected counties, and one of these compared the data at the tract level. The other report reviewed operational data from the 2001 and 2002 supplementary surveys. In most of the reports comparing long-form and ACS data, the Census Bureau identified additional work that was needed to improve the quality of the ACS estimates or to help explain differences between the two sets of data for 2000. As noted earlier, these comparisons were limited to the national level. (The seven new reports are listed in the bibliography.)

[End of section]

Appendix V: Current Status of Unresolved Issues:

Independent Controls for Population and Housing Characteristics:

According to the Census Bureau's plans, the calculation of independent controls for population characteristics (age, sex, race, and ethnicity) and housing characteristics for the full ACS will require a significantly different methodology from that used for the ACS supplementary surveys. Controls will be needed at the same level of geographic area detail as those that were used for the 2000 Census long form and will need to reflect the new concepts of residence and reference period underlying the ACS. For the annual ACS supplementary surveys, these characteristics from ICPE were used as the independent controls. ICPE uses Decennial Census short-form data as benchmarks and administrative record data to interpolate between and extrapolate from the census benchmarks.[Footnote 88] The program provides "official" annual estimates of population and housing characteristics at the county level, and for some subcounty levels, as of July 1 of each year, using the usual residence concept for seasonal residents. The program also provides annual estimates of total population and housing units for all areas of general-purpose government, such as cities, villages, towns, and townships.[Footnote 89] Table 3 shows information on the calculation of the independent controls used for the 2000 Census long form, the ACS supplementary series, and the fully implemented ACS through 2012.

Table 3: The 2000 Census Long Form and ACS Use of Independent Controls for Population and Housing Characteristics:

Survey and date: 2000 Census long form; Source of controls[A]: 2000 Census; Weighting area: About 65,000 areas; Items weighed: Population: age group (13), sex, race (6), Hispanic origin (2). Housing: occupied or vacant, owner or renter; Residence concept: Usual; Reference period: Apr. 1, 2000.

Survey and date: ACS test site: 1999; Source of controls[A]: ICPE benchmarked to 1990 Census; Weighting area: County; Items weighed: Population: age group,[B] sex, race (3), Hispanic origin (2). Housing: no direct use of housing weights; Residence concept: Usual; Reference period: July 1, 1999.
Survey and date: ACS test site: 2000; Source of controls[A]: 2000 Census; Weighting area: County; Items weighed: Population: age group,[B] sex, race/Hispanic origin (6). Housing: total number of units; Residence concept: Usual; Reference period: Apr. 1, 2000.

Survey and date: ACS test site: 2001-04[C]; Source of controls[A]: ICPE benchmarked to 2000 Census; Weighting area: County; Items weighed: Population: age group,[B] sex, race/Hispanic origin (6). Housing: total units; Residence concept: Usual; Reference period: July 1, 2001-04.

Survey and date: ACS supplementary survey: 2000; Source of controls[A]: 2000 Census; Weighting area: County or county combinations; Items weighed: Population: age group,[B] sex, race/Hispanic origin (6). Housing: total units; Residence concept: Usual; Reference period: Apr. 1, 2000.

Survey and date: ACS supplementary survey: 2001-04[C]; Source of controls[A]: ICPE benchmarked to 2000 Census; Weighting area: County or county combinations; Items weighed: Population: age group[B] (14), sex, race/Hispanic origin (6). Housing: total units; Residence concept: Usual; Reference period: July 1, 2001-04.

Survey and date: Full ACS: 2005-09[D]; Source of controls[A]: ICPE benchmarked to 2000 Census adjusted to ACS residence concept (adjustment methodology not announced); Weighting area: Three options:[E] (1) intercensal estimates (usual residence) for large areas and ACS estimates (current residence) for small areas; (2) option 1 but model-based estimates to modify intercensal estimates for large areas to current residence; (3) develop methods to generate current residence estimates for all small areas; Other: same areas as 2000 Census long form using intercensal estimates and detail from 2000 Census; Items weighed: Not announced; Residence concept: Current; Reference period: Not announced[F]; Comments: Residence concept changes; weighting area options part of research program to determine weighting areas for use with 3- and 5-year averages for 2010.

Survey and date: Full ACS: 2010[E]; Source of controls[A]: 2010 Census adjusted to ACS residence concept (adjustment methodology not announced); Weighting area: Not announced[G]; Items weighed: Not announced; Residence concept: Current; Reference period: Not announced; Comments: 2010 short-form data replace ICPE estimates.

Survey and date: Full ACS: 2011; Source of controls[A]: ICPE benchmarked to 2010 Census adjusted to ACS residence concept (adjustment methodology not announced); Weighting area: Not announced[G]; Items weighed: Not announced; Residence concept: Current; Reference period: Not announced.

Survey and date: Full ACS: 2012; Source of controls[A]: ICPE benchmarked to 2010 Census adjusted to ACS residence concept (adjustment methodology not announced); Weighting area: Not announced[G]; Items weighed: Not announced; Residence concept: Current; Reference period: Not announced.

Source: GAO analysis of Census Bureau documents.

[A] The Intercensal Population Estimates Program (ICPE) develops and disseminates annual estimates of the total population and the distribution by age, sex, race, and Hispanic origin for the nation, states, and counties, and of the total population for subcounty functioning government units. ICPE is authorized by 13 U.S.C. §181, which requires the production of "current data on total population and population characteristics." ICPE estimates, benchmarked to the latest Decennial Census counts, are compiled using administrative record data on births, deaths, and migration. Because they are benchmarked to the census, they reflect the usual residence concept. They are adjusted to reflect the counts as of July 1 each year.

[B] Not available.

[C] Assumes that the test program and supplementary surveys end after 2004.

[D] When 2010 Census estimates become available, ICPE estimates beginning with 2001 will be revised to reflect the new benchmark. There is no announced use of revised ICPE estimates to revise previously published ACS estimates.

[E] The three options are from Alfredo Navarro, "American Community Survey: Use of Population Estimates as Controls in the ACS Weighting," presented at the Census Bureau Advisory Committee of Professional Associations meeting, Washington, D.C., October 23, 2003.

[F] One of the Department of Commerce comments on our draft report stated that the Census Bureau would be using July 1 as the reference period for a given year's ACS annual average.

[G] There is no announced use of the tract or block group data from the 2010 Census.

[End of table]

Using ICPE for the ACS supplementary surveys, the Census Bureau prepared controls for counties or combinations of counties. As shown in table 3, controls from the 2000 Census and ICPE, which were based on the usual residence concept, were used. The reference period for the ACS test program for all years except 2000 was July 1; for 2000, it was April 1. (Controls for the 2000 Census long form also were for April 1.)

For the full ACS, the Census Bureau will use controls based on the current residence concept. According to the Census Bureau, the current residence concept recognizes that the place of residence does not have to be the same throughout a year, so it allows the ACS data to more closely reflect the actual characteristics of each area. The Census Bureau will use the current residence concept because the ACS is conducted every month and produces annual averages rather than point-in-time estimates, as the Decennial Census does. Also, because the ACS data are collected monthly, it will be necessary to use independent controls that define the reference period as the average for the year, using a July 1 reference period.

To produce ACS estimates for the full sample, the Census Bureau will need new methodologies for calculating independent controls. For the first annual estimates, for 2005, a methodology will be needed to provide ACS-defined controls for all places with populations of 65,000 or more, including those for which intercensal population estimates are not available. For the 2005-07 estimates, which will be used to calculate the first multiyear averages, a methodology for controls for geographic areas with populations between 20,000 and 65,000 will be needed. For the 2008-12 estimates, a methodology for controls down to the geographic levels used for the 2000 Census long form will be needed. Finally, when the population and housing characteristics data from the 2010 Census short form become available and are incorporated into the ICPE estimates, another new methodology will be needed to revise the ACS controls for 2010.[Footnote 90] The Census Bureau also has reported that it is not planning to revise earlier years' ACS data for consistency with revised 2010 estimates unless the inconsistencies between the 2010 ICPE and 2010 Census characteristics were significant.
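The basic mechanics of controlling survey weights to independent population totals can be shown with a short sketch. The control cells, control totals, and base weights below are hypothetical, and the ratio adjustment shown is a generic post-stratification step, not the Census Bureau's actual ACS weighting methodology.

```python
# Illustrative ratio adjustment of survey weights to independent controls.
# Hypothetical data; not the Census Bureau's actual weighting methodology.

from collections import defaultdict

# Sample records: (control_cell, base_weight). In practice the cells might
# cross county, age group, sex, and race/Hispanic origin.
sample = [
    ("county_A_age_18_44_male", 85.0),
    ("county_A_age_18_44_male", 92.5),
    ("county_A_age_18_44_female", 88.0),
    ("county_A_age_45_64_female", 90.0),
]

# Independent control totals for each cell (e.g., from intercensal estimates).
controls = {
    "county_A_age_18_44_male": 40_000,
    "county_A_age_18_44_female": 21_000,
    "county_A_age_45_64_female": 18_500,
}

# Sum base weights within each control cell.
weighted_totals = defaultdict(float)
for cell, weight in sample:
    weighted_totals[cell] += weight

# Ratio-adjust each record's weight so weighted totals match the controls.
adjusted = []
for cell, weight in sample:
    factor = controls[cell] / weighted_totals[cell]
    adjusted.append((cell, weight * factor))

for cell, weight in adjusted:
    print(f"{cell}: adjusted weight {weight:,.1f}")
```

Whatever methodology the Census Bureau ultimately adopts, the choice of control cells and of the geographic level at which controls are applied determines how closely small-area ACS estimates are forced to agree with the independent estimates.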
Table 4 shows the differences between population estimates at the county level for 2000 using ICPE based on the 1990 Census and the corresponding data from the 2000 Census. In 2000, the population estimates for almost 20 percent of the counties differed by more than 5 percent. For counties whose population was smaller than 20,000, almost 25 percent had similar differences.

Table 4: Population Comparison for Counties in 2000 from ICPE and 2000 Census by County Size:

County population[A]: All counties; Total: 3,141; Number of counties with ratio of less than 0.90: 118; of 0.90-0.949: 384; of 0.95-0.999: 1,722; of 1.00-1.049: 809; of 1.05-1.099: 88; of 1.10 or more: 20; Percentage of counties with ratio less than 0.95 or more than 1.05: 19%.

County population[A]: Less than 20,000; Total: 1,348; Number of counties with ratio of less than 0.90: 67; of 0.90-0.949: 174; of 0.95-0.999: 672; of 1.00-1.049: 355; of 1.05-1.099: 63; of 1.10 or more: 17; Percentage of counties with ratio less than 0.95 or more than 1.05: 24%.

County population[A]: 20,000 to less than 65,000; Total: 1,046; Number of counties with ratio of less than 0.90: 41; of 0.90-0.949: 112; of 0.95-0.999: 576; of 1.00-1.049: 294; of 1.05-1.099: 20; of 1.10 or more: 3; Percentage of counties with ratio less than 0.95 or more than 1.05: 17%.

County population[A]: 65,000 to less than 250,000; Total: 516; Number of counties with ratio of less than 0.90: 9; of 0.90-0.949: 62; of 0.95-0.999: 315; of 1.00-1.049: 125; of 1.05-1.099: 5; of 1.10 or more: 0; Percentage of counties with ratio less than 0.95 or more than 1.05: 15%.

County population[A]: 250,000 or more; Total: 231; Number of counties with ratio of less than 0.90: 1; of 0.90-0.949: 36; of 0.95-0.999: 159; of 1.00-1.049: 35; of 1.05-1.099: 0; of 1.10 or more: 0; Percentage of counties with ratio less than 0.95 or more than 1.05: 16%.

Sources: Census Bureau reports and GAO analysis.

Note: Initial intercensal estimates for 2000 were benchmarked to the 1990 Census; counties include county equivalents, such as parishes in Louisiana.

[A] Population classes reflect the level of geographic area detail to be calculated from the ACS. For example, geographic areas with populations smaller than 20,000 will be available using 5-year averages.

[End of table]

Census Bureau staff had long recognized the need for new methodologies to develop independent controls for the ACS. For example, a 1995 paper by Love, Dalzell, and Alexander, discussing issues related to evaluating the 1996 test site results, expressed concern about independent controls and residence rules, as well as the need for consultation with users.[Footnote 91] In 1998, the Census Bureau sponsored a conference on the quality of ACS data for rural data users.
In the final report on this conference, the Westat authors concluded that the Census Bureau needed to continue and expand its contacts with stakeholders and to conduct additional research on several issues, including independent controls.[Footnote 92] Alexander and Wetrogan also discussed this issue at the 2000 Joint Statistical Meetings when they reviewed possible methods for using ICPE estimates.[Footnote 93] They also noted the need to consult with users on how to present information on the differences in ACS controls and ICPE in ACS publications. Census Bureau staff also recognized that the new ACS would create differences between (1) ACS population and housing characteristics data and the corresponding "official" data from the Decennial Census and (2) ACS population and housing characteristics data and the "official" ICPE population estimates, which are benchmarked to Decennial Census data. They also recognized that the creation of new controls for the ACS would result in inconsistencies between ACS data and data from federal household surveys, such as the CPS, whose population and housing characteristics are also based on the Decennial Census and ICPE estimates. Such differences might hinder the use of ACS data to expand and improve small geographic area estimates based on the other surveys. (CPS provides official national estimates of labor force information, such as the unemployment rate and income estimates used to calculate the number of persons in poverty.) In March 2003, the Census Bureau announced that it did not have a final methodology and that such methodologies would not be established for several years. In March 2003 in American Community Survey Operations Plan, Release 1, the Census Bureau identified research projects to be completed in preparation for full implementation of the ACS. One of these projects, "weighting and estimation," covered the methodology for calculating the independent controls for the ACS; a second, "program of integrated estimates," covers the calculation of these controls from the ICPE. This plan also reported that the results of these research projects would not be available in 2004 to begin implementing them with the start of the full ACS. Instead, the Census Bureau said it would use interim procedures and that it would take "extensive long-term investigation and experimentation" to develop final procedures. For the weighting and estimation project, the Census Bureau reported that it would be using an interim adjustment to adjust the intercensal population and housing characteristics estimates to the ACS residence and reference period concepts. This project would include research to examine the need to achieve agreement between the estimates of occupied housing units, households, and householders at all geographic levels. The Census Bureau reported that work on the project to revise and simplify the weighting methodology began in early 2003, that preliminary papers documenting the revisions might be available by summer 2004, and that research would continue for several years. For the program of integrated estimates project, the operations plan discussed the need for more research to introduce improvements to the ICPE estimates using information from the 2000 Census, more current ACS distributions of population characteristics, and administrative records to produce improved population and housing unit estimates for all areas, including small geographic areas. The plan also discussed improving the housing characteristics. 
ACS distributions of local area vacancy rates and household characteristics can be incorporated into statistical models that use distributions of housing unit characteristics to better estimate subcounty populations. No time schedule was provided for completing the research.[Footnote 94]

In October 2003, Census Bureau staff presented a paper at the Census Advisory Committee meetings that described the options being considered to convert the ICPE estimates to the current residence concept.[Footnote 95] The paper described options for determining controls for ACS implementation in fall 2004 but did not indicate that research was under way to determine the options' effects. A second paper at the same meetings reported on research to improve the intercensal estimates by using ACS data for two "high priority" areas--international migration and internal migration. This work was being conducted as part of the Program of Integrated Estimates.[Footnote 96]

Although the latest NAS report on the ACS does not specifically note issues relating to independent controls, we asked experts who had participated in preparing NAS reports, as well as other experts in small area data, the following question about ACS weighting: "Given the newly benchmarked intercensal estimates, the following question arises regarding the use of the 2010 Census data in the ACS: Should ACS estimates continue to be controlled to 2010 Census data at the county or county group level and differences between the ACS and census population counts and characteristics allocated proportionately to the tract or block group levels? Or should ACS estimates be controlled to 2010 Census data at the tract and block group level, as would have been the case with a long form?" All the experts agreed that the ACS should be controlled to the decennial census, but several noted that they had not thought about the issue and had not heard anything from the Census Bureau on the issue. (The experts are listed in app. I.)

Operational Issues:

The Census Bureau has identified operational issues with the ACS test programs, primarily from evaluation studies of the 2000 Decennial Census and from Census Bureau staff research papers comparing data collected in the ACS 2000 supplementary survey with data from the 2000 Decennial Census long form. These issues include problems with questionnaire design, nonresponse followup, and data capture, as well as coverage of persons living in group quarters.

In January 2004, the Census Bureau released the results of a key evaluation study of 2000 Decennial Census long-form data, using a reinterview survey.[Footnote 97] The study identified problems with long-form questions, which are the same as those used for the ACS, and proposed several research efforts based on a statistical evaluation of the quality of the responses to each question. For questions identified as having significant quality problems, the study recommended research on the design of the form and placement of the questions and suggested using cognitive experts in testing revised questions. The study also recommended that the Census Bureau and BLS work on the ACS employment and unemployment questions to ensure that they would complement the BLS local area unemployment statistics program.
The Census Bureau also conducted a study to evaluate the design of the ACS questions that are needed to implement the residence concept and reference period for the ACS.[Footnote 98] The study suggested that additional testing was needed for the questions about multiple residences (currently, the last set of questions in the housing section). It noted "that asking these questions on a person basis may produce different and probably better data than asking them on a household basis."[Footnote 99] The study was limited in scope and did not assess how accurately ACS respondents assign persons associated with the household to a current residence. In the ACS, the Census Bureau uses "In the past 12 months . . ."; for the long form, it used "In 1999 . . ." Because the reference date is not fixed, it is important for a respondent to supply the date on which the ACS questionnaire was filled out. Otherwise, it cannot be determined whether there is an inconsistency in an ACS questionnaire received in late April 2004 that lists a resident aged 10 with a birthdate of April 15, 1993.[Footnote 100]

Census Bureau staff also discussed operational issues in research papers, based on evaluations of comparisons between 2000 Decennial Census long-form and ACS 2000 supplementary survey data for selected items, presented at the 2003 Joint Statistical Meetings. A paper on income data identified the new question on the reference period as a potential source of problems, even though an additional instruction had been added to the ACS questionnaire in 1999.[Footnote 101] The authors expressed concern that some ACS respondents may misinterpret the question on "income in the past 12 months" as a request for monthly income instead of income during the previous year. The paper also included recommendations for additional research on the effect of the data capture methods. For the 2000 long form, all data items were entered with an automated optical character recognition procedure; data from the ACS will be manually keyed. Another paper, presented at the same 2003 meetings, that evaluated differences in the data on disabled persons found large and significant differences at the national level and also recommended that new questions be tested.[Footnote 102] Additional areas were identified for further research, based on evaluations of questions such as educational enrollment, ancestry, and grandparents caring for grandchildren. These areas included specific facets of the mailout-mailback system and nonresponse followup. For example, nonresponse followup for the 2000 long form was conducted for all nonrespondents, but for the ACS test program and for the full ACS, nonresponse followup will be conducted for a one-third sample of all nonrespondents.

The Census Bureau also has discussed issues with the expansion of ACS coverage to include persons living in group quarters--for example, nursing homes, prisons, college dormitories, military barracks, institutions for juveniles, and homeless shelters.[Footnote 103] In October 2002, it informed its advisory committee members of the formation of a special planning team to address issues on the definition of group quarters and duplication in the address file. From the minutes of this meeting, it appears that this team will focus on group quarters in the context of the 2010 Census short form.
In the ACS March 2003 operations plan, the Census Bureau reported on a new project to cover group quarters in the full ACS.[Footnote 104] The Census Bureau reported that the special project was needed because of the special challenges of developing an updated address list; in the past, such a list had been updated only once a decade. According to the Census Bureau, tests on the new list were to be completed in time for use in the full ACS in January 2005. In addition, an internal planning document issued in October 2003 identified group quarters (and residence rules) as special problems and instructed staff to provide recommendations on the collection of data on them in January 2004.[Footnote 105] The Census Bureau usually tests new questions; according to recent Census Bureau decisions, those tests would have to be completed in time for new questions to be incorporated into the 2008 ACS questionnaire.[Footnote 106]

Valuation and Presentation of Dollar-Denominated Data Items:

The Census Bureau has adjusted all dollar-denominated items from the ACS testing programs, such as incomes, housing values, rents, and housing-related expenditures, for inflation. For example, ACS data for 2001 and 2002 released in September 2003 for median household income are expressed in 2002 dollars. This practice means that when each added year of ACS data is released, all dollar-denominated items for prior years will be revised. The Census Bureau makes a similar adjustment for the annual income data collected in the CPS; unlike the ACS data, however, the annual CPS data are released without the adjustment. In addition, the annual values collected in the ACS were adjusted to the calendar year. The Census Bureau will be using the CPI for the annual and monthly adjustments for all geographic areas.

A report prepared for HUD found problems with the adjustment, including (1) the lack of a "trending" adjustment in the calculation of annual averages, (2) the use of the adjustment for multiyear averages, (3) the adjustment for cost of living for data items other than income, and (4) the lack of the unadjusted annual data that would enable HUD to use alternative methodologies. In addition, research by Census Bureau staff questioned the adjustment for incomes when they found that it was a probable source of difference between income data from the supplementary survey and corresponding data from the CPS and the 2000 Census long form.[Footnote 107]

The report prepared for HUD provided a detailed review of HUD's use of the ACS for program applications. On the methodology for the inflation adjustment, the report stated that the first step should be a trending adjustment that would convert the reported monthly data to a calendar year basis. Discussing this omission, the report stated, "Making an inflation adjustment is not the same as trending. The cost of living adjustment assumes that the purchasing power measured at any point in the data collection period remains constant throughout the period. For example, assume that the cost of living rises by 3 percent a year. If a household reports an annual income of $50,000 in January, a cost of living adjustment to the end of the year would increase this income to $51,500, the amount needed in December to equal the purchasing power of $50,000 in January. A trending adjustment makes no assumption about purchasing power. It attempts to track movements in dollar income. Assume that dollar income is growing at 5 percent a year.
Then a trending adjustment to the end of the year would increase the $50,000 reported in January to $52,500 in December."[Footnote 108] HUD's second concern was that the methodology the Census Bureau used to calculate the adjustment was not appropriate for multiyear averages. The HUD report stated, "The Census Bureau plans to report income in constant dollars. Income information collected in the various months will be adjusted for inflation so that all collected income will be expressed in dollars with the same purchasing power, presumably the purchasing power of dollars in December of the survey year. For moving average tabulations, all income information will be adjusted for changes in purchasing power over the period used to calculate the moving average. In other words, income reported by a respondent in the first month of a five-year moving average will be adjusted for almost five years of inflation."[Footnote 109] To illustrate this problem, the HUD report gave the following example: "The standard Census Bureau tables for areas over 65,000 will tabulate the rents reported by respondents over the twelve months during which data were collected. A unit reporting a contract rent of $800 in January might actually be paying $850 in December. The standard table would record this unit as having a rent of $800. The standard Census Bureau tables for areas under 20,000 will tabulate rents reported by respondents over a sixty-month period. A unit reporting a contract rent of $800 in the January of the first year might actually be paying $1,070 in December of the fifth year. The standard table would record this unit as having a rent of $800."[Footnote 110] Such changes would not be captured with an adjustment based on the all- items CPI. The HUD report also noted that the inflation index the Census Bureau proposed related to income and not to the other dollar value data, such as rent, utility costs, or home value, where a purchasing power concept did not meet HUD's needs. The report concluded that to overcome this problem, before HUD could use dollar-denominated data from the ACS, it would first have to eliminate the inflation adjustment from the published data. The report stated: "For applications that involve trending income, HUD users will have to center the ACS information at an appropriate point in the collection period and remove the inflation adjustment before applying a trending factor."[Footnote 111] In addition, it noted that: "The ACS will generate income distributions comparable to those from the decennial census, but the distributions will have a feature that will complicate the use of income data from the ACS in APP [HUD's Annual Performance Plan] measures. Whereas the decennial long form measures money income, the ACS reports average purchasing power."[Footnote 112] The report thus recommended that HUD use the unadjusted data--data that the Census Bureau had not planned to publish--in order to make the changes needed for HUD. The validity of the Census Bureau's inflation adjustment was also questioned in research Census Bureau staff conducted to evaluate differences between the data reported in the ACS supplementary surveys. 
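The arithmetic behind the HUD report's distinction between a cost-of-living adjustment and a trending adjustment can be restated in a short sketch that uses the report's illustrative figures; the growth rates below are the rates assumed in those examples, not measured values.

```python
# A minimal sketch contrasting a cost-of-living (CPI) adjustment with a
# trending adjustment, using the HUD report's illustrative figures.

def cost_of_living_adjust(amount, annual_inflation, years):
    # Express an amount in later dollars of equal purchasing power.
    return amount * (1 + annual_inflation) ** years

def trend_adjust(amount, annual_growth, years):
    # Project an amount forward at an assumed rate of dollar growth.
    return amount * (1 + annual_growth) ** years

income_reported_in_january = 50_000
print(f"CPI adjustment to December (3%/yr): ${cost_of_living_adjust(income_reported_in_january, 0.03, 1):,.0f}")
print(f"Trending to December (5%/yr):       ${trend_adjust(income_reported_in_january, 0.05, 1):,.0f}")

# A contract rent of $800 reported in the first month of a 60-month
# (5-year) collection period could drift to roughly the $1,070 cited in
# the HUD example, yet the standard table would still record $800.
rent_reported_month_1 = 800
print(f"Possible rent after 5 years (~6%/yr): ${trend_adjust(rent_reported_month_1, 0.06, 5):,.0f}")
```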
In a paper presented at the 2003 Joint Statistical Meetings, staff evaluated differences between income data from C2SS and the 2000 Census long form, as well as the CPS.[Footnote 113] The paper summarized the major differences in the income data from these sources in terms of data collection, capture, and processing and provided preliminary assessments of their contributions to these differences. The authors noted the need for further research on the effect of the difference in reference period and the inflation adjustment, as well as operational aspects such as data capture. With regard to the inflation adjustment, they reported: "If no CPI adjustment had been made to the dollars reported on either Census 2000 or C2SS/ACS, the difference between medians at the U.S. level would have been smaller than the 4.6 percent shown in Table 3 [omitted]. Instead, the difference would have been 2.5 percent. Since adjustment clearly played a role in determining the size of the difference between Census 2000 and C2SS/ACS estimates, it would be worthwhile to examine the costs and benefits of adjusting C2SS/ACS incomes as well as the choice of factors used to adjust them."[Footnote 114] The authors summarized their findings by concluding that "it is clear that we are just at the beginning stages of understanding why Census 2000 and C2SS income figures differ."[Footnote 115] They noted that the income comparisons are most critical because these Census Bureau data are used in the calculation of the number of people in poverty.

In a December 2003 research paper, Census Bureau staff examined concerns about the absence from the official poverty measures of an adjustment for geographic differences in cost of living. Like the ACS, for which the Census Bureau assumes that the cost of living is the same throughout all geographic areas, the poverty measures are based on the same assumption. The authors concluded that the use of a poverty measure that takes into account geographic differences in housing costs would significantly change the poverty measures in many states.[Footnote 116]

Evaluations of ACS, Long-Form, and CPS Data Comparisons:

One of the Census Bureau's major justifications for the ACS test programs has been its comparison of data collected in these programs with corresponding data from the 2000 Decennial Census short and long forms to identify operational problems. Another major justification for the ACS test programs has been the use of these comparisons, and of comparisons with corresponding data from the CPS, to inform users in making the transition from the 2000 long form to the ACS. The Census Bureau's 1999 request to OMB for approval of the forms for the ACS test programs stated that: "to make a transition from the Census 2000 long form to collecting long-form data throughout the decade, we will begin ACS data collection in 1,203 counties. This data collection will allow for comparison of estimates from Census 2000 with estimates from the ACS for all states, large cities, and population subgroups, and will help data users and the Census Bureau understand the differences between estimates from the ACS and the Census 2000 long form."[Footnote 117] In testimony to the Congress a year later, Kenneth Prewitt, the Census Bureau's Director, referred to the ongoing ACS test programs: "These data will also contribute to a comparison with data from Census 2000 that is necessary because there are differences in methods and definitions between the census and the ACS.
Moreover, decision makers will want to compare an area's data to those from Census 2000. Comparisons using data from the operational test and from the 31 sites are essential to determine how much measured change between Census 2000 and future years of the ACS is real and how much is due to operational differences between the ACS and the census."[Footnote 118] When the Census Bureau began in 2001 to report on full implementation of the ACS, its first report focused on the operational feasibility of conducting the ACS.[Footnote 119] Its second report in 2002 focused on differences in operational characteristics of the ACS and the census long form, such as response rates and the extent of imputations.[Footnote 120] The 2002 report stated that three reports evaluating differences between the ACS and census long form would be published at the end of 2003.[Footnote 121] The Census Bureau repeated this schedule in March 2003 when it released another official report on ACS plans.[Footnote 122] In September, we were told by one of the ACS experts that consultants had been hired to conduct evaluations for 4 of the 31 test sites. The reports on comparisons with long-form items and for the test sites were published in May, June, and July 2004. The results of these comparisons are similar to comparisons and evaluations of long-form data items previously prepared by Census Bureau staff, BLS, and GAO. In September 2002, we prepared national and state comparisons between the 2000 ACS supplementary survey and the 2000 Decennial Census long form for about 10 items and between the 2000 ACS supplementary survey and the 2000 CPS for the poverty and unemployment rates. From the long-form comparisons, we reported that: "These comparisons showed large national differences for key items that did not appear to be accounted for by coverage differences between the two surveys. For example, at the national level, the largest differences were for these items: (1) for the number of housing units lacking complete plumbing facilities, with the long-form estimate 27 percent higher than the estimate from the supplementary survey, and (2) for the number of unpaid family workers, with the long-form estimate 59 percent lower. . . . We also found a great degree of variation in the state differences between the long form and the supplementary survey."[Footnote 123] From the CPS comparisons, we reported that: "We found that at the national and state levels, there were small differences for the unemployment rate and for the poverty rate for all individuals. In contrast, comparisons of these rates for the CPS with these two surveys showed larger differences. The national unemployment rate, according to the CPS, was 4.0 percent, compared with 5.8 percent for the long form and 5.4 percent for the supplementary survey. The national rate for individuals in poverty for the CPS was 11.3 percent, compared with 12.4 percent for the long form and 12.5 percent for the supplementary survey."[Footnote 124] Given these results, we recommended that the Census Bureau expand the scope of evaluation studies to develop supplementary survey estimates for states and large places consistent with the 2000 long form and that it include in its evaluations comparisons of year-to-year changes for 2001 and 2002, using data from the supplementary surveys and the CPS at the national and state levels for key economic and housing items.
In September 2003, BLS received a report from a consultant who had been hired to evaluate differences between labor force data, such as the unemployment rate, reported in the ACS test programs and the CPS.[Footnote 125] The evaluation's purpose was to provide BLS with information on whether and how to incorporate ACS data into its measures of unemployment and the labor force. The consultant compared several labor market indicators from the CPS and ACS for 2000-02 at the national and state levels: "Relative to the CPS, the ACS consistently generates lower estimates of the labor force and employment but higher estimates of unemployment. These patterns are present in each of the years 2000, 2001, and 2002. They are repeated in nearly all state-level data as well."[Footnote 126] He made a series of recommendations for additional research, some requiring additional information from the Census Bureau. The need for such research was also reported in a January 2004 Census Bureau report that examined differences between labor force data from the CPS and the 2000 Decennial Census long form.[Footnote 127] Other findings and recommendations for further research similar to ours and those of BLS were also reported in research papers Census Bureau staff presented at the 2003 Joint Statistical Meetings. One paper on comparisons of income data for 2000 from the 2000 Decennial Census long form and the 2000 ACS Supplementary Survey reported that it: "provided a summary of the major differences between the two income data sources, in terms of data collection, capture, and processing, and provided very preliminary assessments of the possible role these differences may have played."[Footnote 128] The authors reported that additional work was needed to understand the differences and offered recommendations for further research. Another paper presented at the same meetings examined differences between the national estimates for people aged 5 or older with a disability--48.9 million was the 2000 Census long-form estimate, 39.7 million the C2SS estimate. The author did not determine which estimate was more reliable but did find that the wording of some questions might explain the overall difference. In addition, the author reported that more work, such as additional analysis of currently available data and testing of new questions, was needed to clearly identify the reasons for the difference.[Footnote 129] The differences in disability data were also the subject of a National Council on Disability position paper, which recommended changes to the questions on disability.[Footnote 130] In addition to results from these comparison studies, the NAS Panel on Research on Future Census Methods found in July 2003 that the Census Bureau needed to complete evaluations of differences between 2000 Census long-form data and data from the ACS test sites and the 2000-02 Supplementary Surveys. Specifically, the panel stated that: "The Census Bureau should carry out more research to understand the differences between and relative quality of ACS estimates and long-form estimates, with particular attention to measurement error and error from nonresponse and imputation. 
The Census Bureau must work on ways to effectively communicate and articulate those findings to interested stakeholders, particularly potential end users of the data."[Footnote 131] The panel also stated that, to facilitate this effort, "The Census Bureau should make ACS data available (protecting confidentiality) to analysts in the 31 ACS test sites to facilitate the comparison of ACS and census long-form estimates as a means of assessing the quality of ACS data as a replacement for census long-form data. Again, with appropriate safeguards, the Census Bureau should release ACS data to the broader research community for evaluation purposes."[Footnote 132] Information on Key Properties of Multiyear Averages: One of the major differences between the ACS and the long form it will replace is that the ACS will provide data for geographic areas with populations smaller than 65,000 in terms of multiyear averages. Because of the statistical properties of these averages and users' unfamiliarity with them, we and many other stakeholders have identified these averages as a major challenge for users, including federal agencies. The Census Bureau has recognized the need for guidance for users on these averages but has not made public plans for the topics to be discussed or when the guidance will be published. From the 1998 conference that the Census Bureau had asked Westat to conduct on the quality of ACS data for rural data users, the report's authors concluded that "On the basis of the full exchange between the Bureau and the participants, they saw no evidence of an antirural bias in the design of the ACS."[Footnote 133] Nevertheless, they also concluded that the Census Bureau needed to expand its contacts with stakeholders and to conduct additional research on several issues we discussed in our report, including population controls, operational aspects of nonresponse followup, and multiyear averages. For these averages, the conference report noted that there would be issues with small geographic areas and the interpretation of changes in these averages: "In discussing this issue, a number of the participants thought that averages were particularly problematic for those areas in which change is irregular. For example, the question was raised as to the meaning of 'average poverty' over a 5-year period in which poverty rose and fell from one year to the next and, thus, the average would have no obvious meaning."[Footnote 134] The report made similar comments with regard to such characteristics as unemployment and income. Although the conference participants had generally agreed with these concerns, the report pointed out that annually updating the 5-year averages "will provide some insight into trends, although turning points will be difficult to discern precisely, as will short-term trends."[Footnote 135] About a year later, the Census Bureau had Westat convene another conference, this one focusing on the use of multiyear averages. The 1999 report concluded: "Although a 5-year moving average will generally provide reasonably reliable cross-section statistics for all areas, including very small communities, some care will have to be exercised in choosing time periods for which changes in population or their characteristics are measured. With 5-year averages, four-fifths of the data in a pair of neighboring years will be identical. The change being measured will then be one-fifth of the difference between the most recent year and the first year of the earlier time period.
The sampling errors of the differences will thus be based on annual sample sizes, not 5-year averages, and will generally be too large to make useful inferences for small areas. The two 5-year averages that are being compared should generally be discrete and non-overlapping periods, e.g., 2003-2007 and 2008-2012. These comparisons will have about the same reliability as changes between two censuses using data collected in the Census long form."[Footnote 136] Census Bureau staff have been well aware of the difficulties of using the new multiyear averages. The Census Bureau's Charles Alexander presented a paper at a 2001 Statistics Canada conference on statistical methodology in which he recognized that the multiplicity of estimates for the same geographic area would be an issue for users (and for the Bureau). He said that the Census Bureau's presentation would: "encourage analysts to use the same length of cumulation when comparing areas of different sizes . . . . For example, we would use one year for comparing states, but would recommend 5 years for all the counties in a table comparing large and small counties."[Footnote 137] Alexander noted that this approach differed from that of Kish, the developer of the concept of a "rolling sample," who would "let us use tables of counties with one-year estimates for large counties, 3-year averages for medium-sized ones, and 5-year averages for small ones." He concluded this section of the paper by saying, "It will be interesting to see what practices data users will adopt in this regard."[Footnote 138] At the fall 2002 Census Advisory Committee Meetings, Navarro presented a paper that Alexander had written. Focusing primarily on the quality of the 5-year averages, the paper noted advantages and shortcomings, including that multiyear averages are not useful in all situations. For example, "If there is little change in the population over the time covered by the average, the interpretation is about the same as that of a point-in-time estimate with the advantage that the ACS estimate is more current than the historical decennial census long-form estimate."[Footnote 139] The paper provided examples with "naive" assumptions about how users extrapolate between censuses to show that multiyear averages "work." By implication, under other conditions, users will need guidance on when multiyear averages can be used. The paper also did not discuss the interpretation of changes in the multiyear averages, an issue raised in the 1999 Westat conference report, or the issue of multiple estimates, which Alexander had discussed in his paper for the 2001 Statistics Canada conference. In September 2002, two reports focused on issues related to the statistical properties of multiyear averages. We published a report on several aspects of the ACS, including the selection of questions and the feasibility of conducting the ACS as a voluntary survey, and HUD released a report prepared for its staff on the use of the ACS for HUD programs.[Footnote 140] We stated in our report that the Census Bureau evaluation would not discuss "measures of stability of annual ACS data and ACS multiyear averages."
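The overlap arithmetic in the 1999 Westat report quoted above can be made concrete with a small sketch. The annual figures and sampling variance below are invented for illustration and assume independent annual samples; they are not ACS data.

# Illustration of the Westat report's point about comparing successive
# (overlapping) 5-year averages. All numbers are hypothetical.

annual = {2003: 10.0, 2004: 10.4, 2005: 10.1, 2006: 10.6, 2007: 10.9, 2008: 12.4}

avg_2003_2007 = sum(annual[y] for y in range(2003, 2008)) / 5
avg_2004_2008 = sum(annual[y] for y in range(2004, 2009)) / 5

# Four of the five years (2004-2007) are shared, so the difference between
# the two averages is one-fifth of the difference between the year that
# enters (2008) and the year that drops out (2003).
change = avg_2004_2008 - avg_2003_2007
print(round(change, 2))                             # 0.48
print(round((annual[2008] - annual[2003]) / 5, 2))  # 0.48 as well

# If each annual estimate has sampling variance v and the annual samples
# are independent, then change = (y_new - y_old) / 5, so
# Var(change) = (v + v) / 25; the reliability of the measured change rests
# on single-year sample sizes, as the Westat report notes.
v = 0.09
print(2 * v / 25)                                   # 0.0072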
We recommended that, as a first step, the Census Bureau: "Analyze and report on differences between year-to-year changes for 2001 and 2002, using the data--from ACS special supplements and the CPS at the national and state levels--for key economic and housing characteristics, such as the unemployment and poverty rates, to determine the stability of the annual ACS data."[Footnote 141] We also discussed the need for additional information on the characteristics of the multiyear averages to help federal agencies make the transition to the ACS. We specifically noted the need for information on the selection of ACS data for geographic areas with populations larger than 20,000 for which there will be multiple estimates. On this issue, we stated that, "In addition, we found that the ACS development program did not cover information about different ways to integrate the annual data for states and large counties and the 3- and 5-year averages for smaller counties."[Footnote 142] For example, federal agencies that need state data can choose to use the annual data, multiyear averages of the annual data, or 3-year or 5-year ACS averages. Federal agencies that also need county data can choose to use the most recent annual data for large counties and adjust the averages of the smaller counties to agree with annual data. Alternatively, they can choose to use various combinations of multiyear averages. As many federal agencies, as well as state and local governments, will be using the ACS data for allocating funds, Census Bureau guidance would reduce the inconsistent use of the multiple estimates. HUD is a major user of Decennial Census long-form data for various program applications. Its contract with ORC Macro to review how the ACS will affect HUD programs that previously relied on the Decennial Census long form for geographic area data resulted in a report that made two points about the multiyear averages, in addition to raising the previously discussed issues on the inflation adjustment to income. One of these issues related to interpretations of changes in the multiyear averages and their stability; the other related to the availability of multiple estimates for the same area. The ORC Macro report noted that year-to-year stability is important and needs to be addressed. It warned that the "differences in the precision of estimates or year-to-year changes in estimates can create problems for HUD applications."[Footnote 143] The report used eligibility and level of benefits as an example of what could vary because of the effect of sampling variability on these changes. ORC Macro also stated: "The ACS will report data using different reporting periods for different sized areas. Inconsistent or multiple reporting periods can create problems for HUD applications."[Footnote 144] ACS data for many geographic areas will be available in terms of annual estimates and 3- and 5-year averages, and the annual and 3-year averages (for larger areas) will be available before estimates for smaller areas. As a result, HUD will have to choose from multiple measures for some geographic areas.
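One simple way to carry out the kind of adjustment described above, scaling smaller-county multiyear averages so they agree with a more current control total, is a proportional (ratio) adjustment. The sketch below uses invented counts purely for illustration; it is not a method prescribed by the Census Bureau, HUD, or any other agency.

# Hypothetical illustration: scale small-county 5-year averages so that,
# together with a large county's single-year estimate, they sum to a
# current statewide annual total. All figures are invented.

state_annual_total = 125_000          # current single-year state estimate
large_county_annual = 60_000          # single-year estimate, large county
small_county_5yr = {"County A": 21_000, "County B": 19_000, "County C": 20_000}

# Amount left for the small counties after taking the large county's
# annual figure at face value.
remainder = state_annual_total - large_county_annual

# Ratio adjustment: preserve the small counties' 5-year shares while
# forcing their sum to match the remainder.
scale = remainder / sum(small_county_5yr.values())
adjusted = {name: value * scale for name, value in small_county_5yr.items()}

for name, value in adjusted.items():
    print(name, round(value))
# County A 22750, County B 20583, County C 21667  (sum = 65,000)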
The study noted that HUD might decide to (1) continue to use 2000 long-form data until 2008, when the first 5-year average data will be available for all levels of geography, or (2) use the most recently available data in all cases.[Footnote 145] ORC Macro's report also expressed concern about the amount of annual ACS data that the Census Bureau will release for areas with populations smaller than 65,000, whose accuracy the Census Bureau has found does not meet publication standards. According to the study, the Census Bureau informed HUD that beginning in 2008, it would provide researchers and planners a "research file" containing annual ACS data for areas of all sizes, including census tracts. ORC Macro recommended that if the Census Bureau does release these data, HUD consider using these "unofficial" research file results in some of its applications. The study noted, however, that if HUD decided to use these unofficial data but other agencies decided not to use them, there would be no standardization across government programs in funding allocation where the same ACS items were used. The most recent request for the Census Bureau to provide users with guidance on using multiyear averages came in the July 2003 report by the NAS Panel on Research on Future Census Methods. The panel stated that "The Census Bureau should issue a user's guide that details the statistical implications of the difference between point-in-time and moving average estimates for various uses."[Footnote 146] In the report's executive summary, the panel stated that "The Census Bureau must do significant work in informing data users and stakeholders of the features and the problems of working with moving average-based estimates."[Footnote 147] It also expressed particular concern about the use of the multiyear (moving) averages in fund allocation formulas, noting that a multiyear average: "is a smoothed estimate; by averaging a particular time period's data observation with those within a particular time window, the resulting estimate is meant to follow the general trend of the series but not be as extreme as any of the individual points. The ramifications of this basic concept emerge when moving average estimates are entered into sensitive allocation formulas or compared against strict eligibility cutoffs. A smoothed estimate may mask or smooth over an individual year drop in level of need, thus keeping the locality eligible for benefits; conversely, it may also mask individual-year spikes in activity and thus disqualify an area from benefits. It is clear that the use of smoothed estimates is neither uniformly advantageous nor disadvantageous to a locality; what is not clear is how often major discrepancies may occur in practice."[Footnote 148] On another issue with multiyear averages, the panel noted, as the Westat report had done, the issue of interpreting year-to-year changes, stating, "It is incorrect to use annual estimates based on moving averages over several years when assessing change since some of the data are from overlapping time periods and hence identical. At the least, the results will yield incorrect estimates of the variance of the estimates of change. Therefore, users should be cautioned about this aspect of the use of moving averages."[Footnote 149] External Consultation: During the past decade's development of the ACS, the Census Bureau has had many opportunities to consult with and take account of input from stakeholders and users in making key decisions on the programs.
It has (1) sponsored NAS panels, (2) held user conferences, (3) hired consultants to organize two conferences, (4) met regularly with its advisory committees and other user groups, and (5) encouraged its staff to present reports at ASA meetings and meetings of similar professional organizations. In the past several years, we and other federal agencies have reported on the ACS and provided recommendations to the Census Bureau. It established the ACS Federal Agency Information Program in 2003, responding to a recommendation we had made.[Footnote 150] It also announced last year that it was looking into establishing a partnership with the Congress and its oversight entities.[Footnote 151] Despite these opportunities, many stakeholders have observed that these consultations have not been successful. NAS noted the Census Bureau's lack of response to recommendations in last year's report on the 2010 Census. The Panel on Research on Future Census Methods offered the following comment by referring to a 1995 NAS report: "Eight years later, faced with the task of offering advice on making the vision of continuous measurement a reality in the 2010 census, the similarity between the arguments then and now is uncanny. Similar, too, are the points of concern; the current panel is hard-pressed to improve upon the basic summary of concerns outlined by our predecessors. We are, however, much more sanguine that a compelling case can be made for the ACS and that it is a viable long-form replacement in the 2010 census."[Footnote 152] The Census Bureau has neither responded to the panel's first interim report in 2000 nor indicated that the recommendations were being adopted. The Census Bureau also has not responded to recommendations and issues raised by HUD and BLS. For example, it has not responded to HUD's recommendations on the ACS adjustments to dollar-denominated items or to BLS's recommendations on the ACS labor force questions. (On the issue of dollar-denominated items, we found no indication that the Census Bureau had ever consulted outside experts about either the methodology or the implementation.) Census Bureau summaries of discussion about the ACS at its Advisory Committee meetings from October 2000 to April 2003 also indicate a lack of responsiveness.[Footnote 153] During this period, committee members raised concerns about the ACS. In particular, they made recommendations about many of the issues we have discussed in this report, including the evaluations of ACS and long-form comparisons, the new residence rules, independent controls, ICPE, group quarters, and Spanish-language questionnaires. At the April 2003 meeting, ASA committee members also requested a change in the structure of the Advisory Committee meetings, asking the Census Bureau to spend less time on update sessions and more time on sessions devoted to gathering more detailed input, commentary, and recommendations on topics the Census Bureau needs and wants advice on. Although the Census Bureau has addressed issues related to ICPE and Spanish-language questionnaires, the meeting summaries do not report that it followed recommendations in other areas. [End of section] Appendix VI: Comments from the Department of Commerce: THE SECRETARY OF COMMERCE: Washington, D.C. 20230: August 2, 2004: Mr. Robert P. Parker: Chief Statistician: U.S. Government Accountability Office: 441 G Street, NW: Washington, DC 20548: Dear Mr. Parker: The U.S. Department of Commerce appreciates the opportunity to comment on the U.S.
Government Accountability Office draft report entitled American Community Survey: Key Unresolved Issues (GAO-04-42). The Department's comments on this report are enclosed. Sincerely, Signed by: Donald L. Evans: Enclosure: Comments from the U.S. Department of Commerce Regarding the U.S. Government Accountability Office (GAO) Draft Report Entitled American Community Survey: Key Unresolved Issues: GAO's draft report identifies five outstanding issues that are believed to jeopardize the American Community Survey (ACS) replacement of the long form. Our comments are framed around these five issues. Issue 1: A Methodology for Independent Controls for Population and Housing Characteristics is Lacking. A methodology is in place to use independent housing unit and population estimates in the weighting process to produce ACS population and housing characteristics. These methods have been used during the extensive testing of the ACS. The Census Bureau agrees with the GAO recommendation that research should be conducted to assess whether there is a need for changes to these independent estimates or other aspects of the weighting process to deal with the issue of current residence. The GAO asserts that this must take place in the next year, with a decision made in time for producing 2005 estimates. The Census Bureau intends to conduct research over the next few years, with a decision in place for weighting and estimation of the 2008 estimates. It is important to note that the Census Bureau has not concluded that methodological changes are needed nor that the changes that might be considered would require revisions in the Intercensal Population Estimates (ICPE). Research may determine that the ICPE, which are currently used as population controls in the ACS, should be revised or that additional information from other sources (e.g., seasonal housing questions from the ACS or administrative records) be used in combination with the existing ICPE to deal with this current residence issue. Research may also conclude that no general changes are warranted but that methods need to be developed for specific areas that exhibit effects of the difference between "current" and "usual" residence. Research is not currently planned to develop population estimates (ICPE estimates) below the county level. As mentioned above, research may result in the development of alternative population controls below the county level, using the seasonal housing questions from the ACS or administrative records. The ICPE estimates, which refer to a July 1 point in time, are considered roughly equivalent to an "average population for the year." For this reason, the Census Bureau does not plan to conduct research on adjusting the July 1 centered population estimates for weighting of the ACS annual averages to address what the report refers to as the "reference date." Issue 2: Operational Issues Have Not Been Addressed. The report identifies a limited number of operational concerns. The major operational issue raised throughout the report deals with the process for revising the specific questions included in the ACS. The report states that questionnaire changes, identified in cognitive testing in 2001, have not been incorporated. The Census Bureau has developed a plan to delay all questionnaire changes until 2008. This will allow all changes to be consolidated and to minimize operational complexities in producing new materials, as well as technical complexities in the production of annual estimates.
Statements in this report about the 2010 decennial census schedules, and their lack of fit with the ACS, require clarification. In some parts of the report, the GAO correctly notes that schedules have been developed to facilitate having final content in place for the 2008 ACS. In other sections, the report states that the schedules are in conflict. Issue 3: Plans for Valuation and Presentation of Dollar-Denominated Data Items Are Questionable. The report raises concern about the inflation-adjustment methods used in the ACS. The ACS uses the Consumer Price Index (CPI), a generally accepted measure of inflation. The Census Bureau acknowledges that the inflation-adjustment issue deserves additional research and will investigate alternatives if funding permits. We note, however, that geographic differences in price levels are not accounted for by most federal programs and guidelines; for example, neither the official poverty thresholds nor the federal tax exemption for dependents uses such a distinction. Should the methodology for incorporating inflation be changed, the Census Bureau could recompute past data values if needed. In addition to standard data products, we expect to provide some additional information that could include unadjusted measures. Issue 4: Evaluations of Comparisons Are Incomplete and Users Lack Information on ACS Time-Series Consistency. The GAO states that the comparisons of Census 2000 and ACS results are limited and reflect major scaling back from original plans. The report states that we have, ". . . missed the opportunity to test (1) distortion and stability in multiyear averages, (2) differences between multiple estimates for the same geographic areas, and (3) the use of annual ACS data for small geographic areas." The Census Bureau has just completed a series of comparison reports and is planning additional research to assess the reliability and stability of ACS estimates over time. Evaluations of single-year versus multiyear estimates for the same areas are also planned. Issue 5: Users Are Not Informed on Key Properties of Multiyear Averages. The Census Bureau agrees with the need to develop such user guidance, and efforts have begun to educate users on the interpretation of multiyear averages. Although additional efforts are expected, the Census Bureau is currently working through the National Academy of Sciences (NAS) panel, the American Statistical Association Professional Advisory Committee, and the Federal Agency Information Program to more completely identify user needs and best practices. It is our intention to develop appropriate guidelines and tools. We do not see this as a process that needs to be in place in the next year; rather, it is required by the time the first set of multiyear averages will be available in 2008. Conclusion/Recommendation: Many of the points raised in the report request additional information on the timing of ACS research and development, specifically related to independent housing unit and population estimates, weighting methods, comparison studies, and questionnaire changes. The Census Bureau agrees that more detailed documentation is needed about research and development plans, which includes schedules for when the ACS intends to introduce revised methodologies into production. Such clarification might allow the GAO and other users a clearer picture of our intentions. The first recommendation is that the Census Bureau revise the ACS evaluation and testing plan to focus on the issues in this report.
The current ACS testing and evaluation plan includes all of these issues, along with additional ones. The Census Bureau continues to review evaluation priorities and will share this information with the GAO. The second recommendation is to provide key stakeholders, such as the NAS, with meaningful and timely input on decisions relating to these issues. The Census Bureau solicits input from many stakeholders, including the three panels that the Census Bureau has asked the NAS to convene, the Census Bureau's advisory committees, and federal agencies. We plan to continue to make use of these lines of communication. The final recommendation in this report is a request that the Census Bureau make public the information underlying the Census Bureau's decisions on these issues when it makes the decisions. The Census Bureau will produce documentation for key operational and technical decisions and research papers that detail the reasons for decisions on issues such as these. Additional Specific Comments: The Census Bureau also has the following 62 specific comments about statements in the draft report. 1. p. 1, para. 2 "...mailed to a sample of 15 million households." The Census 2000 sample data were compiled from a sample of approximately 19 million housing units that received the census long form questionnaire. 2. p. 2, para. 1 "...populations of 20,000 to 65,000 by cumulating ACS results for 1 year..." Clarify this statement to be either "less than 20,000 by cumulating...5 years" or "greater than 65,000 by cumulating...1 year" 3. p. 2, para. 2 "3-year averages for geographic areas with populations between 20,000 and 65,000 will begin..." Throughout this report, the data product plan is misstated. In general, it should state that single-year estimates will be produced for areas with 65,000 or more population; 3-year estimates will be produced for areas with 20,000 or more population; and 5-year estimates will be produced for all areas down to the block group/tract level. 4. p. 2, para. 2 "The 5-year averages for 2008-2012 to be published in 2013 will replace...and closely reflect the population and housing characteristics data from the 2010 Decennial Census short form." The degree to which the 2008-2012 estimates reflect the 2010 estimates is dependent on several issues, including the impact of residence rules differences, weighting research, and annexations between 2010 and 2012. 5. p. 2, para. 2 "...annual estimates for large geographic areas and estimates for smaller areas in terms of 3-year or 5-year averages." Replace "annual" with "single-year." Note that annual estimates will be produced for smaller areas too--not only for large areas. 6. p. 3, para. 3 and para. 4: Several times on this page and later pages statements are made about the ACS residence concepts that apply only to the ACS population controls. The ACS is not introducing a new concept of residence, nor does it plan to adopt a concept of "current residence" that is new. The ACS has employed a current residence concept in data collection since its inception. This includes having a full set of rules to determine current residence. The work that has not, as yet, been done is the work to assess the need and methods, if required, to adjust the existing usual residence-based population estimates to current residence-based estimates. This is research that has been identified but has not been completed.
In addition, other research is being planned on how the housing unit and population controls used in the ACS weighting process might be modified to address the issue of different residence rules between the ACS (current residence) and the intercensal population estimates program (usual residence). 7. p. 4, para. 1 "...does not plan to incorporate improved questions until 2010." The cited research was undertaken to revise the "multiple residence questions" that are needed for the research on current residence-based controls. Based on this research, changes were made to the 2003 ACS questionnaire with the new questions added. In the course of that research, additional changes to other parts of the questionnaire were identified. To preserve content consistency, the ACS wants to minimize the frequency of questionnaire wording changes. The Census Bureau plans to incorporate all such changes in the 2008 questionnaire and to limit content changes for at least five years after that. The suggestions about the residence questions, along with other content changes (such as those needed for disability), will be integrated into the 2008 forms and procedures. The "2010" timing is therefore incorrect. 8. p. 4, para. 2 "The Census Bureau has not developed a methodology for using ICPE...to derive controls consistent with the ACS residence concept and ACS reference period or at the same level of geography used for the 2000 Census long form." It is correct to say that methods have not, as yet, been developed for deriving controls consistent with the ACS residence concept. Census staff have identified this inconsistency as a weighting issue and plan to conduct required research to assess the extent of the issue and, if necessary, to develop approaches to address it. Multiple options could be considered to deal with this issue. The Census Bureau has not concluded that the existing population estimates must be changed to deal with seasonal populations in the ACS. Related to the residence concept is the "reference period." The ICPE estimates refer to a July 1 point in time and are roughly equivalent to an "average population for the year," which is considered to be the ACS reference period. For this reason, the Census Bureau does not see the existing population controls as being inconsistent. The third dimension noted, level of geography, is recognized by the Census Bureau as inconsistent with the long-form estimation methods. However, there is currently no methodology to develop estimates with age, sex, race, and Hispanic-origin characteristics for geographies lower than county level nor methodologies to develop estimates of total population for geographies lower than functioning governmental units. The Census Bureau plans to investigate alternative sources that might make it possible to provide some subcounty-level controls. 9. p. 4, para. 3 "Before data for 2005...a methodology is needed to provide controls..." The Census Bureau does not agree that revisions to the controls are needed for 2005. The Census Bureau has concluded that a decision on the methodology and the associated research and development activities should be completed prior to production of 2008 data. 10. p. 4, para. 4 "Census Bureau Officials told us that they agreed with the need for such guidance but had no plans for its contents." The Census Bureau has begun efforts to produce additional information for users on multiyear averages and to better understand how they (federal agencies) currently apply census sample data.
Planning efforts began over a year ago with the first Federal Agency Information meeting. The report is correct that there is no formal plan or schedule, but incorrect that there are no plans for its content. A contract is in place with the NAS to convene a panel of experts to provide input and recommendations into the content of information that should be provided to ACS users transitioning from the long form to the ACS. 11. p. 4, para. 5 "...latest schedule for the 2010 Decennial Census does not provide adequate time for the Census Bureau to incorporate into the full ACS program changes necessary for the ACS data for 2008-2012 to be reliable enough..." It isn't clear what 2010 activities the report is referring to and how they impact the ACS. Schedules have been defined to ensure that 2010 content is defined no later than January 2007 to allow those content changes to be reflected in the 2008 ACS. If the report is referring to activities outside of the 2010 process, such as the development of revised population controls, it should state that. The report should also clarify what is meant by "reliable enough." 12. p. 5, para. 3 "...ACS Supplementary Survey..." The Supplementary Surveys have always been called the Census Supplementary Surveys, not the ACS Supplementary Surveys. 13. p. 5, para. 3 "...will incorporate population and housing characteristics data from the 2010 Decennial Census short form." Clarify that this statement refers only to using 2010 results in the form of population controls. As written, the statement suggests that actual data from the 2010 short form will be incorporated in the ACS. Also, see Item 4. 14. p. 6, para. 3 "...decennial sample of about 15 million households" See Item 1. 15. p. 6, para. 4: The report refers several times to the residence rules being used for "seasonal residents." The residence rules are applied to all people, and although many of the rules impact seasonal residents, they also impact people who might spend time in a Group Quarter and people with multiple residences that aren't seasonal. 16. p. 7, para. 2 "...change the concept to current residence." Same comment as earlier about explanation of residence rules in place for the ACS. It is incorrect to say that the ACS is changing the concept to current residence. The ACS has been collecting the data using current residence rules since its inception in 1996. Using the example in this report, people who live in both Florida and New Hampshire for parts of the year (assuming more than two months in each place) could be interviewed in both states in the ACS, unlike the census that would have to pick one--New Hampshire. 17. p. 7, para. 3 Reference Period: Again, clarify that this explanation applies only to the controls. The reference periods for data collection are driven by several factors; the controls are only one of them. Reference period includes period of data collection and the reference requested for reporting (e.g., last week, last three months). 18. p. 7, para. 4 "The ACS will use population characteristics...not from the results collected in the survey." Clarify that the report is referring to controlling the final data. The ACS will certainly be using the population and housing characteristics collected in the ACS in the final data products. 19. p. 8, para. 3 "...nonresponse follow-up will be conducted for a sample of one-third of all nonrespondents." The one-third subsample is not technically correct for full implementation.
In full implementation, nonrespondents can be sampled at different rates, such as 1-in-2, 2-in-5, or 1-in-3 rates. Addresses that were unmailable are sampled at 2-in-3. 20. p. 10, para. 2 "...new ACS data and corresponding data from the 2000 Census long-form and 2004 supplementary survey..." We would appreciate clarification of this recommendation. What is meant by "new ACS data?" What specific comparisons are being suggested? 21. p. 10, para. 3 "...ACS implementation schedule is not synchronized..." Same as comment regarding page 4 of the draft report about the schedule. See comment 11 above. 22. p. 12, para. 2 "...no plans to maintain time-series consistency..." and "...not planning such revisions..." The ACS has plans to make appropriate changes to the population controls when necessary, including the possibility of reweighting the data around the 2010 time period and for all multiyear estimates. 23. p. 14, para. 1 "...would be addressed before implementation of the full ACS." Questionnaire design/wording changes will be implemented in 2008. Some specific testing is underway (e.g., disability) in conjunction with the ACS OMB Content Group. 24. p. 14, para. 2 "...no plans to provide for external consultations..." This statement is incorrect. All question design changes are currently being worked through the ACS OMB Content Group to ensure that all federal agencies are aware of and, if they choose, can participate in the process for making revisions. The Census Bureau has no plans to make content changes to the questionnaire without such changes working through this group. Plans for testing race and Hispanic origin questions have been shared with multiple stakeholders, including advisory committees and federal agencies. 25. p. 14, para. 3 "...current time schedule does not call for resolving issues such as questionnaire design before 2008." The Census Bureau has a set of schedules for content changes that require that all changes be identified/finalized no later than January 2007 in order that those changes be reflected in all questionnaires, translations, and related materials. A 2010 decision memorandum documents that requirement for all items common to 2010 and the ACS. 26. p. 14, para. 5 "do not include time for consulting with stakeholders and users..." Consultation and updates are provided regularly at meetings of the census advisory committees, State Data Centers, Census Information Centers, and the ACS OMB Content Group. Additional communication occurs through the Federal Agency Information meetings. Also, see response to Item 10. 27. p. 15, para. 2 "...monthly values collected in the ACS were adjusted to their values for December of the data collection year, using the CPI." The basis for dollar adjustments in the ACS would more accurately be described as the difference between the average CPI over a household's 12-month reference period and the calendar year of the interview. See the "Adjusting Dollar Amounts" section at the URL below for a more detailed description of the method. 28. p. 17, para. 1 "...no documentation explaining the rationale for the adjustment..." and "...stakeholders and users had not been consulted..." There is an OMB Directive to use the CPI to update the official poverty thresholds every year. While no formal rationale is documented, users recognize the need for such an adjustment when trends are being produced and compared, and the CPI is the generally accepted method.
Census Bureau staff had talked extensively with economic data users about ACS income data since 1996, bringing up this issue. The Census Bureau also discussed the appropriate use of the CPI with the Census Advisory Committee of Professional Associations. They endorsed its use to inflate historical income data to improve year-to-year comparability. 29. p. 17, para. 1 "...this type of adjustment is not used for the SIPP..." SIPP reports that look at economic trends, such as the recent income dynamics report from the SIPP (P70-95), use CPI adjustments in order to make consistent comparisons over time. Also, SIPP wealth reports, like P70-88, use CPI adjustments in discussing whether median wealth has increased or declined over time in real (after inflation) terms. These uses are consistent with the way that the CPS and the ACS use the CPI when assessing if incomes have risen/fallen over time in real terms. 30. p. 18, para. 1 "...scope has been reduced in terms of levels, data items, and time period." It isn't clear what this statement is based on. The scope of these reports was defined over time based on consultation with advisory committees and users. The final choice of having the single-year reports focus on national and selected subnational (county) results was at the suggestion of the Census Advisory Committee of Professional Associations. The specific items in scope were defined to parallel the data in the four decennial census profile reports. Those were included in the final reports. The 3-year comparison reports included county and tract-level data for all test sites with the exception of Houston, TX. Means and medians were included in each of these reports. It is unclear what is meant by "time period." 31. p. 18, para. 2 "...no plans to evaluate the comparisons for 27 of the 31 test sites." All 31 sites were included in the two 3-year averages reports issued by the Census Bureau in June 2004. 32. p. 18, para. 2 "...4 test sites to be studied will not cover subcounty local government units." Although some proposed comparisons of subcounty data (i.e., MCD data in Vilas and Oneida, WI) could not be analyzed, the four test site reports that were issued in June 2004 studied a large amount of subcounty data including census tract data and neighborhood data in the case of the Bronx. 33. p. 18, footnote "...delay in completing the planned evaluation studies resulted from...2000 Census Accuracy, Coverage, and Evaluation program..." Staff working on the A.C.E. evaluations were never scheduled to participate in any of the ACS comparison studies. The delays in these comparison studies were solely driven by the need to test the impact of using voluntary methods in the ACS. 34. p. 19, para. 2 "...delays in completing the evaluations...likely to affect the use of the ACS in improving the small geographic area estimates of unemployment and poverty." The delays in the comparison studies have not affected [MISSING WORD] of work on using ACS data to improve the small area poverty estimates program (SAIPE). Work has already begun on developing models that utilize ACS data. 35. p. 20, para. 2 "...missed the opportunity to test..." The Census Bureau plans to conduct research into some of these areas and does not see that the window of opportunity for such testing is now closed. Several of these types of evaluations warrant the use of data that are only recently available. 36. p. 20, para. 3 "...no indication that the Census Bureau will be following this advice..."
"—not yet followed similar advice from us, other government agencies, or its own staff." The various parts of this report are inconsistent with this statement. Appendix 11 and other similar records clearly show that advice given to the Census Bureau from NAS, Census Advisory Committees, and federal agencies has in many ways defined Census Bureau priorities and methods for testing. This includes the NAS request for information on quality comparisons, which were documented in 2002 in the report series, in several American Statistical Association papers in 2003, and in additional report series reports in 2004. The NAS also requested that data files be made available for users to conduct their own comparisons. This request required considerable resources but was given priority and data comparing Census 2000 and test site, 3-year averages were recently released on the ACS Web site. There are many additional examples that could be cited. The statement would be correct if it said that not all recommendations have been followed. 37. p. 20, para. 4 "...no plans to seek advice..." "—will receive little of no input..." The ACS has plans in place to solicit input from' users and stakeholders, including the NAS through a variety of means including the recently initiated NAS panel, the ACS OMB Content Group, the Federal Agency Information meetings, and the regular Census Advisory Committee meetings. 38. p, 22, para. 1 "...2004 and 2006 test plans for the 2010 Decennial census...until 2005. The point of this final phrase is unclear, The relationship between the ACS and these tests is limited to certain content testing, which is scheduled to coincide with ACS schedules for final content definition in January of 2007. 39. p. 22, Table 1: Numerous references to unresolved residence rules should be clarified. 40, p, 22, Table 1 (Q4, 2006): The level of geographic detail for the 2006 release or 2065 data is not an unresolved issue. 41, p. 23, Table l "...changes to operational procedures, such as sampling rate far nonresponse followup" The proposed change to the personal visit followup subsampling rate will be implemented in 2005. The required research was complete 3 early this year and the results are currently being documented. There are no plans to revise this plan unless minor refinements are required based on observation in 2005. 42. p. 24, para, 1 "...Census Bureau has announced that comparisons...will he limited." The Census Bureau does not characterize the comparison reports as limited. The Census Bureau released a complete series of comparison studies this spring-six detailed reports conducted by staff at the Census Bureau, another four reports conducted by experts outside of the Census Bureau.' As these reports are reviewed and discussed, additional research efforts maybe identified. 43, p. 25, para. 1 "...the ACS implementation plan and the 2010 Decennial Census test programs are not synchronized..." The plans are synchronized by design. 44. p. 25, para. 1 "Without prompt resolution of issues such as those relating to the calculation of independent controls...the ACS will not he an adequate replacement for the long farm..." The Census Bureau does not agree that these issues must be immediately resolved, nor that they are critical to ensuring that the ACS can replace the long form. The fourth report is scheduled for release the last week of July: 45, p. 43, para. 
1 "They reported that the Census Bureau was planning to conduct research using data from the 1996 test sites to produce controls at the census tract and block group level" The paper states that, "The Bureau's demographic estimates staff are researching ways to snake more detailed estimates for the 1996 CM test areas." This refers to intercensal population estimates for counties broken down by-sex, age, race and Hispanic origin. Up to that time only the total population for counties was produced. There was no suggestion of producing estimates for sub-county areas. 46. p. 52, para. 1 "...had not been released when we prepared our final plan of this report." The release dates for the various comparison reports are as follows: Comparing Basic Demographic and Housing Characteristics With Census 2000 - May 2004: Comparing Economic Characteristics With Census 2000 - May 2004: Comparing Social Characteristics With Census 2000 - June 2004: Comparing Housing Characteristics With Census 2000 - expected in July 2004: Comparisons of the ACS 3-Year Average and the Census 2000 Sample for a Sample of Counties and Tracts - June 2004: Comparing Quality Measures: The American Community Survey's 3-Year Averages and Census 2000's Long-Form Sample Estimates - June 2004: 47. p. 53, para. 1 "...to control the estimates to an independent source, it would be necessary to achieve agreement between the ACS estimates of occupied housing units, households, and householders at all geographic levels." The desire to use a single weight to make these three estimates equal is not a prerequisite to controlling estimates to an independent source. 48. p. 53, para. 3 "...data on group quarters were collected at the test sites beginning with 1999..." Group quarters data were collected in the test sites in 1999 and 2001 only. 49. p. 57, para. 1 "...and housing characteristics (occupied and vacant units) for the full ACS..." The ACS uses total housing units in the population controls, not information on occupied and vacant units. 50. p. 57, para, 1 "Controls will he needed at the same level of geographic area detail as those that were used far the 2000 Census long form and will need,." The same level of geographic area controls are desirable but not necessary. The quality of controls is very important for their use in the ACS. If quality estimates at lower levels of geography are available, the ACS will consider their use. A similar set of statements also appear on page 60 (paragraph 2) and page 61 (paragraph 1). 51. p. 58, Table 3: The table should be clarified to note that the residence concepts that are being discussed are not the residence rules for data collection but the residence basis for the population controls. The term, "reference period" should be clarified. In addition, the current plans do not envision having revised control methodologies in place for weighting of the 2005 data. 52. p. 59, Table 3 (2005 - 09d): Weighting area options in this table should be revised. There are no plans to control to the same areas as the Census 2000 long form or to use detail from the Census 2000. 53. p. 60, para. 1 "Using ICPE for the ACS supplementary surveys, the Census Bureau prepared controls for places with a population of 250,000 or more." This statement is incorrect. The weighting area (as indicated in Table 1) was county or county combinations. 54. p. 61, para. 1 "...to provide ACS-defined controls for all places..." This statement misrepresents the weighting process. 
Controls are only used for counties or groups of counties, not for other places. 55. p. 61, para. 1 "For the 2008-2012 estimates, a methodology for controls down to the block level will be needed" The Census Bureau does not agree with this statement and has not stated a requirement for block-level controls. 56. p. 67, para. 3 "...this team will focus on group quarters in the context of the 2010 census short form." This group has a charter to support the group quarters needs of both the ACS and the 2010 decennial census. 57. p. 68, para. 1 "...current time schedule does not allow for issues such as questionnaire design to be resolved until 2010" Same comments as earlier. See Item 11. 58. p. 73, para. 1 "...first report focused on evaluations of differences in the ACS and census short form data items." The first report focused on the operational feasibility of conducting the ACS. It did not include evaluations of differences in any data items. 59. p. 73, para. 2 and para. 3 "...will not be issued until mid-2004." "...has yet to publish comparisons..." See Item 46 above for release schedule for comparison study reports. 60. p. 75, para. 4 "The author was unable to determine which estimate was more reliable..." The author did make such a determination, concluding that problems during Census 2000 nonresponse follow-up led to the higher than expected disability rates in Census 2000 relative to the ACS. 61. p. 83, para. 3 "...indicate a lack of responsiveness." The Census Bureau has responded to all Census Advisory Committee recommendations and has provided all briefings that they have requested. It is true that not all recommendations have been followed and also true that within the committees, recommendations are not always consistent. When the Census Bureau decides not to pursue a recommended action, explanations are provided to the committees. 62. GAO highlights (cover page) characterize the ACS as a mail survey. It more accurately is a multimode survey using mail, telephone, and personal visit modes of data collection. Also, 3-year averages will be produced for areas of 20,000 or more, not areas of 20,000 to 65,000. [End of section] Appendix VII: GAO Contacts and Staff Acknowledgments: GAO Contacts: Robert P. Parker (202) 512-9750, parkerr@gao.gov. Christopher Moriarity (202) 512-5420, moriarityc@gao.gov: Staff Acknowledgments: Additional staff who made major contributions to this report were Heather Von Behren, Penny Pickett, Mitchell Karpman, Michael Volpe, Andrea Levine, Patricia Dalton, and Robert Goldenkoff. [End of section] Bibliography: The first section in this bibliography lists documents on the history of the long form and mid-decade census. The remaining works are divided between numerous types of Census Bureau reports and papers, Association of Public Data Users papers, congressional hearings and testimony, and other reports and papers. Recent reports from the National Academy of Sciences are discussed in appendix II. Related GAO Products are listed in a separate section at the end of this report. History of the Long Form and Mid-Decade Census: Alexander, Charles H. "Still Rolling: Leslie Kish's 'Rolling Samples' and the American Community Survey." In Proceedings of Statistics Canada Symposium 2001: October 16-19. Ottawa: Statistics Canada, 2002. Anderson, Margo J., ed. Encyclopedia of the U.S. Census. Washington, D.C.: CQ Press, 2000. House of Representatives, Committee on Post Office and Civil Service, Subcommittee on Census and Population.
Review of Major Alternatives for the Census in the Year 2000. Serial 102-25. Washington, D.C.: August 1, 1991. House of Representatives, Committee on Post Office and Civil Service. Census Confidentiality/Mid-Decade Sample Survey Bill. Report 93-246. Washington, D.C.: June 4, 1973. House of Representatives, Committee on Post Office and Civil Service. Mid-Decade Censuses of Population, Unemployment, and Housing. Report 780. Washington, D.C.: August 12, 1965. Salvo, Joseph, and Arun Peter Lobo. The American Community Survey: Quality of Response by Mode of Data Collection in the Bronx Test Site. Presented at 2002 Joint Statistical Meetings, New York City, August 14, 2002. Census Bureau ACS Reports: American Community Survey Operations Plan, Release 1. Washington, D.C.: March 2003. Meeting 21st Century Demographic Data Needs: Implementing the American Community Survey. Report 1. Demonstrating Operational Feasibility. Washington, D.C.: July 2001. Meeting 21st Century Demographic Data Needs: Implementing the American Community Survey. Report 2. Demonstrating Survey Quality. Washington, D.C.: May 2002. Meeting 21st Century Demographic Data Needs: Implementing the American Community Survey. Report 3. Testing the Use of Voluntary Methods. Washington, D.C.: December 2003. Meeting 21st Century Demographic Data Needs: Implementing the American Community Survey. Report 4. Comparing General Demographic and Housing Characteristics With Census 2000. Washington, D.C.: May 2004. Meeting 21st Century Demographic Data Needs: Implementing the American Community Survey. Report 5. Comparing Economic Characteristics With Census 2000. Washington, D.C.: May 2004. Meeting 21st Century Demographic Data Needs: Implementing the American Community Survey. Report 6. The 2001-2002 Operational Feasibility Report of the American Community Survey. Washington, D.C.: May 2004. Meeting 21st Century Demographic Data Needs: Implementing the American Community Survey. Report 7. Comparing Quality Measures: The American Community Survey's Three-Year Averages and Census 2000's Long Form Sample Estimates. Washington, D.C.: June 2004. Meeting 21st Century Demographic Data Needs: Implementing the American Community Survey. Report 8. Comparison of the American Community Survey Three-Year Averages and the Census Sample for a Sample of Counties and Tracts. Washington, D.C.: June 2004. Meeting 21st Century Demographic Data Needs: Implementing the American Community Survey. Report 9. Comparing Social Characteristics With Census 2000. Washington, D.C.: June 2004. Meeting 21st Century Demographic Data Needs: Implementing the American Community Survey. Report 10. Comparing Housing Characteristics With Census 2000. Washington, D.C.: July 2004. Census Bureau Advisory Committee Presentations: The presentations in this section were made by the Census Bureau's Decennial Census Advisory Committee, Professional Association Advisory committees, and Race and Ethnic Advisory Committees. The ACS: Data Products to Meet User Needs. Race and Ethnic Advisory Committee meeting, Washington, D.C., March 14, 2001. Alexander, Charles, Alfredo Navarro, and Deborah Griffin. Update on ACS Evaluations. Decennial Census Advisory Committee meeting, Washington, D.C., November 5, 2001. Gordon, Nancy. The American Community Survey. Joint Meeting of the Census Bureau Advisory Committees, Washington, D.C., July 28, 2000. Gordon, Nancy. The American Community Survey. Decennial Census Advisory Committee meeting, Washington, D.C., September 21-22, 2000. Gordon, Nancy. 
The American Community Survey. Race and Ethnic Advisory Committees meeting, Washington, D.C., November 2, 2000. Gordon, Nancy. American Community Survey Update. Decennial Census Advisory Committee meeting, Washington, D.C., May 2, 2002. Griffin, Deborah. An Overview of the Research and Evaluation Program for the American Community Survey. Decennial Census Advisory Committee meeting, Alexandria, Virginia, October 2-4, 2002. Griffin, Deborah H. Comparing Characteristics from the American Community Survey and Census 2000: Methodology. Census Advisory Committee of Professional Associations meeting, Washington, D.C., April 10-11, 2003. Navarro, Alfredo. American Community Survey: Use of Population Estimates as Controls in the ACS Weighting. Census Advisory Committee of Professional Associations meeting, Washington, D.C., October 23, 2003. Navarro, Alfredo. A Discussion of the Quality of Estimates from the American Community Survey for Small Population Groups. Census Advisory Committee of Professional Associations meeting, Washington, D.C., October 2-3, 2002. Weidman, Lynn, and Signe Wetrogan. Enhancing the Intercensal Population Estimates Program with ACS Data: Summary of Research Projects. Census Advisory Committee of Professional Associations meeting, Washington, D.C., October 23, 2003. Census Bureau Continuous Measurement Series: The memorandums listed here, from the 20 in the Continuous Measurement series, are those most directly related to topics we review in this report. Alexander, Charles H. A Continuous Measurement Alternative for the U.S. Census. CM-10, October 28, 1993. CM-11 summarized this paper at the 1993 annual meeting of the American Statistical Association, San Francisco, California, August 10, 1993. Alexander, Charles H. Further Exploration of Issues Raised at the CNSTAT Requirements Panel Meeting. CM-13. Internal Census Bureau memorandum, Washington, D.C., January 31, 1994. Alexander, Charles H. A Prototype Continuous Measurement System for the U.S. Census of Population and Housing. CM-17. Presented at the annual meeting of the Population Association of America, Miami, Florida, May 5, 1994. Alexander, Charles H. Some Ideas for Integrating the Continuous Measurement System into the Nation's System of Household Surveys. CM-19A. Internal Census Bureau memorandum, Washington, D.C., January 6, 1995. Census Bureau Internal Reports: 2004 Census Test Operational Plan. Washington, D.C.: September 29, 2003. 2010 Census Decision Memorandum Series No. 5, Finalizing Content for the 100 Percent Items in the 2010 Census and the American Community Survey. Washington, D.C.: June 3, 2004. 2010 Census Planning Memorandum Series No. 24, Action Plan: 2010 Research and Development Planning Group on Race and Ethnic Data Collection, Tabulation, and Editing. Washington, D.C.: June 9, 2004. 2010 Census Planning Memorandum Series No. 26, Action Plan: 2010 Research and Development Planning Group on Special Places/Group Quarters Development and Testing. Washington, D.C.: March 8, 2004. ACS-2010 Consistency Review Plan. Washington, D.C.: October 1, 2003. American Community Survey Development Report Series Program Plan. Washington, D.C.: rev. June 12, 2002. Census 2000 Evaluation Reports: Abramson, Florence. Special Place/Group Quarters Enumeration. Census 2000 Testing, Experimentation, and Evaluation Program, Topic Report No. 5. U.S. Census Bureau, Washington, D.C., February 2004. Adlakha, Arjun, J. Gregory Robinson, Kirsten West, and Antonio Bruce. 
Assessment of Consistency of Census Data with Demographic Benchmarks at the Subnational Level. Census 2000 Evaluation O.20. U.S. Census Bureau, Washington, D.C., August 18, 2003. Clarke, Sandra, John Iceland, Thomas Palumbo, Kirby Posey, and Mai Weismantle. Comparing Employment, Income, and Poverty: Census 2000 and the Current Population Survey. Census 2000 Auxiliary Evaluation. U.S. Census Bureau, Washington, D.C., September 2003. Palumbo, Thomas and Paul Siegel. Accuracy of Data for Employment Status as Measured by the CPS-Census 2000 Match. Census 2000 Evaluation B.7. U.S. Census Bureau, Washington, D.C., May 4, 2004. Schneider, Paula. Content and Data Quality in Census 2000. Census 2000 Testing, Experimentation, and Evaluation Program, Topic Report No. 12. U.S. Census Bureau, Washington, D.C., January 22, 2004. Census Bureau 2003 JSM Staff Papers: Bureau staff presented many ACS-related papers at the August 2003 Joint Statistical Meetings in San Francisco, California. We reviewed the papers in this section in detail because they were related to comparisons between ACS estimates and 2000 Census results. Boggess, Scott, and Nikki L. Graf. Measuring Education: A Comparison of the Decennial Census and the American Community Survey. Presented at Joint Statistical Meetings, San Francisco, California, August 7, 2003. Dye, Jane Lawler. Grandparents Living with and Providing Care for Grandchildren: A Comparison of Data from Census 2000 and 2000 American Community Survey. Presented at Joint Statistical Meetings, San Francisco, California, August 7, 2003. Love, Susan, and Deborah Griffin. A Closer Look at the Quality of Small Area Estimates from the American Community Survey. Presented at Joint Statistical Meetings, San Francisco, California, August 4, 2003. Posey, Kirby G., Edward Welniak, and Charles Nelson. Income in the American Community Survey: Comparisons to Census 2000. Presented at Joint Statistical Meetings, San Francisco, California, August 7, 2003. Raglin, David A., Theresa F. Leslie, and Deborah H. Griffin. Comparing Social Characteristics between Census 2000 and the American Community Survey. Presented at Joint Statistical Meetings, San Francisco, California, August 3, 2003. Stern, Sharon M. Counting People with Disabilities: How Survey Methodology Influences Estimates in Census 2000 and the Census 2000 Supplementary Survey. Presented at Joint Statistical Meetings, San Francisco, California, August 7, 2003. Other Census Bureau Staff Research Papers: Alexander, Charles H. American Community Survey Data for Economic Analysis (October 2001). Presented at the Census Advisory Committee of the American Economic Association meeting, Washington, D.C., October 18-19, 2001. Alexander, Charles H. Recent Developments in the American Community Survey. Presented at the 1998 Joint Statistical Meetings, Dallas, Texas, August 12, 1998. Alexander, Charles H., Sharon Brown, and Hugh Knox. American Community Survey Data for Economic Analysis (December 2001). Presented at the Federal Economic Statistics Advisory Committee meeting, Washington, D.C., December 14, 2001. Alexander, Charles H., Scot Dahl, and Lynn Weidman. Making Estimates from the American Community Survey. Presented at the 1997 Joint Statistical Meetings, Anaheim, California, August 13, 1997. Alexander, Charles H., and Signe Wetrogan. Integrating the American Community Survey and the Intercensal Demographic Estimates Program. Presented at the 2000 Joint Statistical Meetings, Indianapolis, Indiana, August 14, 2000. 
Butani, Shail, Charles Alexander, and James Esposito. Using the American Community Survey to Enhance the Current Population Survey: Opportunities and Issues. Presented at the 1999 Federal Committee on Statistical Methodology Research Conference, Arlington, Virginia, November 15-17, 1999. Davis, Mary Ellen, and Charles H. Alexander, Jr. The American Community Survey: The Census Bureau's Plan to Provide Timely 21st Century Data. Missouri Library World, Spring 1997. DeMaio, Theresa J., and Kristen A. Hughes. Report of Cognitive Research on the Residence Rules and Seasonality Questions on the American Community Survey. U.S. Bureau of the Census, Statistical Research Division, Washington, D.C., July 2003. Love, Susan, Donald Dalzell, and Charles Alexander. Constructing a Major Survey: Operational Plans and Issues for Continuous Measurement. Presented at the 1995 Joint Statistical Meetings, Orlando, Florida, August 16, 1995. Nelson, Charles, and Kathleen Short. The Distributional Implications of Geographic Adjustment of Poverty Thresholds. U.S. Bureau of the Census, Housing and Household Economics Statistics Division, Washington, D.C., December 2003. Posey, Kirby G., and Edward Welniak. Income in the ACS: Comparisons to the 1990 Census. Presented at the American Community Survey Symposium, Suitland, Maryland, March 1998. Salvo, Joseph, and Arun Peter Lobo. The American Community Survey: Quality of Response by Mode of Data Collection in the Bronx Test Site. Presented at the 2002 Joint Statistical Meetings, New York, August 14, 2002. Smith, Amy Symens. The American Community Survey and Intercensal Population Estimates: Where Are the Crossroads? Technical Working Paper 31, U.S. Census Bureau, Population Division, Washington, D.C., December 1998. Association of Public Data Users Papers: Davis, Mary Ellen. The American Community Survey Data Products, Alexandria, Virginia, October 20, 2003. Gage, Linda, State of California, Department of Finance. American Community Survey: Research by the Data User Community. Alexandria, Va.: October 20, 2003. Petroni, Rita. How Do 3-Year Averages from the ACS Compare to Census 2000 Data? (Preliminary Results). Alexandria, Va.: October 20, 2003. Salvo, Joseph, City of New York, Planning Department. American Community Survey: Research by the Data User Community. Alexandria, Va.: October 20, 2003. Scarr, Harry A., Deputy Director, Census Bureau. Continuous Measurement. Association of Public Data Users, Washington, D.C.: October 16, 1994. Congressional Hearings and Testimony: Barron, William Jr., Acting Director, U.S. Bureau of the Census, before the U.S. House of Representatives, Committee on Government Reform, Subcommittee on the Census. The Census Bureau's Proposed American Community Survey (ACS), Serial 107-9. Washington, D.C.: June 13, 2001. Kincannon, Charles Louis, Director, U.S. Bureau of the Census, before the U.S. House of Representatives, Subcommittee on Technology, Information Policy, Intergovernmental Relations, and the Census. The American Community Survey: The Challenges of Eliminating the Long Form from the 2010 Census, Serial 108-97. Washington, D.C.: May 13, 2003. Prewitt, Kenneth, Director, U.S. Bureau of the Census, before the U.S. House of Representatives, Committee on Government Reform, Subcommittee on the Census. House Hearing on ACS July 20, 2000. The American Community Survey: A Replacement for the Census Long Form? Serial 106-246. Washington, D.C.: July 20, 2000. Other Reports and Papers: Kalton, Graham, and others. 
The American Community Survey: The Quality of Rural Data, Report of a Conference. Rockville, Md.: Westat, June 29, 1998. Nardone, Thomas, and others. Examining the Discrepancy in Employment Growth between the CPS and the CES. Washington, D.C.: FESAC, October 17, 2003. National Council on Disability. Improving Federal Disability Data. Washington, D.C.: January 9, 2004. ORC Macro. The American Community Survey: Challenges and Opportunities for HUD. Calverton, Md.: September 27, 2002. Vroman, Wayne. Comparing Labor Market Indicators from the CPS and ACS. Washington, D.C.: Urban Institute, September 2003. Westat Inc. The American Community Survey: A Report on the Use of Multi-Year Averages. Rockville, Md.: April 30, 1999. [End of section] Related GAO Products: 2010 Census: Cost and Design Issues Need to Be Addressed Soon. GAO-04-37. Washington, D.C.: January 15, 2004. Medicaid Formula: Differences in Funding Ability among States Often Are Widened. GAO-03-620. Washington, D.C.: July 10, 2003. Formula Grants: 2000 Census Redistributes Federal Funding Among States. GAO-03-178. Washington, D.C.: February 24, 2003. Major Management Challenges and Program Risks: Department of Commerce. GAO-03-97. Washington, D.C.: January 1, 2003. The American Community Survey: Accuracy and Timeliness Issues. GAO-02-956R. Washington, D.C.: September 30, 2002. Legal Authority for American Community Survey. B-289852. Washington, D.C.: April 4, 2002. Medicaid Formula: Effects of Proposed Formula on Federal Shares of State Spending. GAO/HEHS-99-29R. Washington, D.C.: February 19, 1999. Decennial Census: Overview of Historical Census Issues. GAO/GGD-98-103. Washington, D.C.: May 1, 1998. Poverty Measurement: Adjusting for Geographic Cost-of-Living Difference. GAO/GGD-95-64. Washington, D.C.: March 9, 1995. Status of the Statistical Community after Sustaining Budget Reductions. GAO/IMTEC-84-17. Washington, D.C.: July 18, 1984. FOOTNOTES [1] We discuss the other operations, which relate to the address list and the short-form census, in full in GAO, 2010 Census: Cost and Design Issues Need to Be Addressed Soon, GAO-04-37 (Washington, D.C.: Jan. 15, 2004). [2] We discuss the relative quality of the ACS and the long form in GAO, The American Community Survey: Accuracy and Timeliness Issues, GAO-02-956R (Washington, D.C.: Sept. 30, 2002), pp. 8-13. [3] The CPI is a national-level price index that BLS compiles. It also compiles separate price indexes for selected geographic areas, but these indexes do not measure differences in the level of prices among areas. [4] U.S. Census Bureau, American Community Survey Operations Plan, Release 1 (Washington, D.C.: March 2003), pp. 52-53. [5] This group's objective is to provide an open dialogue between its members and federal statistical agencies. See Council of Professional Associations on Federal Statistics, http://www.copafs.org (May 10, 2004). [6] See our recommendations in GAO-02-956R, pp. 25-26. For information on the Federal Agency Information Program, see Census Bureau, American Community Survey, http://www.census.gov/acs/www (May 10, 2004). [7] ICPE develops and disseminates annual "official" estimates of the total population and the distribution by age, sex, race, and Hispanic origin for the nation, state, counties, and functioning government units. The program is authorized by 13 U.S.C. §181, which requires the production of "current data on total population and population characteristics." 
The estimates of population and housing characteristics are as of July 1 of each year, using the usual resident concept for seasonal residents. For details on subcounty estimates, see U.S. Census Bureau, "Estimates and Projections Area Documentation: Subcounty Total Population Estimates," http://www.census.gov. [8] Susan Love, Donald Dalzell, and Charles Alexander, "Constructing a Major Survey: Operational Plans and Issues for Continuous Measurement," presented at the annual American Statistical Association meeting, Orlando, Florida, August 1995. [9] Charles H. Alexander and Signe Wetrogan, "Integrating the American Community Survey and the Intercensal Demographic Estimates Program," presented at the Joint Statistical Meetings, Indianapolis, Indiana, August 14, 2000. [10] See Bureau of Labor Statistics, "Labor Force and Employment Estimates Smoothed for Population Adjustments, 1990-2003," Washington, D.C., March 3, 2004. U.S. Department of Labor, Bureau of Labor Statistics, Demographics, Demographic Characteristics of the Labor Force (Current Population Survey), http://www.bls.gov/cps/cpspopsm.pdf (May 10, 2004). [11] Theresa J. DeMaio and Kristen A. Hughes, "Report of Cognitive Research on the Residence Rules and Seasonality Questions on the American Community Survey," U.S. Census Bureau, Statistical Research Division, Washington, D.C., July 2003. [12] DeMaio and Hughes, pp. 9-10. [13] Benjamin F. King, Chair, Panel on Research on Future Census Methods, National Academy of Sciences, letter to William Barron, Acting Director, U.S. Bureau of the Census, Washington, D.C., February 15, 2001, pp. 3-4. The National Academies, National Academies Press, 2010 Census Panel Letter Report (2001), http://books.nap.edu/html/2010_census_panel/letterreport.pdf (May 10, 2004). [14] BLS makes a similar adjustment to the average weekly earnings data from the monthly establishment survey. [15] For the HUD report, see ORC Macro, The American Community Survey: Challenges and Opportunities for HUD (Calverton, Md.: Sept. 27, 2002). For a complete discussion of the role of the inflation adjustment in differences between the ACS and CPS measures of income, see Kirby G. Posey, Edward Welniak, and Charles Nelson, "Income in the ACS: Comparisons to Census 2000," presented at the Joint Statistical Meetings, San Francisco, California, August 7, 2003. [16] For the procedures the Internal Revenue Service and the Department of Labor use, see SOI 2000: Corporation Income Tax Returns (Washington, D.C.: September 2003). Internal Revenue Service, Tax Statistics, Statistics of Income, SOI Products and Services, Corporation Tax Statistics--Complete Report Publications, http://www.irs.gov/taxstats/article/0,,id=112834,00.html (May 10, 2004), and U.S. Department of Labor, Pension and Welfare Benefits Administration, Private Pension Plan Bulletin: Abstract of 1998 Form 5500 Annual Reports, no. 11 (Washington, D.C.: winter 2001-02). http://www.dol.gov/ebsa/PDF/1998pensionplanbulletin.pdf (May 10, 2004). [17] For example, in GAO, Poverty Measurement: Adjusting for Geographic Cost-of-Living Difference, GAO/GGD-95-64 (Washington, D.C.: Mar. 9, 1995), we noted that experts generally agreed that it is appropriate to adjust state-level poverty counts for cost-of-living differences but that they differed on the most appropriate method of making such adjustments. 
In Medicaid Formula: Differences in Funding Ability among States Often Are Widened, GAO-03-620 (Washington, D.C.: July 10, 2003), we showed that using different cost-of-living adjustments at the state level significantly affected the amount of federal funding. [18] Charles Nelson and Kathleen Short, "The Distributional Implications of Geographic Adjustment of Poverty Thresholds," U.S. Census Bureau, Housing and Household Economics and Statistics Division, Washington, D.C., December 2003. [19] Kenneth Prewitt, Director, U.S. Bureau of the Census, before the U.S. House of Representatives, Committee on Government Reform, Subcommittee on the Census, Summary of House Hearing on ACS July 20, 2000, The American Community Survey: A Replacement for the Census Long Form? Serial 106-246 (Washington, D.C.: July 20, 2000). [20] Census Bureau officials indicated that some of the delay in completing the planned evaluation studies may have resulted from the Census Bureau's need to devote additional resources to completing the evaluation of the 2000 Census Accuracy, Coverage, and Evaluation program and to a survey to test the effect of conducting the ACS as a voluntary survey. [21] See Daniel L. Cork, Michael L. Cohen, and Benjamin F. King, eds., Planning the 2010 Census: Second Interim Report (Washington, D.C.: National Academies Press, 2003), p. 99. [22] In Statistical Policy Directive 14, OMB designated the CPS as the official source of statistical measures of poverty. The U.S. Department of Health and Human Services also designated the CPS as the source of poverty measures for its programs in "Annual Update of the HHS Poverty Guidelines," 67 Fed. Reg. 6931 (Feb. 14, 2002). [23] Cork, Cohen, and King, pp. 99-100. [24] Cork, Cohen, and King, p. 6. [25] See, for example, Charles Alexander, "A Prototype Continuous Measurement System for the U.S. Census of Population and Housing," CM-17, presented at the annual meeting of the Population Association of America, Miami, Florida, May 5, 1994, and Harry A. Scarr, Deputy Director, Census Bureau, "Continuous Measurement," Association of Public Data Users, Fredericksburg, Virginia, October 16, 1994. [26] GAO-04-37, pp. 11-12. [27] See Barry Edmonston and Charles Schultze, eds., Modernizing the U.S. Census (Washington, D.C.: National Academies Press, 1995), p. 9. The panel was mandated by Public Law 102-125 and funded by the Census Bureau. [28] Edmonston and Schultze, p. 3. [29] See National Research Council, Committee on National Statistics, The American Community Survey: Summary of a Workshop (Washington, D.C.: National Academy Press, 2001); Michael L. Cohen and Benjamin F. King, eds., Designing the 2010 Census: First Interim Report (Washington, D.C.: National Academy Press, 2000); Benjamin F. King, National Academy of Sciences, to William Barron, U.S. Bureau of the Census, February 15, 2001; Daniel L. Cork, Michael L. Cohen, and Benjamin F. King, eds., Planning the 2010 Census: Second Interim Report (Washington, D.C.: National Academies Press, 2003); Daniel L. Cork, Michael L. Cohen, and Benjamin F. King, eds., Reengineering the 2010 Census: Risks and Challenges (Washington, D.C.: National Academies Press, 2004); and Constance F. Citro, Daniel L. Cork, and Janet L. Norwood, eds., The 2000 Census: Counting under Adversity (Washington, D.C.: National Academies Press, 2004). [30] See, for example, Constance F. 
Citro and Graham Kalton, eds., Small-Area Income and Poverty Estimates: Priorities for 2000 and Beyond (Washington, D.C.: National Academy Press, 2000), and Thomas B. Jabine, Thomas A. Louis, and Allen L. Schirm, eds., Choosing the Right Formula: Initial Report (Washington, D.C.: National Academy Press, 2001). [31] The exceptions are the Census Bureau's inflation adjustments for dollar-denominated data items and the specific use of 2010 Census population and housing controls. [32] Charles Alexander, of the Census Bureau's Demographic Statistical Methods Division, who had directed most of its research on the ACS, prepared comments for the Census Bureau. See National Research Council, Committee on National Statistics, The American Community Survey: Summary of a Workshop, p. 5. [33] National Research Council, pp. 48-49. [34] National Research Council, pp. 1 and 3. [35] Charles Alexander, Technical Paper, prepared for Workshop on the American Community Survey, Committee on National Statistics, National Academy of Sciences, Washington, D.C., September 1998, p. 3-2. [36] National Research Council, p. 4. [37] National Research Council, pp. 4 and 5. The Census Bureau provided funding for the development of this model in fiscal year 1999, but there is no report that the model was completed. [38] Alexander, Technical Paper, pp. 3-7. [39] National Research Council, p. 26. [40] National Research Council, p. 48. [41] Michael L. Cohen and Benjamin F. King, eds., Designing the 2010 Census: First Interim Report (Washington, D.C.: National Academy Press, 2000), p. 2. The three other recommendations covered the master trace sample database, the 2000 Census administrative records research program, and the activities of local organizations that helped with the census count. [42] Cohen and King, p. 34. [43] Cohen and King, pp. 34-35. [44] Cohen and King, eds., p. 38. The Census Bureau has established the Federal Agency Information Program on the ACS in response to a recommendation we made in GAO-02-956R, p. 25. Information is at the Census Bureau's Web site at http://www.census.gov/acs/www/. [45] These recommendations were similar to recommendations we made in GAO-04-37, pp. 33-34. [46] King, to Barron, p. 3. [47] King, to Barron, p. 4. The other studies related to the 2000 Census mailing list and the effect of local partnerships on the 2000 Census collection process. [48] King, to Barron, p. 2. [49] Cork, Cohen, and King, p. 6. [50] Cork, Cohen, and King, pp. 99-102. As we noted above, we discussed issues related to planning for the 2010 Census in GAO-04-37. [51] Cork, Cohen, and King, p. 99. [52] Cork, Cohen, and King, p. 86. [53] Cork, Cohen, and King, p. 87. [54] Cork, Cohen, and King, p. 98. [55] Citro, Cork, and Norwood, eds., The 2000 Census: Counting under Adversity, p. 1. [56] Citro, Cork, and Norwood, pp. 2-3. [57] Citro, Cork, and Norwood, pp. 10-11. [58] Citro, Cork, and Norwood, p. 11. [59] Citro, Cork, and Norwood, p. 301. See recommendations 7.1 and 7.3. [60] The long form is also discussed in Joseph Salvo and Arun Peter Lobo, "The American Community Survey: Quality of Response by Mode of Data Collection in the Bronx Test Site," presented at 2002 Joint Statistical Meetings, New York, August 14, 2002; Margo Anderson, ed., Encyclopedia of the U.S. Census (Washington, D.C.: CQ Press, 2000); and GAO, Decennial Census: Overview of Historical Census Issues, GAO/GGD-98-103 (Washington, D.C.: May 1, 1998). [61] See GAO-02-956R. 
[62] Data items similar or identical to those collected on the long form are collected by the Annual Demographic Survey (CPS's March supplement), Annual Housing Survey, and other surveys. However, these surveys' samples limit the data they provide to the national level and selected states and metropolitan areas. Annual data for all small geographic areas are available from (1) administrative records on unemployment insurance and wages and federal income tax and Medicare records; (2) statistical series the Bureau of Economic Analysis (BEA) prepares, such as local area personal income data; and (3) model-based series such as the Census Bureau's Small Area Income and Poverty Estimates program. [63] Harry A. Scarr presented this proposal as Deputy Director of the Census Bureau at an Association of Public Data Users conference in Fredericksburg, Virginia, on October 16, 1994. [64] The Census Bureau also prepared a number of internal papers that evaluated the results of the 2000 Census long form and recommended changes that applied to the ACS. We discuss some of these papers in this report. [65] Most of the papers in this series are not available to the public (a few are on the Census Bureau's ACS Web site at http://www.census.gov/acs/www/), but the Census Bureau provided us with a complete set. [66] Some of the papers from this March 25, 1998, symposium are available on the Census Bureau's ACS Web site; several presented information from the 1996 ACS testing. [67] Susan Love, Donald Dalzell, and Charles Alexander, "Constructing a Major Survey: Operational Plans and Issues for Continuous Measurement," presented at the annual American Statistical Association meeting, Orlando, Florida, August 16, 1995. [68] Charles H. Alexander and Signe Wetrogan, "Integrating the American Community Survey and the Intercensal Demographic Estimates Program," presented at the Joint Statistical Meetings, Indianapolis, Indiana, August 14, 2000. [69] Charles H. Alexander, Scot Dahl, and Lynn Weidman, "Making Estimates from the American Community Survey," presented at the Annual American Statistical Association Meeting, Anaheim, California, August 13, 1997. [70] Mary Ellen Davis and Charles H. Alexander, "The American Community Survey: The Census Bureau's Plan to Provide Timely 21st Century Data," Delaware Dataline, Summer 1997. U.S. Census Bureau, American Community Survey, Advanced Methodology, Papers and Presentations, http://www.census.gov/acs/www (June 3, 2004). [71] Charles Alexander, "Recent Developments in the American Community Survey," presented at the American Statistical Association Meeting, Dallas, Texas, August 12, 1998. U.S. Census Bureau, American Community Survey, Advanced Methodology, Papers and Presentations, http://www.census.gov/acs/www (June 3, 2004). [72] Posey and Welniak, "Income in the ACS: Comparisons to the 1990 Census." [73] Shail Butani, Charles Alexander, and James Esposito, "Using the American Community Survey to Enhance the Current Population Survey: Opportunities and Issues," presented at the 1999 Federal Committee on Statistical Methodology Research Conference, Arlington, Virginia, November 15-17, 1999. [74] To assist in this research, BLS contracted with Wayne Vroman of the Urban Institute for his 2003 report, "Comparing Labor Market Indicators from the CPS and ACS." [75] Charles H. 
Alexander, "Still Rolling: Leslie Kish's 'Rolling Samples' and the American Community Survey," in Proceedings of Statistics Canada Symposium 2001: October 16-19 (Ottawa, Canada: Statistics Canada, 2002). [76] U.S. Census Bureau, Meeting 21st Century Demographic Data Needs, Report 1, Demonstrating Operational Feasibility (Washington, D.C.: July 2001). C2SS, conducted as part of the 2000 Census, was a national survey of about 700,000 households and designed to test the operational feasibility of collecting long-form data at the same time as, but separately from, the Decennial Census. Its questionnaire was essentially the same as the long form. The survey has been conducted annually since 2000. [77] U.S. Census Bureau, Meeting 21st Century Demographic Data Needs, Report 2, Demonstrating Survey Quality (Washington, D.C.: May 2002). [78] U.S. Office of Management and Budget, Measuring and Reporting Sources of Errors in Surveys, Statistical Policy Working Paper 31 (Washington D.C.: July 2001). [79] In 2000, the Census Bureau established an ACS Research and Evaluation Steering Committee to develop a series of reports on key results from the ACS development program. A team was to manage the program and identify key questions whose answers would demonstrate the adequacy of the ACS as a replacement for the Decennial Census long form. See U.S. Census Bureau, American Community Survey Development Report Series Program Plan (Washington, D.C.: rev. June 12, 2002), p. 5. [80] U.S. Census Bureau, American Community Survey Operations Plan, Release 1. [81] In April 2003, the Census Bureau let a contract with local experts to study and evaluate selected differences between 1999-2001 averages from four test sites and corresponding 2000 long-form data. [82] U.S. Census Bureau, American Community Survey Operations Plan, Release 1, p. 36. [83] Deborah H. Griffin, "Comparing Characteristics from the American Community Survey and Census 2000: Methodology," presented at Census Advisory Committee of Professional Associations Meetings, Washington, D.C., April 10-11, 2003, p. 2. [84] U.S. Census Bureau, ACS-2010 Consistency Review Plan (Washington, D.C.: October 1, 2003). [85] Lynn Weidman and Signe Wetrogan, "Enhancing the Intercensal Population Estimates Program with ACS Data: Summary of Research Projects," Census Advisory Committee of Professional Associations meeting, Washington, D.C., October 23, 2003. [86] Navarro, "American Community Survey: Use of Population Estimates as Controls in the ACS Weighting," presented at Census Bureau Advisory Committee of Professional Associations meeting, Washington, D.C., October 23, 2003. [87] Paula Schneider, Content and Data Quality in Census 2000, Census 2000 Testing, Experimentation, and Evaluation Program Topic Report No. 12 (Washington, D.C.: U.S. Census Bureau, January 22, 2004). [88] ICPE develops and disseminates annual estimates of the total population and the distribution by age, sex, race, and Hispanic origin for the nation, state, counties, and functioning government units. The program is authorized by 13 U.S.C. §181, which requires the production of "current data on total population and population characteristics." [89] For additional details, see U.S. Census Bureau, "Estimates and Projections Area Documentation: Subcounty Total Population Estimates," http://www.census.gov. [90] The initial ACS estimates for 2010 are to be released before the 2010 Census-based ICPE estimates are available. 
[91] Susan Love, Donald Dalzell, and Charles Alexander, "Constructing a Major Survey: Operational Plans and Issues for Continuous Measurement," presented at the annual American Statistical Association meeting, Orlando, Florida, August 16, 1995. [92] This May 14-15, 1998, conference was held in response to U.S. Senate Appropriations Committee Report 105-48, 105th Cong., 1st sess. (July 16, 1997), title II, p. 64, which had stated that "The outside evaluator should review the ACS to determine whether there is an antirural bias in its design." Graham Kalton and others prepared the conference report for the Census Bureau: The American Community Survey: The Quality of Rural Data (Rockville, Md.: Westat, June 29, 1998). [93] Alexander and Wetrogan, "Integrating the American Community Survey and the Intercensal Demographic Estimates Program." [94] ACS data on foreign-born persons were used to estimate the national levels of international migration that were incorporated into the intercensal estimates for 2003, based on the 2000, 2001, and 2002 ACS. [95] Navarro, "American Community Survey." [96] Weidman and Wetrogan, "Enhancing the Intercensal Population Estimates Program with ACS Data." [97] Schneider, Content and Data Quality in Census 2000. [98] Theresa J. DeMaio and Kristen A. Hughes, "Report of Cognitive Research on the Residence Rules and Seasonality Questions on the American Community Survey," U.S. Census Bureau, Statistical Research Division, Washington, D.C., July 2003. [99] DeMaio and Hughes, "Report of Cognitive Research on the Residence Rules and Seasonality Questions," pp. 9-10. [100] In contrast, a 2000 Census questionnaire received in late April 2000 that listed a resident aged 11 with an April 15, 1989, birthdate would be considered inconsistent because the person was aged 10 on census day. [101] Kirby G. Posey, Edward Welniak, and Charles Nelson, "Income in the American Community Survey: Comparisons to Census 2000," presented at the Joint Statistical Meetings, San Francisco, California, August 7, 2003. [102] Sharon M. Stern, "Counting People with Disabilities: How Survey Methodology Influences Estimates in Census 2000 and the Census 2000 Supplementary Survey," presented at the Joint Statistical Meetings, San Francisco, California, August 7, 2003. [103] Although data on group quarters were collected at the ACS test sites in 1999 and 2001, data on them were not collected in the ACS supplementary surveys, which began in 2000. The Census Bureau made this decision to avoid duplication with the 2000 Decennial Census and because it lacked funding to cover them in subsequent years. [104] Census Bureau, American Community Survey Operations Plan, Release 1. [105] U.S. Census Bureau, ACS-2010 Consistency Review Plan. [106] U.S. Census Bureau, 2010 Decision Memorandum Series No. 5. (Washington, D.C.: June 3, 2004) and 2010 Planning Memorandum Series No. 24 (Washington, D.C.: June 9, 2004). [107] For a complete discussion, see ORC Macro, The American Community Survey: Challenges and Opportunities for HUD (Calverton, Md.: Sept. 27, 2002). For a complete discussion of the role of the inflation adjustment in differences between the ACS and CPS measures of income, see Posey, Welniak, and Nelson, "Income in the American Community Survey." [108] ORC Macro, p. 16. [109] ORC Macro, p. 44. [110] ORC Macro, pp. 16-17. [111] ORC Macro, p. 44. [112] ORC Macro, p. 208. [113] See Posey, Welniak, and Nelson. [114] Posey, Welniak, and Nelson, p. 14. [115] Posey, Welniak, and Nelson, p. 15. 
[116] Charles Nelson and Kathleen Short, "The Distributional Implications of Geographic Adjustment of Poverty Thresholds," U.S. Census Bureau, Housing and Household Economics and Statistics Division, Washington, D.C., December 2003. [117] 64 Fed. Reg. 48759-48760 (Sept. 8, 1999). The Census Bureau made similar statements about the importance of comparisons in its request to OMB to extend approval of the supplementary survey forms in 67 Fed. Reg. 21629 (May 1, 2002). [118] Kenneth Prewitt, Director, U.S. Bureau of the Census, before the U.S. House of Representatives, Committee on Government Reform, Subcommittee on the Census, Summary of House Hearing on ACS July 20, 2000, The American Community Survey: A Replacement for the Census Long Form? Serial 106-246 (Washington, D.C.: July 20, 2000). [119] U.S. Census Bureau, Meeting 21st Century Demographic Data Needs: Implementing the American Community Survey, Report 1, Demonstrating Operational Feasibility. [120] U.S. Census Bureau, Meeting 21st Century Demographic Data Needs: Implementing the American Community Survey, Report 2, Demonstrating Survey Quality. [121] U.S. Census Bureau, American Community Survey Development Report Series Program Plan, an internal report prepared a month later, had called for completing one of these reports by the end of 2002. [122] U.S. Census Bureau, American Community Survey Operations Plan, Release 1. [123] GAO-02-956R, p. 11. [124] GAO-02-956R, p. 12. [125] Wayne Vroman, Comparing Labor Market Indicators from the CPS and ACS (Washington, D.C.: Urban Institute, September 2003). [126] Vroman, p. 23. [127] Schneider. [128] Posey, Welniak, and Nelson, "Income in the American Community Survey," p. 14. [129] Stern, "Counting People with Disabilities." [130] Lex Frieden, Chair, National Council on Disability, "Improving Federal Disability Data," Washington, D.C., January 8, 2004. National Council on Disability, Newsroom, Publications, 2004, http://www.ncd.gov/newsroom/publications/2004/publications.htm (May 11, 2004). [131] Cork, Cohen, and King, p. 99. [132] Cork, Cohen, and King, p. 99. [133] Graham Kalton and others, The American Community Survey: The Quality of Rural Data, A Report of a Conference (Rockville, Md.: Westat, June 29, 1998), p. 3. [134] Kalton and others, p. 12. [135] Kalton and others, p. 13. [136] Westat Inc., The American Community Survey: A Report on the Use of Multi-Year Averages (Rockville, Md.: April 30, 1999), p. 12. [137] Alexander, p. 6. [138] Alexander, p. 6. [139] Charles Alexander, "A Discussion of the Quality of Estimates from the American Community Survey for Small Population Groups," written August 26, 2002, for the Census Advisory Committee of Professional Associations meeting, Washington, D.C., October 2-3, 2002, p. 3. [140] GAO-02-956R and ORC Macro. [141] GAO-02-956R, p. 25. [142] GAO-02-956R, p. 15. [143] ORC Macro, p. vi. [144] ORC Macro, p. vi. [145] Since ORC Macro's study was issued, full implementation of the ACS has been delayed; the first 5-year averages will not be available until 2010. [146] Cork, Cohen, and King, pp. 99-100. [147] Cork, Cohen, and King, p. 6. [148] Cork, Cohen, and King, p. 86. [149] Cork, Cohen, and King, p. 87. [150] GAO-02-956R, pp. 25-26. Information on the Federal Agency Information Program is at Census Bureau, American Community Survey, http://www.census.gov/acs/www (May 11, 2004). [151] Census Bureau, American Community Survey Operations Plan, Release 1, p. 56. [152] Cork, Cohen, and King, p. 98. 
[153] The record of committee recommendations for the October 2003 meetings was not available. GAO's Mission: The Government Accountability Office, the investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through the Internet. GAO's Web site ( www.gao.gov ) contains abstracts and full-text files of current reports and testimony and an expanding archive of older products. The Web site features a search engine to help you locate documents using key words and phrases. You can print these documents in their entirety, including charts and other graphics. Each day, GAO issues a list of newly released reports, testimony, and correspondence. GAO posts this list, known as "Today's Reports," on its Web site daily. The list contains links to the full-text document files. To have GAO e-mail this list to you every afternoon, go to www.gao.gov and select "Subscribe to e-mail alerts" under the "Order GAO Products" heading. Order by Mail or Phone: The first copy of each printed report is free. Additional copies are $2 each. A check or money order should be made out to the Superintendent of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more copies mailed to a single address are discounted 25 percent. Orders should be sent to: U.S. Government Accountability Office 441 G Street NW, Room LM Washington, D.C. 20548: To order by Phone: Voice: (202) 512-6000: TDD: (202) 512-2537: Fax: (202) 512-6061: To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Web site: www.gao.gov/fraudnet/fraudnet.htm E-mail: fraudnet@gao.gov Automated answering system: (800) 424-5454 or (202) 512-7470: Public Affairs: Jeff Nelligan, managing director, NelliganJ@gao.gov (202) 512-4800 U.S. Government Accountability Office, 441 G Street NW, Room 7149 Washington, D.C. 20548:
