This is the accessible text file for GAO report number GAO-11-45
entitled '2010 Census: Key Efforts to Include Hard-to-Count
Populations Went Generally as Planned; Improvements Could Make the
Efforts More Effective for Next Census' which was released on December
14, 2010.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as
part of a longer term project to improve GAO products' accessibility.
Every attempt has been made to maintain the structural and data
integrity of the original printed product. Accessibility features,
such as text descriptions of tables, consecutively numbered footnotes
placed at the end of the file, and the text of agency comment letters,
are provided but may not exactly duplicate the presentation or format
of the printed version. The portable document format (PDF) file is an
exact electronic replica of the printed version. We welcome your
feedback. Please E-mail your comments regarding the contents or
accessibility features of this document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
United States Government Accountability Office:
GAO:
Report to Congressional Requesters:
December 2010:
2010 Census:
Key Efforts to Include Hard-to-Count Populations Went Generally as
Planned; Improvements Could Make the Efforts More Effective for Next
Census:
GAO-11-45:
GAO Highlights:
Highlights of GAO-11-45, a report to congressional requesters.
Why GAO Did This Study:
To overcome the long-standing challenge of enumerating hard-to-count
(HTC) groups such as minorities and renters, the U.S. Census Bureau
(Bureau) used outreach programs, such as paid advertising, and
partnered with thousands of organizations to enlist their support for
the census. The Bureau also conducted Service-Based Enumeration (SBE),
which was designed to count people who frequent soup kitchens or other
service providers, and the Be Counted/Questionnaire Assistance Center
(QAC) program, designed to count individuals who believed the census
had missed them. As requested, GAO assessed how the design of these
efforts compared to 2000 and the extent to which they were implemented
as planned. GAO reviewed Bureau budget, planning, operational, and
evaluation documents; observed enumeration efforts in 12 HTC areas;
surveyed local census office managers; and interviewed Bureau
officials.
What GAO Found:
The Bureau better positioned itself to reach out to and enumerate HTC
populations in 2010 in part by addressing a number of key challenges
from 2000. The Bureau's outreach efforts were generally more robust
compared to 2000. For example, compared to 2000, the Bureau used more
reliable data to target advertising; focused a larger share of its
advertising dollars on HTC groups, such as non-English-speaking
audiences; and strengthened its monitoring abilities so that the
Bureau was able to run additional advertising in locations where mail
response rates were lagging. The Bureau also significantly expanded
the partnership program by hiring about 2,800 partnership staff in
2010 compared to around 600 in 2000. As a result, staff were not
spread as thin. The number of languages they spoke increased from 35
in 2000 to 145 for the 2010 Census.
Despite these enhancements, the outreach efforts still faced
challenges. For example, while most of the partnership staff GAO
interviewed reported having mutually supportive relationships with
local census offices, about half of the local census office managers
surveyed were dissatisfied with the level of coordination, noting
duplication of effort in some cases. Additionally, a tracking database
that partnership staff were to use to help manage their efforts was
not user-friendly nor was it kept current.
The Bureau also improved the key enumeration programs aimed at HTC
groups and the efforts were generally implemented as planned, but
additional refinements could improve them for 2020. For example, the
Bureau expanded SBE training by teaching staff how to enumerate all
types of SBE facilities, which gave the Bureau more flexibility in
scheduling enumerations, and advance visits helped enhance service
providers' readiness for the enumeration. Nevertheless, while most
local census office managers were satisfied with SBE staffing levels,
pockets of dissatisfaction existed and observers noted what appeared
to be a surplus of enumerators with little work to do in some
locations. Overstaffing can lead to unnecessarily higher labor costs,
while understaffing can undermine the accuracy of the overall count.
It will therefore be important for the Bureau to review the SBE
results so it can staff the operation efficiently in 2020.
For the Be Counted/QAC program, the Bureau addressed visibility and
site selection challenges from 2000 by developing banners to
prominently display site locations and hours of operation and updating
site selection guidance. For 2010, the Bureau opened around 38,000
sites and completed the monthlong operation under budget. However, the
Bureau experienced recurring challenges with ensuring that the sites
were visible from street level and were in areas with potential for
high levels of activity, and the overall effort was resource intensive
relative to the average of 20 forms that were returned and checked in
from each site. Moving forward, it will be important for the Bureau to
explore ways to maximize the program's ability to increase the number
of forms checked in for 2020.
What GAO Recommends:
GAO recommends that the Bureau take steps to improve the effectiveness
of its outreach and enumeration activities aimed at HTC groups,
including developing a predictive model to better allocate paid
advertising funds, improving coordination between partnership and
local census staff, revisiting SBE staffing guidance, and ensuring Be
Counted/QAC sites are more visible and optimally located. Commerce
generally agreed with the overall findings and recommendations.
View GAO-11-45 or key components. For more information, contact Robert
Goldenkoff at (202) 512-2757 or goldenkoffr@gao.gov.
[End of section]
Contents:
Letter:
Background:
The Bureau's Outreach and Promotion Efforts Were Generally More Robust
Compared to Those in 2000 and Were Implemented as Planned, but They
Could Be Further Improved:
The Bureau Enhanced Enumeration Programs Aimed at HTC Groups;
Additional Refinements Could Improve Them for 2020:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Comments from the Department of Commerce:
Appendix II: GAO Contact and Staff Acknowledgments:
Related GAO Products:
Tables:
Table 1: Comparison of 2000 and 2010 Census Paid Media Budget:
Table 2: 2010 Census Paid Media Budget by Target Audience:
Table 3: Comparison of 2000 and 2010 Paid Media Activities:
Table 4: 2010 Partnership Activities Compared to Those in 2000:
Table 5: Comparison of 2000 and 2010 SBE Operations:
Table 6: Comparison of 2000 and 2010 Be Counted/QAC Programs:
Figures:
Figure 1: Reporting Structure for Regional Census Centers:
Figure 2: Be Counted Forms Prominently Displayed at Brooklyn Be
Counted/QAC Site:
Figure 3: Be Counted Forms Not Prominently Displayed at Fresno Be
Counted/QAC Site:
[End of section]
United States Government Accountability Office:
Washington, DC 20548:
December 14, 2010:
The Honorable Thomas R. Carper:
Chairman:
The Honorable John McCain:
Ranking Member:
Subcommittee on Federal Financial Management, Government Information,
Federal Services, and International Security:
Committee on Homeland Security and Governmental Affairs:
United States Senate:
The Honorable Darrell E. Issa:
Ranking Member:
Committee on Oversight and Government Reform:
House of Representatives:
The Honorable William Lacy Clay:
Chairman:
The Honorable Patrick T. McHenry:
Ranking Member:
Subcommittee on Information Policy, Census, and National Archives:
Committee on Oversight and Government Reform:
House of Representatives:
A complete and accurate census is becoming an increasingly daunting
task, in part because the nation's population is growing larger, more
diverse, and more reluctant to participate. When the census misses a
person who should have been included, it results in an undercount; an
overcount occurs when an individual is counted more than once. Such
errors are particularly problematic because of their differential
impact on various subgroups. Minorities, renters, and children, for
example, are more likely to be undercounted by the census while more
affluent groups, such as people with vacation homes, are more likely
to be enumerated more than once. As census data are used to apportion
seats in Congress, redraw congressional districts, and allocate
billions of dollars in federal assistance to states and local
governments, improving coverage and reducing the differential
undercount[Footnote 1] are critical.
To help reduce the undercount for the 2010 Census, the U.S. Census
Bureau (Bureau) embarked on a number of outreach and enumeration
activities aimed at getting the hard-to-count (HTC) populations to
participate in the census. On the outreach side, the Bureau
implemented a communications campaign that included paid media and
partnership activities (among others) to target advertisements and
engage government and community organizations in support of the
census. On the enumeration side, the Bureau relied on such efforts as
Service-Based Enumeration (SBE) to enumerate individuals residing in
less conventional housing, such as shelters and tent encampments, and
the Be Counted/Questionnaire Assistance Centers (QAC) programs to
count people who believed they did not receive a census form.
One key to a successful census is a high mail participation rate,
which helps the Bureau obtain more accurate data and reduce costs. The
mail participation rate--which the Bureau defines as the percentage of
forms mailed back by households that received them--was 74 percent for
2010, the same as in 2000.[Footnote 2] Considering the nation's
diversity and other sociodemographic trends that adversely affect
participation rates, this was an important accomplishment.
Because of your interest in the Bureau's efforts to boost census
participation and reduce the differential undercount, we reviewed the
design and implementation of key outreach and enumeration programs
aimed at HTC populations. In so doing, we paid particular attention to
assessing (1) how the design of these programs compared to 2000 and
(2) the extent to which the Bureau implemented these programs as
planned and where refinements might be needed should these efforts be
used in the 2020 Census.
This report is one of three we are releasing today.[Footnote 3] Of the
other two, one assesses the implementation of key field data
collection operations, and the other examines the implementation of
operations aimed at reducing census coverage errors. Both reports
identify preliminary lessons learned, as well as potential focus areas
for improvement for the 2020 Census.
To assess how the Bureau's efforts to reach out to and enumerate HTC
populations compared to 2000, we reviewed and analyzed budget,
planning, operational, and evaluative data and documents for the 2000
and 2010 paid media, partnership, SBE, and Be Counted/QAC activities.
We chose these activities because they constitute the majority of the
budget for outreach efforts or, according to the Bureau, were
enumeration activities that contributed to reducing the differential
undercount in 2000. For example, paid advertising accounted for
approximately 39 percent ($258,738,551) of the Bureau's originally
planned $660 million communication campaign effort, and the
partnership program accounted for over 56 percent ($364,331,089) of
the campaign.[Footnote 4] According to the Bureau, the Be Counted/QAC
program was an important part of the Bureau's efforts to enumerate
people often missed by the census, including people who had no usual
residence on Census Day, such as transients, migrants, or seasonal
farm workers. In addition, we attended presentations on the paid media
program by the Bureau and its contractor, DraftFCB, which assisted the
Bureau with creating promotional campaigns to research, develop, and
target the paid advertising efforts. We also reviewed Bureau,
Department of Commerce Inspector General, and our reports on the 2010
and 2000 censuses, and interviewed cognizant Bureau officials at
headquarters and local census offices.
To evaluate whether implementation proceeded as planned and identify
areas for improvement in 2020, we conducted 78 observations of
enumerators as they visited SBE facilities, including 22 targeted non-
sheltered outdoor locations (TNSOL)--such as parks and areas under
bridges--where people experiencing homelessness were sometimes
counted. We
interviewed enumerators in 12 urban local census offices across the
country, such as those in Boston, Chicago, Dallas, and Los Angeles,
and interviewed enumerators' supervisors, known as crew leaders, in
some of the local census offices we visited.[Footnote 5] Further, we
conducted observations of 51 Be Counted/QAC sites in 12 urban areas.
For the SBE and Be Counted/QAC observations, we selected offices
located in HTC areas as determined by data from the 2000 Census. While
these sites were not selected randomly, we considered factors such as
ethnic and geographic diversity in selecting them.
To gain greater insight on the partnership program, we interviewed 11
partnership staff who represented historically HTC populations and
different ethnic groups in the Bureau's Atlanta, Charlotte,
Philadelphia, and Los Angeles regions. We selected these regions based
on, in part, the allocation of partnership staff, but the sites were
not randomly selected and results cannot be generalized nationwide.
To obtain information on the local implementation of the Bureau's
outreach and enumeration efforts, we surveyed the Bureau's entire
population of 494 local census office managers (LCOM) using a series
of online questionnaires about their experience in managing local
census office activities and enumeration efforts. The surveys were
conducted in six waves from March through September 2010. Each survey
had a response rate of at least 70 percent and was thus sufficiently
reliable for providing evidence to support our findings, conclusions,
and recommendations.
We analyzed Bureau data on the distribution of Be Counted/QAC sites
among HTC census tracts and local census offices. We analyzed cost and
progress data for SBE and Be Counted/QACs and analyzed data on
partnership and Be Counted/QAC activities from the automated system
the Bureau used to track its partnership contacts, the Integrated
Partnership Contact Database.[Footnote 6] To further identify and
assess the Bureau's outreach and enumeration efforts for HTC
populations, we interviewed Bureau officials to obtain additional
details about paid media, partnerships, SBE, and Be Counted/QAC.
This report is part of our larger review of lessons learned from the
2010 Census that can help inform the Bureau's planning efforts for
2020. The Bureau is also evaluating its efforts to reach out to and
enumerate HTC populations and plans to issue the results by December
2012.
We conducted this performance audit from January 2010 to December 2010
in accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe
that the evidence obtained provides a reasonable basis for our
findings and conclusions based on our audit objectives.
On December 8, 2010, the Secretary of Commerce provided written
comments on the draft report, which are reprinted in appendix I. The
Department of Commerce generally agreed with the overall findings and
recommendations of the report.
Background:
To improve participation in the census among HTC groups as well as the
general population, the Bureau implemented a number of outreach and
enumeration activities from January 2008 through September 2010. In
this report, we focus on the following four efforts:
* paid media,
* partnerships,
* SBE, and
* Be Counted/QAC.
The four components of the outreach efforts, known collectively as the
Integrated Communications Campaign, were paid media, a partnership
program, public relations, and an educational program called Census in
Schools. According to Bureau officials, the components were designed
to work together to unify census messages and communicate them to
diverse audiences via various outlets in order to improve mail
response and reduce the differential undercount. An appropriation in
the American Recovery and Reinvestment Act of 2009 (Recovery Act)
allowed the Bureau to increase the communications campaign's initial
budget of $410 million by an additional $220 million.[Footnote 7]
The Bureau's regional census centers (RCC) were responsible for
administering the partnership program, with partnership coordinators
and team leaders at each RCC overseeing the work of the partnership
specialists and partnership assistants. Local census offices played a
more limited role in outreach efforts, and while the local census
offices reported to RCCs, they had a different reporting structure
than the partnership program.
SBE was meant to help ensure that people without conventional housing
were included in the count. From March 28 through March 30, 2010, the
Bureau attempted to enumerate those without conventional housing at
facilities where they received services or at outdoor locations, such
as parked cars, tent encampments, and streets. The Bureau
developed a list of potential outdoor locations based on several
sources, including 2000 Census data and input from community leaders.
The Bureau's Be Counted program, which ran from March 19 to April 19,
2010, was designed to reach those who may not have received a census
questionnaire, including people who did not have a usual residence on
April 1, 2010, such as transients, migrants, and seasonal farm
workers.[Footnote 8] The program made questionnaires available at
community centers, libraries, places of worship, and other public
locations throughout the country. Individuals were to pick up the
forms from these sites and mail the completed questionnaires to the
Bureau. Some of the sites also included a staffed QAC to help people,
especially those with limited English proficiency, complete their
questionnaires.
The Bureau's Outreach and Promotion Efforts Were Generally More Robust
Compared to Those in 2000 and Were Implemented as Planned, but They
Could Be Further Improved:
Paid Media Plans Built in Better Targeting:
The Bureau refined its paid media efforts for 2010, in part to address
challenges from the 2000 Census. For example, in 2000, to target
advertising to certain population groups and areas, the Bureau used
data on measures of civic participation, such as voting in elections.
However, the Bureau noted that civic participation did not appear to
be a primary indicator of an individual's willingness to participate
in the census. To better motivate participation among different
population groups, for 2010 the Bureau used, among other data sources,
actual participation data from the 2000 Census, as well as market and
attitudinal research that identified five mindsets people have about
the census. These mindsets ranged from the "leading edge" (those who
are highly likely to participate) to the "cynical fifth" (those who
are less likely to participate because they doubt the census provides
tangible benefits and are concerned that the census is an invasion of
privacy and that the information collected will be misused). The
Bureau used this information to tailor its paid media efforts.
Moreover, in 2000 the Bureau did not buy additional paid media in
areas with unexpectedly low participation rates. For 2010, the Bureau
set aside more than $7 million to rapidly target paid media in
response to specific events leading up to the census or to areas with
unexpectedly low mail participation rates.
Overall, the Bureau budgeted about $297.3 million for paid media in
2010, about $57 million (24 percent) more than it spent in 2000 in
constant 2010 dollars. The Bureau's 2010 paid media budget reflected
several increases. On a unit cost basis, spending increased from an
average of about $2.05 per housing unit in 2000 to $2.25 per housing
unit in 2010, in constant 2010 dollars. Also, the Bureau increased the
percentage of the budget devoted to media development costs from 33
percent in 2000 to 43 percent in 2010. Table 1 compares paid media
spending in 2000 and 2010.
Table 1: Comparison of 2000 and 2010 Census Paid Media Budget:
Component: Total paid media;
2000 paid media[A] (in 2010 dollars): $240,593,921;
2010 paid media[B]: $297,346,773;
Difference: $56,752,852 (24 percent).
Component: Paid media development (production, labor, research, and
other costs);
2000 paid media[A] (in 2010 dollars): $80,187,677;
2010 paid media[B]: $129,025,327;
Difference: $48,837,650 (61 percent).
Component: Paid media buys;
2000 paid media[A] (in 2010 dollars): $160,406,244;
2010 paid media[B]: $168,321,446;
Difference: $7,915,202 (4.9 percent).
Source: U.S. Census Bureau data.
[A] These are 2000 paid media actual costs.
[B] These are 2010 paid media estimated budget costs.
[End of table]
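The differences and percent changes in table 1 can be reproduced
directly from the budget figures. The short Python sketch below is
illustrative only; it uses the figures from the table, with the 2000
amounts already expressed in constant 2010 dollars:

[Illustrative code example:]

# Reproduce the differences reported in table 1 (2000 actual costs in
# 2010 dollars versus 2010 estimated budget costs).
budgets = {
    "Total paid media": (240_593_921, 297_346_773),
    "Paid media development": (80_187_677, 129_025_327),
    "Paid media buys": (160_406_244, 168_321_446),
}

for component, (cost_2000, cost_2010) in budgets.items():
    difference = cost_2010 - cost_2000
    percent_change = 100 * difference / cost_2000
    print(f"{component}: ${difference:,} ({percent_change:.1f} percent)")

# Prints $56,752,852 (23.6 percent), $48,837,650 (60.9 percent), and
# $7,915,202 (4.9 percent), which match the rounded figures in the table.

[End of code example]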
According to the Bureau, the cost increased for paid media development
in part because of the extensive research done to target the media to
specific groups and areas and because advertising was created in 12
more languages than in 2000. For example, to determine where paid
media efforts may have the greatest impact, the Bureau developed
predictive models based on 2000 Census data and the evaluations of the
partnership and paid media efforts from 2000. The models were provided
to its contractor, DraftFCB, to aid in making paid media decisions. By
better targeting paid media buys by area and message, the Bureau
expected to more effectively reach those who have historically been
the hardest to count. However, according to the Bureau, two factors--
the use of evaluations from 2000 that did not isolate the impact of
paid media from other components of the Bureau's outreach efforts,
such as the partnership program, and the age of the data used--may
have limited the models' ability to predict where paid media efforts
would have the greatest impact.
In a further effort to reach HTC groups, in 2010 the Bureau budgeted
more for paid media that targeted HTC groups, such as non-English-
speaking audiences, than for the national audience, which was not the
case in 2000, as shown in table 2.
Table 2: 2010 Census Paid Media Budget by Target Audience:
Component: Total paid media buys;
2000 paid media[A] (in 2010 dollars): $160,406,244;
2010 paid media[B]: $168,321,446;
Difference: $7,915,202 (4.9 percent).
Component: Mass audience (general population);
2000 paid media[A] (in 2010 dollars): $84,441,528;
2010 paid media[B]: $81,915,970;
Difference: -$2,525,558 (-3 percent).
Component: Ethnic/language audience;
2000 paid media[A] (in 2010 dollars): $75,964,716;
2010 paid media[B]: $86,405,476;
Difference: $10,440,760 (14 percent).
Source: U.S. Census Bureau data.
[A] These are 2000 paid media actual costs.
[B] These are 2010 paid media estimated budget costs.
[End of table]
Additionally, the Bureau strengthened its outreach efforts in 2010 by
improving its monitoring and evaluation activities. For example,
throughout the census the Bureau monitored the public's awareness and
attitudes toward the census via surveys and by tracking relevant
blogs. The Bureau used five sources of information, including national
polls and actual mail participation rates, to monitor metrics such as
individuals' understanding of the census, perceived benefits from
participating in the census, and barriers to participating in the
census. The Bureau used this information to identify markets and
groups where additional outreach was needed. Table 3 compares key
aspects of the 2000 and 2010 paid media activities.
Table 3: Comparison of 2000 and 2010 Paid Media Activities:
Paid media activities: Campaign development and targeting;
2000 Census: Targeted advertisements by segmenting the population into
three groups of census participation likelihood, based on measures of
civic participation in an area, such as school board involvement;
2010 Census: Targeted advertisements based in part on actual 2000
participation rates and attitudinal research.
Paid media activities: Campaign development and targeting;
2000 Census: Developed paid media messages in 16 languages;
2010 Census: Developed paid media messages in 28 languages.
Paid media activities: Campaign development and targeting;
2000 Census: No electronic and Web-based communications made available;
2010 Census: Electronic and Web-based communications made available.
Paid media activities: Campaign development and targeting;
2000 Census: Targeted the majority of paid media resources to national
mass audience;
2010 Census: Targeted the majority of paid media resources to
ethnic/non-English language audiences.
Paid media activities: Implementation;
2000 Census: Did not establish a media contingency fund for unexpected
events;
2010 Census: Established a $7.4 million rapid response/media
contingency fund to address unexpected events, such as lower response
rates in certain areas.
Paid media activities: Monitoring;
2000 Census: Did not have the ability to measure the effectiveness of
paid media during the census;
2010 Census: Used national polling and other methods to measure the
effectiveness of paid media during the census.
Paid media activities: Evaluation;
2000 Census: Evaluated the impact of the communications campaign as a
whole on awareness of the census;
2010 Census: Conducted controlled experiments measuring the impact of
increased paid media exposure on mail response and made plans to
evaluate the impact of individual components of the communications
campaign, including paid media, on awareness and likelihood to
participate in the census.
Source: GAO analysis of U.S. Census Bureau information.
[End of table]
Paid Media Used Market Research to Better Target HTC Populations:
The Bureau generally implemented its 2010 paid media campaign as
planned, targeting different segments of the HTC population. For
example, to reach younger audiences, which are typically hard to
count, the Bureau used new methods such as podcasts, YouTube videos,
and social media networks such as Facebook and Twitter in addition to
traditional TV and radio broadcasts. To reach people with limited
English proficiency, the Bureau ran banner advertisements on, for
example, Chinese language Web sites that linked directly to the
Chinese language page of the Bureau's own Web site and targeted local
radio advertisements to various ethnic audiences. Moreover, to reach
audiences through their media habits and interests, the Bureau
integrated census messages into regularly scheduled television
programming in an attempt to appeal to people in new and more personal
ways. For example, a Spanish-language soap opera made one of its
characters an enumerator.
The Bureau also took advantage of its improved monitoring capacity and
implemented a rapid response initiative to address markets with
lagging mail participation rates or unforeseen events that might have
affected response rates in certain markets. For example, as Census Day
approached, the Bureau continuously tracked the public's attitudes
toward the census to help determine the impact of its outreach
activities. The Bureau found that while the percentage of people
saying they would definitely participate in the census increased from
about 50 percent in December 2009 to about 89 percent in March 2010,
the data indicated that specific populations would have lower
participation rates. As a result, the Bureau ran additional
advertising targeted at the following groups, among others:
* 18- to 24-year-olds, whose stated intent to participate in the
census was not changing over time;
* English-speaking Hispanics who appeared less likely than Spanish-
speaking Hispanics to understand the benefits of census participation;
and
* Hasidic Jews in Brooklyn, New York, because mail participation rates
were lagging in neighborhoods known to have significant Hasidic
populations.
Further, in late March, the Bureau identified 23 specific media
markets with mail participation rates significantly below the national
average. Following rapid response efforts in these areas, 13 of these
markets showed a significant increase in mail participation rates
compared to the national average.
The Bureau originally budgeted $7.4 million for its rapid response
efforts, but added approximately $28 million from a separate
management reserve fund as data analysis showed a need for media
intervention, for a total of about $35 million. Of this $35 million,
about $31.8 million was allocated to new media purchases and about $3
million went to media production and other costs. Of the $31.8
million in media buys, the Bureau budgeted about $17.3 million (54
percent) for the general population and $14.5 million (45 percent)
for specific ethnic and language audiences.
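These amounts are internally consistent. The short Python sketch
below is an illustrative check only, using the rounded figures
reported in this paragraph rather than Bureau source data:

[Illustrative code example:]

# Rapid response funding, in millions of dollars, as reported above.
contingency_fund = 7.4      # originally budgeted for rapid response
management_reserve = 28.0   # approximate addition from the reserve fund
print(f"{contingency_fund + management_reserve:.1f}")  # 35.4 -- about $35 million

media_buys = 31.8           # allocated to new media purchases
general_population = 17.3   # buys aimed at the general population
ethnic_language = 14.5      # buys aimed at ethnic and language audiences
print(f"{general_population + ethnic_language:.1f}")   # 31.8 -- sums to the buy total
print(round(100 * general_population / media_buys))    # 54 (percent)
print(round(100 * ethnic_language / media_buys))       # 46 (percent); the 45 percent
                                                       # reported above presumably
                                                       # reflects unrounded amounts

[End of code example]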
The Bureau plans to assess the impact of the communications campaign
on respondent attitudes and behaviors. For example, to determine how
much it should invest in the paid media campaign, the Bureau
conducted an experiment in 2010 in which it flooded certain markets
with more paid advertising than was used in other, similar markets.
When the
evaluation of this research is completed as scheduled in 2012, it
could help the Bureau better determine whether greater levels of
advertising would be cost-effective in terms of increasing the mail
response rate of various racial and ethnic groups. Moving forward, it
will be important for the Bureau to use these evaluation results not
only to plan 2020 Census-taking activities but also, as it did for
2010, to develop a predictive model that could help the Bureau
determine which media outlets provide the best return on investment
in raising awareness of the census and encouraging participation
among specific demographic groups. The model could combine data from
the 2000 and 2010 enumerations and inform allocation decisions for
paid media.
Partnership Efforts Were More Comprehensive Than in the 2000 Census:
In designing the 2010 partnership program, the Bureau took a number of
steps aimed at expanding its reach and addressing challenges from the
2000 Census. For example, in 2000, the Bureau hired about 600
partnership staff in the field who were responsible for mobilizing
local support for the census by working with local organizations to
promote census participation. However, we reported in 2001 that
partnership specialists' heavy workload may have limited the level of
support they were able to provide individual local census offices.
[Footnote 9] To help improve its ability to mobilize local support for
2010, the Bureau created a new position, the partnership assistant,
[Footnote 10] and hired about 2,800 partnership staff, about five
times the number of partnership staff hired in 2000.[Footnote 11]
Thus, the Bureau increased the ratio of partnership staff per county
and staff were not spread as thin.
Additionally, for 2000, the Bureau developed a database to track,
plan, and analyze partnership efforts. We reported that the database
was not user-friendly, which led to inefficiencies and duplication of
effort.[Footnote 12] For 2010, the Bureau revamped the partnership
database to make it more user-friendly and to improve management's
ability to use the information to monitor the progress of partnership
activities. For example, while the 2000 database was mainly a catalog
of census partner organizations, the 2010 database was designed to
enable the Bureau to more actively manage the program in part by
generating reports on value-added goods and services that partners
provided, such as free training space. Table 4 compares key aspects of
the 2000 and 2010 partnership activities.
Table 4: 2010 Partnership Activities Compared to Those in 2000:
Partnership program activities: Implementation;
2000 Census: Hired about 600 partnership staff;
2010 Census: Hired about 2,800 partnership staff.
Partnership program activities: Implementation;
2000 Census: Recruited about 140,000 partner organizations;
2010 Census: Recruited more than 255,000 partner organizations.
Partnership program activities: Implementation;
2000 Census: Partnership staff spoke 35 languages;
2010 Census: Partnership staff spoke 145 languages.
Partnership program activities: Monitoring;
2000 Census: Did not establish metrics to measure partners' value-
added contributions and had limited real-time tracking of partnership
activities;
2010 Census: Established metrics to measure value-added contributions
of partners and real-time tracking of partnership activities.
Partnership program activities: Monitoring;
2000 Census: Developed a partnership database to track partnership
efforts. Bureau staff reported that the database was cumbersome and
not user-friendly;
2010 Census: Revamped partnership database by, among other things,
allowing for up-to-date monitoring of partner activity and a new Web-
based interface.
Partnership program activities: Evaluation;
2000 Census: Evaluated the impact of the communications campaign as a
whole on awareness of the census, but had no ability to isolate the
effect of partnership efforts;
2010 Census: Plans to evaluate the impact of individual components of
the communications campaign on awareness of and likelihood to
participate in the census, including the impact of the partnership
program on raising awareness and affecting the participation rate.
Source: GAO analysis of U.S. Census Bureau information.
[End of table]
The Partnership Program Was Significantly Expanded, but Coordination
and Monitoring Issues from 2000 Persisted:
Aided by Recovery Act funding that allowed it to increase its
presence in local communities, the Bureau recruited over 100,000 more
partner organizations than in 2000 and increased by over 100 the
number of languages spoken by partnership staff. The
Bureau estimated that it would spend about $280 million on partnership
program costs from fiscal years 2007 through 2011, including $120
million from the Recovery Act--an increase of 54 percent from 2000.
[Footnote 13] To expand partnership activities in HTC areas, the
Bureau used its allocation of Recovery Act-funded partnership staff in
regions with large HTC populations. As a result, while in 2000 the
average ratio was one partnership staff member for every five
counties, in 2010 the average ratio was almost one partnership staff
member for every county.
Partnership specialists conducted outreach activities that addressed
the concerns of HTC communities in their areas. For example, one
partnership specialist in the Atlanta region organized a conference of
leaders in the Vietnamese community to ease their concerns about the
confidentiality of census data. Another partnership specialist in the
Los Angeles region leveraged the credibility of several large national
Iranian and Arab organizations to help convince local community
leaders that the census was mandated by law and that their
constituents should complete and return census forms. Further, an LCOM
in the Dallas region told us that partnership specialists worked to
get a letter from the mayor that helped enumerators gain access to
local gated communities and apartment complexes.
Coordination Issues Persisted Despite Additional Bureau Guidance:
During the 2000 Census, LCOMs we surveyed said that the reporting
structure for partnership specialists may have led to communication
and coordination hurdles between the partnership staff and local
census office staff. As a result, we recommended that the Bureau
explore ways to increase the coordination and communication between
the partnership specialists and the LCOMs.[Footnote 14] To address
coordination and communication challenges in 2010, the Bureau
developed additional guidance for partnership specialists and LCOMs,
revised partnership training materials, and held meetings between
regional operations staff and partnership staff to discuss ways to
enhance communications. For example, the Bureau revised the LCOMs'
handbook to explain that partnership specialists and local census
office staff have a responsibility to work together to ensure that
they do not duplicate each other's efforts. In addition, the
partnership training manual specifically stated that partnership
specialists should participate in local census office management
meetings, provide management teams with their schedules of planned
meetings and activities in advance, and update LCOMs on their
completed activities.
Moreover, most of the partnership staff we interviewed reported
working closely or having mutually supportive relationships with local
census office staff. For example, partnership staff in the Atlanta and
Charlotte regions said that they attended training with local census
office staff, and one partnership specialist told us that training
gave them a better understanding of the roles and responsibilities of
local census offices.
However, LCOMs we surveyed provided a more mixed view of the
coordination and communication between the partnership program and
local census offices. On the one hand, 39 percent of 395 LCOMs
responding to our March survey said they were generally or very
satisfied with partnership staff's assistance with local
challenges.[Footnote 15] In addition, some managers provided positive
comments in the open-ended section of the survey about partnership
staff's assistance. For example, one LCOM commented that partnership
staff assisted with local census office recruiting activities, such as
setting up and providing materials for promotional events. In another
example, a manager from the Boston region said that the local census
office staff and the partnership specialist worked as one team and
contributed to the success of the census. These results varied
regionally, with more satisfaction in the Bureau's Boston, Los
Angeles, and Dallas regions than in the Philadelphia and New York
regions.
On the other hand, the results of our survey of LCOMs also highlight
areas for improvement. In March, 50 percent of 393 LCOMs responding
said they were generally or very dissatisfied with coordination
between local census offices and partnership staff, and a similar level
of dissatisfaction was found in a follow-up survey we conducted in May
after the nonresponse follow-up operation started.[Footnote 16] Among
the responses of those LCOMs who elaborated on their satisfaction with
coordination between local census offices and partnership staff, a key
theme was a lack of cooperation or interaction between the partnership
and local census office staffs. A manager from the Chicago region said
that though the partnership specialist was good, the organizational
structure and upper management did not allow for proper interaction.
The manager said that at first, communication between the local census
office staff and the partnership specialist was prohibited by the
partnership specialist team leader, which impeded the local census
office's ability to make valuable community connections.
One reason for the coordination challenges between local census
offices and partnership staff could be their different reporting
structures. As shown in figure 1, LCOMs and partnership specialists
report to different officials, and the official who oversees both
positions is two levels above the LCOM and three levels above the
partnership specialist.
Figure 1: Reporting Structure for Regional Census Centers:
[Refer to PDF for image: organizational structure]
Top level:
Regional directors.
Second level, reporting to Regional directors:
Assistant regional census managers.
Third level, reporting to Assistant regional census managers:
* Area manager:
- Local census office manager;
* Partnership coordinator:
- Senior partnership specialist; Partnership specialist.
Source: U.S. Census Bureau.
[End of figure]
According to Bureau officials, this reporting structure was
established to allow partnership specialists to coordinate their
efforts with other partnership specialists in the same geographical
areas and share common problems and solutions. Further, some
partnership specialists were responsible for reaching out to specific
ethnic groups in areas covered by different local census offices,
making it logistically difficult for the specialists to report to one
local census office.
But among the LCOMs who elaborated on their responses to our survey, a
key theme was dissatisfaction with this reporting structure. For
example, one manager reported that the partnership program and local
census office operations are too disconnected, adding that at times
both partnership staff and local census office staff were doing the
same tasks. The manager said that the partnership program was an
essential part of a successful census, but only when performed in
conjunction with local census office operations. Another manager said
that the partnership program needs a direct link to the local census
office and suggested that a position such as an assistant manager for
partnership be added to the local census office staff. Such a
position, the manager explained, would solidify the communication
between the partnership program and the local census office.
Regardless of the management structure, more positive experiences
seemed to result when LCOMs and partnership specialists dovetailed
their efforts. Better communication between
partnership specialists and LCOMs may have enhanced the Bureau's
capacity to reduce duplicative efforts, close any gaps in outreach to
community organizations with significant HTC populations, and leverage
opportunities to achieve a more complete and accurate count.
Despite Revamping, the Partnership Database Remained Problematic:
The partnership tracking database could also benefit from refinements.
Despite improvements, partnership staff raised concerns about its user-
friendliness similar to those reported in 2000. In 2010, all the
partnership specialists we interviewed reported that data entry was
time consuming, and 8 of the 11 partnership staff we interviewed
reported that they needed help with data entry in order to keep the
database current. The Bureau expected to use the partnership database
to more accurately monitor and improve partnership efforts nationally;
thus, the difficulty partnership staff had in updating the system is
noteworthy.
Initially, no partnership assistants were authorized to access the
database because the Bureau wanted to ensure that data were entered
into the system consistently. The Bureau was also concerned about the
additional costs associated with purchasing licenses for the large
number of partnership assistants. However, in response to regional
partnership staff's concerns over the partnership specialists'
struggles to update the database in a timely manner, the Bureau
procured approximately 400 licenses for select partnership assistants
in August 2009. But in interviews from March through May 2010,
partnership specialists told us that they continued to experience
difficulty meeting the data entry requirements.
Further, Bureau managers could not be sure if information in the
partnership database was up-to-date. Bureau officials told us that
they expected partnership specialists to immediately log any contact
they had with a partner into the database. However, our analysis of
reports from the database showed that, on average, about 35 percent
of users did not update the database in a given week between March 4
and April 22, 2010.
responsible for managing the partnership program, because the
partnership data were not always current, they took the extra step of
organizing weekly telephone calls between headquarters and regional
partnership staff in order to gain the most up-to-date information on
partnership activities. More current information during a crucial time
period around Census Day, April 1, could have better positioned the
Bureau to quickly identify and address problem areas. Further, Bureau
managers would likely have had better data for redeploying partnership
resources to low-responding areas with significant HTC populations
during different census operations.
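A metric like the weekly update rate described above can be derived
by counting, for each week of the period, the users with no logged
contacts. The Python sketch below is illustrative only; the record
layout, function name, and sample data are assumptions for
demonstration and do not reflect the Bureau's actual database design:

[Illustrative code example:]

from datetime import date, timedelta

def weekly_nonupdate_rates(contacts, start, weeks):
    """For each one-week window, return the share of users with no
    logged contact. contacts is an iterable of (user_id, contact_date)
    tuples standing in for entries in a partnership tracking database."""
    users = {user for user, _ in contacts}
    rates = []
    for week in range(weeks):
        week_start = start + timedelta(weeks=week)
        week_end = week_start + timedelta(days=7)
        active = {user for user, day in contacts
                  if week_start <= day < week_end}
        rates.append(1 - len(active) / len(users))
    return rates

# Hypothetical records covering roughly the March 4 through April 22,
# 2010, period discussed above (seven one-week windows).
records = [
    ("specialist_a", date(2010, 3, 5)),
    ("specialist_b", date(2010, 3, 20)),
    ("specialist_c", date(2010, 4, 2)),
]
print(weekly_nonupdate_rates(records, start=date(2010, 3, 4), weeks=7))

[End of code example]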
Aligning the Delivery of Promotional Materials with the Hiring of
Partnership Staff Could Foster More Effective Relationships with
Partner Organizations:
Although the Bureau developed English and foreign language promotional
materials--both in hard copy and for the Bureau's Web page--for
partnership specialists and assistants to use when recruiting partner
organizations, the materials were not available when partnership
specialists were first hired. Eight of the 11 partnership specialists
and assistants we interviewed reported that because promotional
materials were not available when needed, it was more difficult for
them to build relationships with potential partners. Specifically, the
Bureau began hiring partnership specialists in January 2008. However,
delivery of the promotional materials did not start until April 2009,
more than a year after partnership specialists first came on board.
Although this still left a year until Census Day, by not having
promotional materials on hand when partnership staff first began their
work, the Bureau may have missed opportunities to develop and
strengthen relationships with organizations that had the ability to
influence census participation among HTC groups.
Further, three of the eight partnership staff who worked with non-
English-speaking communities said it was difficult to obtain in-
language materials when needed. For example, one partnership employee
in the Los Angeles region reported being unable to engage Korean
churches until after January 2010 when the needed in-language
materials first became available (according to Bureau officials, in-
language materials took longer to develop than English language
materials because of the need to ensure accurate translations).
Bureau officials acknowledged that the schedule for hiring partnership
staff and the delivery of promotional materials were not well aligned.
In the interim, the Bureau provided partnership staff with talking
points to help them reach out to organizations in the early phase of
the program.
Moving forward, it will be important for the Bureau to take a fresh
look at recurring problems in the partnership program, as well as
reconsider time frames for the availability of promotional materials.
Through improving communication and coordination between partnership
and local census office staff, developing a user-friendly database to
more effectively monitor the program's progress, and ensuring that
promotional materials are available for distribution when partnership
specialists are first hired, the Bureau would better position itself
to promote the census to HTC populations.
The Bureau Enhanced Enumeration Programs Aimed at HTC Groups;
Additional Refinements Could Improve Them for 2020:
Aspects of 2010 SBE Were Refined to Address Implementation Issues from
2000 and Better Enumerate HTC Groups:
To improve its ability to count individuals without conventional
housing, the Bureau made a number of improvements to SBE, many of
which were designed to address challenges experienced in 2000. For
example, in 2000, SBE enumerators were not trained to enumerate all
types of SBE facilities, which limited the times when enumeration
could occur. In response to service providers' requests for more
flexibility on scheduling enumeration during the 3-day operation, the
Bureau trained census workers to enumerate all types of SBE
facilities. This change made training more consistent nationwide and
enabled the Bureau to better accommodate last-minute schedule changes.
Further, in some cases in 2000, the supply of census forms and
training materials provided to the local offices was not adequate. In
2010, the Bureau reduced the number of form types used for enumerating
individuals at SBE facilities from four to a single multipurpose form.
According to Bureau officials, this change allowed them to provide an
adequate number of forms to local census offices and also helped
increase efficiency.
The Bureau took several steps that helped it identify a larger number
of SBE facilities in 2010 than in 2000, thereby positioning the Bureau
to conduct a more complete count. The actual number of SBE facilities
the Bureau enumerated in 2000 was 14,817, whereas for 2010 the Bureau
planned to enumerate 64,626 sites--more than four times the number
previously enumerated.[Footnote 17] The steps included working more
closely with
local and national partner organizations and assigning partnership
assistants a role in identifying service-providing facilities. The
Bureau also developed better guidance for partnership assistants to
identify TNSOLs, relying in part on input from partner organizations,
such as church groups and service providers that were familiar with
outdoor areas where people often spent the night. Further, the Bureau
used public mailings and technology, such as the Internet, to find a
broader spectrum of facilities, as compared to local telephone
listings that were used in 2000. Table 5 compares key aspects of the
2000 and 2010 SBE operations.
Table 5: Comparison of 2000 and 2010 SBE Operations:
SBE activities: Planning and training;
2000 Census: Used four different types of questionnaires to enumerate
SBE facilities;
2010 Census: Used one questionnaire to minimize confusion and
facilitate the availability of supplies in a timely manner.
SBE activities: Planning and training;
2000 Census: Did not consolidate training for SBE facilities;
2010 Census: Consolidated training for staff enumerating people living
in group situations such as those in SBE facilities, thereby enabling
enumerators to work on multiple operations and all types of SBE
facilities.
SBE activities: Planning and training;
2000 Census: Questionnaires and training materials were insufficient,
untimely, or both;
2010 Census: Materials were generally timely and sufficient.
SBE activities: Planning and training;
2000 Census: Conducted advance visits to identify the population to be
enumerated and issues that could affect enumeration;
2010 Census: Same as 2000.
SBE activities: Planning and training;
2000 Census: Made no additions to list of SBE facilities and TNSOLs
after the enumeration date;
2010 Census: Allowed additions to list of SBE facilities and TNSOLs
through the last day of SBE enumeration.
SBE activities: Planning and training;
2000 Census: Allowed no flexibility for facilities on when they would
be enumerated;
2010 Census: Provided facilities with flexibility on when they would
be enumerated.
SBE activities: Planning and training;
2000 Census: Identified SBE sites by working with local governments
and community-based organizations, reviewing facility listings from
other census operations, and having local staff review the yellow
pages;
2010 Census: Expanded efforts to identify SBE sites by providing
partnership staff with more guidance, including identifying TNSOLs,
and by having headquarters staff work more closely with regional and
local staff to develop a more complete list.
SBE activities: Evaluation;
2000 Census: Assessment included an examination of duplicate
questionnaires and quality assurance procedures. Used results for
future planning;
2010 Census: Assessment will include (1) final workload volumes,
costs, and quality assurance results; (2) information collected from
debriefings; and (3) lessons learned. Plans to use results for future
planning.
Source: GAO analysis of U.S. Census Bureau information.
[End of table]
The Bureau Generally Implemented SBE Consistent with Its Operation
Plans but Experienced Continuing Challenges:
The Bureau generally implemented the SBE operation as planned,
completing the 3-day operation on schedule, and spending $10.9
million, slightly more than the $10.6 million budgeted for the
operation. However, while the overall budget estimate for the 2010 SBE
operation was more accurate than in 2000, the actual costs for local
census offices in urban HTC areas were almost double the amount
budgeted--$3.6 million in actual costs compared to the $1.9 million
budgeted.
[Footnote 18] Bureau officials said they will examine the data further
to determine why the budget was exceeded in urban HTC areas. We have
noted the Bureau's difficulties in developing accurate cost estimates
for several other Bureau operations, and the cost overrun in urban HTC
areas is another example of this.[Footnote 19]
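As a minimal illustration of the variances described above--using only
the rounded dollar figures cited in this report, not the Bureau's cost
records or estimating methodology--the following Python sketch computes
the overall and urban HTC budget overruns:

    # Illustrative arithmetic only; amounts (in millions of dollars) are
    # the rounded figures cited above, not Bureau cost data.
    budgeted_total, actual_total = 10.6, 10.9
    budgeted_urban_htc, actual_urban_htc = 1.9, 3.6

    def percent_over(budgeted, actual):
        # Percent by which actual spending exceeded the budgeted amount.
        return (actual - budgeted) / budgeted * 100

    # Overall SBE: roughly 3 percent over budget.
    print(f"Overall SBE overrun: {percent_over(budgeted_total, actual_total):.1f} percent")
    # Urban HTC offices: roughly 89 percent over budget, i.e., almost double.
    print(f"Urban HTC overrun: {percent_over(budgeted_urban_htc, actual_urban_htc):.1f} percent")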
As in 2000, our observers noted that enumerators were professional,
responsible, knowledgeable, and highly committed to fulfilling their
responsibilities. For example, during heavy rain in the Boston area,
enumerators remained focused on counting individuals living under
overhangs and stairwells, despite the difficult conditions. Our
observers in Brooklyn reported the same of enumerators there, although
enumeration of the outdoor locations was delayed one night because of
adverse weather conditions. Further, one of our observers reported
that in Los Angeles, cultural advocates--individuals the Bureau hired
to accompany enumerators and facilitate access to certain communities--
helped ease potentially tense situations.
As described below, based on our observations and the results of the
LCOM survey, SBE generally went well, and in some areas the Bureau
appears to have addressed challenges it experienced in 2000.
Enumeration Supplies Were Generally Adequate:
Enumerators we spoke with reported having enough forms at 68 of the 78
sites we visited. Also, 76 percent of 359 LCOMs who responded to our
question on the timing of the delivery of questionnaires and other
enumeration supplies were generally or very satisfied. In contrast,
during the 2000 Census, our observers noted that the timing of
questionnaires and training materials was not always adequate at the
locations they visited, which impeded enumerators' ability to conduct
their work in a timely manner.
Advance Visits Helped Enhance Service Providers' Readiness for
Enumeration:
Our observers reported that facilities were prepared for SBE
enumeration in 35 of 56 visits to SBE facilities. Furthermore, 73
percent of 356 LCOMs who responded to our question about the readiness
of SBE facilities were generally or very satisfied. In instances where
facilities were not prepared, there appears to have been an
expectation or communication gap. Despite advance visits from the
Bureau, one representative at a Baltimore facility said she was not
aware that census workers were expected, and would not allow
enumeration to take place because it would disrupt the individuals'
dinner and medication treatments. She was not receptive to the workers
returning later the same evening. In another case, a Boston facility
manager was not aware that the enumeration was to take place, but
allowed the census workers to proceed. Bureau officials said that in
some instances facility staff may not have communicated previous
agreements for conducting the enumeration to new or other staff on
duty at the time of the enumeration.
Training Material Was Tailored to Accommodate Local Conditions:
Of the 359 LCOMs who responded, 65 percent were generally or very
satisfied that the content of SBE training materials was tailored
to accommodate local conditions, such as taking into account whether
an area was urban or rural. In 2000, enumerators expressed concern
that the training they received did not always adequately prepare them
for the wide range of scenarios they encountered.
Despite these successes, the Bureau experienced some procedural and
operational challenges during SBE implementation, some of which were
similar to the Bureau's experience in 2000.
Enumerators Did Not Always Follow Procedures:
The Bureau's policy, as set out in its SBE enumeration manual,
stipulates that when individuals state that they have already been
enumerated elsewhere, the enumerator still must attempt to complete a
questionnaire.[Footnote 20] While enumerators adhered to this
procedure at about two-thirds of the facilities we visited, we found
that in 26 of 78 visits enumerators did not attempt to enumerate
individuals who told them they had already completed a questionnaire
at another location. When individuals refuse to be enumerated,
regardless of the reason, the Bureau's guidance instructs enumerators
to ask the facility's contact person for information about the
individual. If a contact person is not available, the enumerator
should attempt to complete as much of the questionnaire as possible
through observation. By not always following these procedures,
enumerators may have missed individuals who should have been
enumerated, and the extent to which the accuracy of the count was
affected is unknown.
Enumerators Did Not Always Fulfill Agreements:
As mentioned previously, Bureau officials visited SBE facilities to
make agreements with service providers on conducting the actual
enumeration. Our observers noted that in 15 of 78 site visits,
enumerators did not arrive as scheduled at shelter locations. One of
these instances occurred in Washington, D.C., where the facility
manager had instructed the clientele who typically frequent that
location to make an effort to be present when the enumerator arrived.
According to the facility manager, the enumerator did not arrive at
the scheduled time. In another instance, a facility manager at a
Boston site told our observers that she was concerned that enumerators
had arrived earlier than the agreed-upon time. She explained that her
clientele consisted of emotionally disturbed women, many of whom had
fears of authority. Thus, she said she would have preferred more time
to prepare the women for the impending visit.
When enumerators do not fulfill commitments, the missed appointments
and the need to reschedule could make the enumeration more burdensome
to service providers and detract from the Bureau's reputation.
Determining Appropriate Staffing Levels for SBE Sites Was Sometimes
Problematic:
The mobile nature of the SBE population and other factors make it
difficult to precisely determine the number of enumerators that should
be sent to a particular site, and sending either too many or too few
enumerators has consequences. Although the Bureau has guidance on
staffing ratios for enumerating different types of group quarters,
including service-based facilities, this guidance did not always
result in optimal staffing levels at shelters and TNSOLs. Overstaffing can
lead to unnecessarily higher labor costs and poor productivity, while
understaffing can affect the Bureau's ability to obtain a complete
count at a particular site.
Our observers and those in the Department of Commerce's Office of
Inspector General both reported overstaffing as an issue at SBE
locations. For example, at one of our SBE site visits, approximately
30 enumerators reported to the same shelter in Atlanta to conduct the
enumeration. Unsure of how to proceed, the enumerators waited for over
an hour before a crew leader instructed over half of those present to
leave; by that point, no work had taken place.
Similarly, the Department of Commerce Inspector General's staff
observed long periods of inactivity at sites and increased operational
costs as a result.[Footnote 21]
Also, while most LCOMs we surveyed were satisfied with SBE staffing
levels, pockets of dissatisfaction existed at some locations. Of the
LCOMs responding to our survey in April, 81 percent of 361 were
generally or very satisfied with the number of enumerators hired to
complete the SBE workload, 10 percent of managers said they were
generally or very dissatisfied, and 9 percent of managers said they
were neither satisfied nor dissatisfied. Among managers who elaborated
on their satisfaction level with the SBE operation, a key theme that
emerged was overstaffing. One
manager, elaborating on his response, said that he sent a detailed
cost and benefit document to higher-level Bureau officials to
demonstrate that the number of enumerators needed for the SBE
operation in his local area should be reduced, but his request was
denied. In another instance, a manager said he was required to train
and hire at least 100 more enumerators than he felt were necessary.
Given the Bureau's constitutional mandate to enumerate the country's
entire population and the difficulty of enumerating the SBE
population, it is not unreasonable for the Bureau to err on the side
of over- rather than understaffing SBE to help ensure a complete count.
Going forward, as part of the Bureau's plans to examine SBE costs,
schedule, training, and staffing, it will be important for the Bureau
to determine the factors that led to less-than-optimal staffing levels
and use the information to help determine staffing levels for SBE in
2020.
Be Counted/QAC Programs Were Implemented as Planned, but Visibility
Issues Remain a Concern:
For 2010, the Bureau developed plans that, according to Bureau
officials, were designed to address challenges that the Be Counted/QAC
programs faced during the 2000 Census, such as (1) visibility of
sites, (2) ability of the public to find where the Be Counted/QAC
sites were located, and (3) monitoring of site activity. In 2000, for
example, several sites we visited lacked signs publicizing the sites'
existence, which greatly reduced visibility. In some sites, census
questionnaires were in places where people might not look for them,
such as the bottom of a shelf. We reported that the Bureau had
problems with keeping site information current, and as a result,
changes in the information about the program's site location or points
of contact were not always available to the public.[Footnote 22] To
address these issues, in 2010, the Bureau created banners for display
in public areas of Be Counted/QAC sites, developed a Web page with
locations and hours of the sites, and updated the guidance for site
selection. Table 6 compares key aspects of the 2000 and 2010 Be
Counted/QAC programs.
Table 6: Comparison of 2000 and 2010 Be Counted/QAC Programs:
Be Counted/QAC activities: Planning and site selection;
2000 Census: Selected sites via a joint effort between partnership
specialists and partner organizations. No role for local census office
staff;
2010 Census: Selected sites via joint effort between partnership
specialists and local census office staff with input from partner
organizations.
Be Counted/QAC activities: Implementation;
2000 Census: Had a goal to establish about 66,000 locations. Census
data indicated that 28,632 were established;
2010 Census: Had a goal to establish 40,000 Be Counted and QAC sites.
Census preliminary data indicated that 38,827 sites were established.
Be Counted/QAC activities: Implementation;
2000 Census: Staffed sites with paid employees and volunteers, which
led to inconsistent service;
2010 Census: Staffed sites solely with paid employees to ensure
consistent service.
Be Counted/QAC activities: Implementation;
2000 Census: Forms available in 6 languages--English, Spanish,
Chinese, Korean, Vietnamese, and Tagalog. Language assistance guides
available in 37 languages;
2010 Census: Forms available in 6 languages--English, Spanish,
Chinese, Vietnamese, Korean, and Russian. Language assistance guides
available in 59 languages.
Be Counted/QAC activities: Implementation;
2000 Census: Did not provide Web page for public to locate Be Counted/
QAC locations;
2010 Census: Established a Web page that helped the public locate Be
Counted/QAC locations.
Be Counted/QAC activities: Implementation;
2000 Census: Did not issue official signage identifying Be Counted/QAC
sites;
2010 Census: Issued uniform signage for prominent display at sites.
Be Counted/QAC activities: Monitoring;
2000 Census: Attempted to monitor site performance, but the number of
Be Counted/QAC sites was more than could be handled;
2010 Census: Monitored sites by designating Be Counted clerks in local
census offices to regularly visit sites and check staffing and
adequacy of materials.
Be Counted/QAC activities: Evaluation;
2000 Census: Relied on cost and workload data;
2010 Census: Same as 2000.
Be Counted/QAC activities: Evaluation;
2000 Census: Assessment included final workload volumes, costs, and
quality assurance results. Used for future planning;
2010 Census: Same as 2000.
Source: GAO analysis of U.S. Census Bureau information.
[End of table]
The Bureau generally implemented the Be Counted/QAC program as
planned. The Bureau opened around 38,000 sites, conducted the program
as scheduled from March 19 through April 19,[Footnote 23] and
completed it under budget.
The Bureau reported spending $38.7 million versus the $44.2 million
budgeted. Bureau officials commented that the program came in under
budget in part because the Bureau staffed the sites with one QAC
representative for 15 hours a week, rather than with 1.5
representatives, as originally budgeted. This allowed the Bureau to
spend less on payroll and training, according to officials.
Overall, the majority of the 51 sites we visited were staffed as
planned, and census materials and forms were available at most sites
in multiple languages. Further, the Bureau's preliminary data for 2010
show that overall activity at Be Counted/QAC sites increased, with
about 1 million more forms picked up in 2010 than the approximately
1.7 million picked up in 2000--an increase of 62 percent.
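These approximations are internally consistent. The following Python
sketch, which uses only the rounded figures cited above and is
illustrative rather than a reconstruction of Bureau data processing,
shows the implied 2010 pickup total:

    # Rough consistency check using the approximate figures cited above;
    # not Bureau data.
    forms_2000 = 1.7e6      # approximate forms picked up in 2000
    increase = 0.62         # reported percentage increase for 2010
    forms_2010 = forms_2000 * (1 + increase)
    print(f"Implied 2010 pickups: {forms_2010:,.0f}")           # about 2.75 million
    print(f"Additional forms: {forms_2010 - forms_2000:,.0f}")  # about 1.05 million, i.e., roughly 1 million more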
Visibility of Be Counted/QAC Sites Was Poor at Many Sites Visited:
Visibility is key to the effectiveness of Be Counted/QAC sites because
it is directly related to people's ability to find them. According to
the Bureau's Be Counted job aid guidance, Be Counted clerks in local
census offices were responsible for monitoring sites and ensuring that
banners were displayed at Be Counted/QAC locations. In many locations
we visited, the Bureau's efforts to raise the visibility of sites were
evident to our observers. For example, 23 of the 51 Be Counted/QAC
sites visited displayed the banners the Bureau developed to advertise
the sites' existence. More generally,
however, there were areas for improvement. For example, our observers
noted problems with "street-level" visibility in 26 of 51 Be Counted/
QAC sites visited. At one site in Atlanta, for instance, no signs were
visible from the main road to publicize the existence of the Be
Counted site. In addition, our observers visited two sites in Brooklyn
that were not visible from the street. In some cases, the banners
provided by the Bureau to advertise the location of a site were not
used or displayed prominently upon entering a location that housed a
site. At another site in Washington, D.C., our observers noted that
the banner was rolled up and leaning against a file cabinet and
consequently was not clearly visible to the public.
In addition, Be Counted/QAC sites were sometimes in obscure locations
within the buildings in which they were housed. For example, at sites
located in the basement or rear of the building, we observed no
signage directing people to the Be Counted/QAC site. Further, forms
and materials available at Be Counted/QAC sites were not always
clearly identified and thus could have been overlooked. Figure 2 is an
example of a Be Counted site in Brooklyn that was prominently visible
at a library. Importantly, the banner was clearly displayed to draw
attention to the site, and the time that staff would be in attendance
was also obvious.
Figure 2: Be Counted Forms Prominently Displayed at Brooklyn Be
Counted/QAC Site:
[Refer to PDF for image: photograph]
Notation on photograph: Be Counted forms are clearly displayed on the
table.
Source: GAO.
[End of figure]
In contrast, figure 3 shows a Be Counted site in Fresno, California,
that was difficult to find in a barbershop. Note that the area had no
signage to draw attention to the site and the forms were scattered
about and difficult to find.
Figure 3: Be Counted Forms Not Prominently Displayed at Fresno Be
Counted/QAC Site:
[Refer to PDF for image: photograph]
Notation on photograph: Be Counted forms are stored under the shelf.
Source: GAO.
[End of figure]
In those instances when the Be Counted/QAC sites were not clearly
visible to the public, the Bureau may have missed one of the last
opportunities to directly enumerate individuals. Moving forward, the
Bureau should consider more effective ways to monitor site visibility
at Be Counted/QAC sites. For example, the Bureau could include
visibility as one of the areas to monitor when census staff conduct
their regular monitoring of the Be Counted sites.
Site Selection Guidance Does Not Consider Potential Activity Levels:
Along with visibility, the procedures used to select Be Counted/QAC
sites are also key to the effectiveness of the program because they
affect the extent to which sites are easily accessible to targeted
populations. To improve selection of Be Counted/QAC sites in 2010, the
Bureau revised its guidance on Be Counted/QAC site criteria by
emphasizing locating sites in HTC areas and specifying the types of
local census office areas where sites should be located (e.g., urban/
HTC and urban/metropolitan). However, the guidance did not provide
direction on identifying sites in locations with the likelihood of
higher levels of activity, which would increase the potential for
individuals to pick up Be Counted forms. Nevertheless, Bureau
officials said they encouraged staff to take advantage of locations
that were free of charge as well as locations likely to have higher
levels of activity.
Activity levels at the Be Counted/QAC sites varied based on
information from Bureau staff and our observations. QAC
representatives at 8 of 43 QAC-only sites visited told us that their
sites had moderate to high levels of activity, while 12 of 43 QAC
representatives told us their sites had low levels of activity.
[Footnote 24] For example, a QAC representative at one facility in
Phoenix and another in Atlanta said they had to frequently restock Be
Counted forms and that they provided many people with assistance.
Another QAC representative in Dallas said that he assisted up to 30
people in one day at the Be Counted/QAC site he staffed. Conversely, a
QAC representative in Miami said that the LCOM was considering the
site for closure because very few people visited the location and used
the services. Similarly, a firefighter at a Dallas QAC site observed
that the site had been open for 11 days, that no one had visited it
during that time, and that the box containing materials accompanying
the questionnaires (i.e., pens and language reference documents) was
unopened. Additionally, during a June debriefing, where QAC
representatives discussed their experiences with Bureau officials, the
QAC representatives commented on the problem of low activity at some
Be Counted/QAC sites, according to Bureau officials.
Preliminary data on forms returned and checked in also revealed
changes in activity levels at Be Counted/QAC sites for 2010. For
example, an average of 20 forms were returned and checked in from each
Be Counted/QAC site in 2010, down from an average of 28 in 2000. Given
that the operation was conducted over a 30-day period, that translates
to less than 1 form per day per site. While this difference might
reflect the fact that the address list in 2010 was better than in 2000
and that fewer households were missed, it also indicates that the
operation was very resource intensive relative to the number of forms
that were returned.
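To make the per-site yield concrete, the sketch below restates the
averages cited above in per-day terms and derives rough implied
totals. The totals are back-of-the-envelope inferences from the
reported averages and site counts, not figures the Bureau reported:

    # Back-of-the-envelope yield calculation from the averages cited above.
    # Implied totals are inferences, not reported Bureau totals.
    avg_forms_per_site = {"2000": 28, "2010": 20}   # forms returned and checked in per site
    sites = {"2000": 28632, "2010": 38827}          # established sites (2010 figure is preliminary)
    operation_days = 30

    for year in ("2000", "2010"):
        per_day = avg_forms_per_site[year] / operation_days       # 2010 works out to less than 1 form per day
        implied_total = avg_forms_per_site[year] * sites[year]
        print(f"{year}: {per_day:.2f} forms per site per day; implied total of about {implied_total:,} forms")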
According to Bureau planning guidance, both local census office staff
and partnership specialists were jointly responsible for identifying
Be Counted/QAC sites, and local census office staff were responsible
for monitoring the sites. However, a number of LCOMs we surveyed in
May expressed concern about assistance from partnership specialists in
identifying Be Counted/QAC sites. While 32 percent of 369 LCOMs who
responded to our survey were generally or very satisfied with the
assistance they received from partnership specialists for identifying
sites, 57 percent of managers responding indicated that they were
generally or very dissatisfied. Among the responses of those LCOMs who
elaborated on their satisfaction level with the partnership program,
one key theme that emerged was dissatisfaction with the Be Counted/QAC
sites identified. For example, one LCOM commented that many of the Be
Counted/QAC sites were in poor locations and were not in areas with
the highest need. To the extent that the Be Counted/QAC sites were
established in locations with low activity, the result was lower
productivity and higher costs to the Bureau in the form of wages paid
to census employees to staff and monitor the sites. There were also
opportunity costs in monitoring a site with low activity when a site
in a different location could have produced better results.
The Be Counted/QAC program, in concept, may be a reasonable effort to
include people who might have otherwise been missed by the census.
However, it was also a resource-intensive operation: relatively few
questionnaires were generated per site, on average, once the cost and
effort of identifying, stocking, staffing, monitoring, and maintaining
the sites are considered. More will be
known about the effectiveness of the Be Counted/QAC program when the
Bureau determines how many Be Counted/QAC forms resulted in adding
people and new addresses to the census. Similar to SBE, the Bureau
plans to assess the Be Counted/QAC program by examining costs,
schedule, training, and staffing. Moving forward, it will also be
important for the Bureau to explore ways to maximize the Be
Counted/QAC program's ability to increase the number of forms returned
and checked in from the target population for the 2020 Census and,
ultimately, determine whether fewer but more strategically placed
sites could produce more cost-effective results.
Conclusions:
In 2010, the Bureau was better positioned to reach out to and
enumerate HTC populations compared to 2000 in large part because its
plans addressed a number of the challenges experienced in the previous
decennial. For example, the Bureau focused more of its resources on
targeting paid media efforts to HTC groups, employed partnership staff
with a wider range of language capabilities, and developed a more
comprehensive list of service-providing facilities that likely
enhanced its capacity to enumerate people lacking conventional
housing. Further, from an operational perspective, the Bureau
generally implemented its HTC outreach and enumeration efforts
consistent with its operational plans, completing them within schedule
and budget. Overall, while the full impact of these efforts will not
be known until after the Bureau completes various assessments,
including an evaluation of the extent and nature of any under- and
overcounts, the Bureau's rigorous effort to raise awareness, encourage
participation, and enumerate HTC populations likely played a key role
in holding mail participation rates steady in 2010 for the overall
population, a significant achievement given the various factors that
were acting against an acceptable mail response in 2010.
Still, certain aspects of the Bureau's outreach and enumeration of HTC
populations need attention. Key focus areas for outreach efforts
include (1) ensuring the Bureau is using paid media efficiently to
improve response rates, (2) improving the coordination between
partnership and local census office staff to leverage opportunities to
achieve a more accurate and complete count, (3) improving the
partnership database to enhance its use as a management tool, and (4)
making promotional materials available to partnership staff when they
begin their work to improve their ability to develop relationships
with partner organizations. For enumeration activities, by determining
the factors that led to the SBE staffing issues at some locations and
revising site selection guidance for Be Counted/QAC sites based on
visitation and other applicable data, the Bureau may increase the
overall value of special enumeration activities.
More generally, the Bureau invested more resources in reaching out to
and enumerating HTC groups in 2010 but achieved the same overall
participation rate as in 2000. This trend is likely to continue as the
nation's population gets larger, more diverse, and more difficult to
count. As the Bureau looks toward the next national headcount, it
plans to use the results of its evaluations for input into 2020
planning. At the same time, it will be important for the Bureau to go
beyond that and use 2010 evaluation results to gain a better
understanding of the extent to which the various special enumeration
activities aimed at HTC groups produced a more complete and accurate
census. More specifically, better information on the value added by
each special enumeration activity could help the Bureau allocate its
resources more cost effectively. This may include changing existing
programs to increase efficiency or undertaking new special enumeration
efforts altogether.
Recommendations for Executive Action:
To help improve the effectiveness of the Bureau's outreach and
enumeration efforts, especially for HTC populations, should they be
used again in the 2020 Census, we recommend that the Secretary of
Commerce require the Under Secretary for Economic Affairs as well as
the Director of the U.S. Census Bureau to take the following seven
actions:
To improve the Bureau's marketing/outreach efforts:
* Use evaluation results, response rate, and other data to develop a
predictive model that would inform decisions on how much and how best
to allocate paid media funds for 2020.
* Develop mechanisms to increase coordination and communication
between the partnership and local census office staff. Possible
actions include offering more opportunities for joint training,
establishing protocols for coordination, and more effectively
leveraging the partnership contact database to better align
partnership outreach activities with local needs.
* Improve the user-friendliness of the partnership database to help
ensure more timely updates of contact information and enhance its use
as a management tool.
* Ensure that promotional materials, including in-language materials
for the partnership program, are available when partnership staff are
first hired.
To improve some of the Bureau's key efforts to enumerate HTC
populations:
* Assess visitation, response rate, and other applicable data on Be
Counted/QAC locations and use that information to revise site
selection guidance for 2020.
* Determine the factors that led to the staffing issues observed
during SBE and take corrective actions to ensure more efficient SBE
staffing levels in 2020.
* Evaluate the extent to which each special enumeration activity
improved the count of traditionally hard-to-enumerate groups and use
the results to help inform decision making on spending for these
programs in 2020.
Agency Comments and Our Evaluation:
On December 8, 2010, the Secretary of Commerce provided written
comments on the draft report, which are reprinted in appendix I. The
Department of Commerce generally agreed with the overall findings and
recommendations of the report. In addition, the department noted that
its Economics and Statistics Administration (ESA) has management
oversight responsibility for the Bureau and asked that we include ESA
in our recommendation. We revised the report to reflect this comment.
We are sending copies of this report to the Secretary of Commerce, the
Director of the U.S. Census Bureau, the Under Secretary for Economic
Affairs, and interested congressional committees. The report also is
available at no charge on GAO's Web site at [hyperlink,
http://www.gao.gov].
If you or your staff have any questions about this report, please
contact me at (202) 512-2757 or goldenkoffr@gao.gov. Contact points
for our Offices of Congressional Relations and Public Affairs may be
found on the last page of this report. GAO staff who made major
contributions to this report are listed in appendix II.
Signed by:
Robert Goldenkoff:
Director, Strategic Issues:
[End of section]
Appendix I: Comments from the Department of Commerce:
United States Department Of Commerce:
The Secretary of Commerce:
Washington, D.C. 20230:
December 8, 2010:
Mr. Robert Goldenkoff:
Director:
Strategic Issues:
United States Government Accountability Office:
Washington, DC 20548:
Dear Mr. Goldenkoff:
The Department of Commerce appreciates the opportunity to comment on
the United States Government Accountability Office (GAO) draft report
titled "2010 Census: Key Efforts to Include Hard-to-Count Populations
Went Generally as Planned; Improvements Could Make the Efforts More
Effective for Next Census" (GAO 11-45). Our comments on this report
are enclosed.
Sincerely:
Gary Locke:
Enclosures:
[End of letter]
Department of Commerce:
Comments on the United States Government Accountability Office Draft
Report Titled "2010 Census: Key Efforts to Include Hard-to-Count
Populations Went Generally as Planned; Improvements Could Make the
Efforts More Effective for Next Census" (GAO 11-45); December 2010:
The Department of Commerce thanks the GAO for their extensive efforts
in examining these 2010 Census activities and for their ongoing
efforts to help us develop a successful plan for the 2020 Census.
The Census Bureau generally agrees with the overall findings in this
report and with the recommendations regarding matters we should study
for the 2020 Census. Our comments follow.
* Page 44, first paragraph: "...we recommend that the Secretary of
Commerce require the Director of the U.S. Census Bureau to take the
following seven actions:..."
Response: The Secretary of Commerce should require the Under Secretary
for Economic Affairs as well as the Census Director. The Under
Secretary heads the Economics and Statistics Administration (ESA),
which has management oversight responsibility for the Census Bureau
and has been actively engaged in planning for the 2020 Census.
Page 45, second paragraph from bottom of page: "We are sending copies
of this report to the Secretary of Commerce, the Director of the U.S.
Census Bureau, and interested congressional committees."
Response: Please also send a copy of the report to the Under Secretary
for Economic Affairs.
[End of section]
Appendix II: GAO Contact and Staff Acknowledgments:
GAO Contact:
Robert Goldenkoff, (202) 512-2757 or goldenkoffr@gao.gov:
Staff Acknowledgments:
In addition to the contact named above, Signora May, Assistant
Director; Peter Beck; David R. Bobruff; Benjamin C. Crawford;
Shaunessye Curry; Kathleen Drennan; Elizabeth Fan; Robert Gebhart;
Guillermo Gonzalez; Thomas Han; Paul Hobart; Brian James; Paul Kinney;
Elke Kolodinski; Kirsten B. Lauber; Veronica Mayhand; Karine McClosky;
Catherine Myrick; Keith O'Brien; Michael Pahr; Melanie Papasian;
Rudolfo Payan; Stacy Spence; Barbara Steel-Lowney; Travis Thomson;
Cheri Y. Truett; Timothy Wexler; Monique B. Williams; Carla Willis;
and Katherine Wulff made key contributions to this report.
[End of section]
Related GAO Products:
2010 Census: Data Collection Operations Were Generally Completed as
Planned, but Long-standing Challenges Suggest Need for Fundamental
Reforms. GAO-11-193. Washington, D.C.: December 14, 2010.
2010 Census: Follow-up Should Reduce Coverage Errors, but Effects on
Demographic Groups Need to Be Determined. GAO-11-154. Washington,
D.C.: December 14, 2010.
2010 Census: Cooperation with Enumerators Is Critical to Successful
Headcount. GAO-10-665T. Washington, D.C.: April 30, 2010.
2010 Census: Plans for Census Coverage Measurement Are on Track, but
Additional Steps Will Improve Its Usefulness. GAO-10-324. Washington,
D.C.: April 23, 2010.
2010 Census: Data Collection Is Under Way, but Reliability of Key
Information Technology Systems Remains a Risk. GAO-10-567T.
Washington, D.C.: March 25, 2010.
2010 Census: Operational Changes Made for 2010 Position the U.S.
Census Bureau to More Accurately Classify and Identify Group Quarters.
GAO-10-452T. Washington, D.C.: February 22, 2010.
2010 Census: Efforts to Build an Accurate Address List Are Making
Progress, but Face Software and Other Challenges. GAO-10-140T.
Washington, D.C.: October 21, 2009.
2010 Census: Census Bureau Continues to Make Progress in Mitigating
Risks to a Successful Enumeration, but Still Faces Various Challenges.
GAO-10-132T. Washington, D.C.: October 7, 2009.
2010 Census: Communications Campaign Has Potential to Boost
Participation. GAO-09-525T. Washington, D.C.: March 23, 2009.
2010 Census: Fundamental Building Blocks of a Successful Enumeration
Face Challenges. GAO-09-430T. Washington, D.C.: March 5, 2009.
2010 Census: Census Bureau Needs Procedures for Estimating the
Response Rate and Selecting for Testing Methods to Increase Response
Rate. GAO-08-1012. Washington, D.C.: September 30, 2008.
2010 Census: The Bureau's Plans for Reducing the Undercount Show
Promise, but Key Uncertainties Remain. GAO-08-1167T. Washington, D.C.:
September 23, 2008.
[End of section]
Footnotes:
[1] The differential undercount describes subpopulations that are
undercounted at a different rate than the total population.
[2] The 2000 mail participation rate was 74 percent for the short-form
only. In 2000, the census included a long-form that asked for
information that was not included on the short-form. The 2000 mail
participation rate when including both the long-form and the short-
form was 69 percent. The 2010 Census did not use a long form.
[3] GAO, 2010 Census: Data Collection Operations Were Generally
Completed as Planned, but Long-standing Challenges Suggest Need for
Fundamental Reforms, GAO-11-193 (Washington, D.C.: Dec. 14, 2010).
GAO, 2010 Census: Follow-up Should Reduce Coverage Errors, but Effects
on Demographic Groups Need to be Determined, GAO-11-154 (Washington,
D.C.: Dec. 14, 2010). For additional products, see the Related GAO
Products section at the end of this report.
[4] The other approximately 5 percent of the communications campaign
budget was targeted to public relations, at about 3.9 percent
($25,610,360), and the Census in Schools Program, at about 1.7 percent
($11,320,000).
[5] Additionally, we visited local census offices in Atlanta,
Baltimore, Brooklyn, Fresno, Miami, Phoenix, San Francisco and
Washington, D.C. The Bureau had 494 local census offices nationwide.
Local census offices recruited and trained enumerators and checked in
completed questionnaires, among other tasks.
[6] The Integrated Partnership Contact Database tracks and monitors
activities of partner organizations. Available in January 2009, the
database contains real-time information on the number of partner
organizations, populations served, demographics, value-added
contributions, and constituent reach.
[7] Pub. L. No. 111-5, div. A, tit. II, 123 Stat. 115, 127. The Bureau
received $1 billion from the Recovery Act. In the conference report
accompanying the Act, the conferees stated that "of the amounts
provided, up to $250,000,000 shall be for partnership and outreach
efforts to minority communities and hard-to-reach populations." H.R.
Conf. Rep. No. 111-16, at 417 (2009). According to the Bureau, it
planned to use $220 million for expanding the communications campaign;
of this amount, $120 million was to enhance the partnership program.
The Bureau planned to use $30 million for expanding its
coverage follow-up operation, where census workers follow up to
resolve conflicting information provided on census forms.
[8] Cases where the respondents indicated that they had no usual
address will be assigned to higher-level geographic units, such as
state and county, and allocated to census counts accordingly.
[9] GAO, 2000 Census: Review of Partnership Program Highlights Best
Practices for Future Operations, GAO-01-579 (Washington, D.C.: Aug.
20, 2001).
[10] Partnership assistants were responsible for assisting partnership
specialists in scheduling and conducting outreach activities.
[11] The staffing level was substantially higher than the Bureau
originally planned for 2010 because of additional funds used to
enhance the partnership program from the Recovery Act. See footnote 7.
[12] GAO-01-579.
[13] From October 1997 through September 2000, the Bureau spent about
$182 million on its partnership program in constant 2010 dollars.
[14] GAO-01-579.
[15] The number of managers who responded to individual survey
questions varied by question.
[16] Nonresponse follow-up is the largest and most costly field
operation, where census workers follow up in person with households
that did not respond to the census forms that were mailed to them.
[17] At the time of our work, the Bureau had not yet produced a final
number of facilities actually enumerated.
[18] In 2000, the budget for SBE, in constant 2010 dollars, was $52.2
million, and the expended amount, in constant 2010 dollars, was $12.1
million. Budgeted and actual dollars spent for 2000 and 2010 were
rounded.
[19] GAO, 2010 Census: Efforts to Build an Accurate Address List Are
Making Progress, but Face Software and Other Challenges, GAO-10-140T
(Washington, D.C.: Oct. 21, 2009), and 2010 Census: Census Bureau
Should Take Action to Improve the Credibility and Accuracy of Its Cost
Estimate for the Decennial Census, GAO-08-554 (Washington, D.C.: June
16, 2008).
[20] The Bureau has procedures in place to remove duplications at a
later date.
[21] Department of Commerce, Office of Inspector General, 2010 Census:
Quarterly Report to Congress, Final Report No. OIG-197914 (May 2010).
[22] GAO, 2000 Census: Actions Taken to Improve the Be Counted and
Questionnaire Assistance Center Programs, GAO/GGD-00-47 (Washington,
D.C.: Feb. 25, 2000).
[23] The Bureau opened Be Counted sites on February 26, 2010, in areas
where Bureau staff were hand delivering questionnaires to housing
units with mostly rural route and PO Box addresses.
[24] QAC representatives at 23 of 43 sites did not comment on the
level of activity at their sites. Of the 51 Be Counted/QAC sites
visited, 43 were QAC-only sites and 8 were Be Counted-only sites.
[End of section]
GAO's Mission:
The Government Accountability Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the performance
and accountability of the federal government for the American people.
GAO examines the use of public funds; evaluates federal programs and
policies; and provides analyses, recommendations, and other assistance
to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its core
values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each
weekday, GAO posts newly released reports, testimony, and
correspondence on its Web site. To have GAO e-mail you a list of newly
posted products every afternoon, go to [hyperlink, http://www.gao.gov]
and select "E-mail Updates."
Order by Phone:
The price of each GAO publication reflects GAO's actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO's Web site,
[hyperlink, http://www.gao.gov/ordering.htm].
Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional
information.
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Congressional Relations:
Ralph Dawn, Managing Director, dawnr@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, D.C. 20548:
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548: