2000 Census
Best Practices and Lessons Learned for More Cost-Effective Nonresponse Follow-up
GAO ID: GAO-02-196, February 11, 2002
Nonresponse follow-up--in which Census Bureau enumerators go door-to-door to count individuals who have not mailed back their questionnaires--was the most costly and labor intensive of all 2000 Census operations. According to Bureau data, labor, mileage, and administrative costs totaled $1.4 billion, or 22 percent of the $6.5 billion allocated for the 2000 Census. Several practices were critical to the Bureau's timely completion of nonresponse follow-up. The Bureau (1) had an aggressive outreach and promotion campaign, simplified questionnaire, and other efforts to boost the mail response rate and thus reduce the Bureau's nonresponse follow-up workload; (2) used a flexible human capital strategy that enabled it to meet its national recruiting and hiring goals and position enumerators where they were most needed; (3) called on local census offices to identify local enumeration challenges, such as locked apartment buildings and gated communities, and to develop action plans to address them; and (4) applied ambitious interim "stretch" goals that encouraged local census offices to finish 80 percent of their nonresponse follow-up workload within the first four weeks and be completely finished by the end of the eighth week, as opposed to the ten-week time frame specified in the Bureau's master schedule. Although these initiatives were key to meeting tight time frames for nonresponse follow-up, the Bureau's experience in implementing them highlights challenges for the next census in 2010. First, maintaining the response rate is becoming increasingly expensive. Second, public participation in the census remains problematic. Third, the address lists used for nonresponse follow-up did not always contain the latest available information because the Bureau found it was infeasible to remove many late-responding households. Fourth, the Bureau's stretch goals appeared to produce mixed results. Finally, there are questions about how reinterview procedures aimed at detecting enumerator fraud and other quality problems were implemented.
This is the accessible text file for GAO report number GAO-02-196
entitled '2000 Census: Best Practices and Lessons Learned for More
Cost-Effective Nonresponse Follow-up' which was released on February
11, 2002.
This text file was formatted by the U.S. General Accounting Office
(GAO) to be accessible to users with visual impairments, as part of a
longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the
printed version. The portable document format (PDF) file is an exact
electronic replica of the printed version. We welcome your feedback.
Please E-mail your comments regarding the contents or accessibility
features of this document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
United States General Accounting Office:
GAO:
Report to Congressional Committees:
February 2002:
2000 Census:
Best Practices and Lessons Learned for More Cost-Effective Nonresponse
Follow-up:
GAO-02-196:
Contents:
Letter:
Results in Brief:
Background:
Scope and Methodology:
The Bureau Used an Aggressive Outreach and Promotion Campaign and
Other Strategies to Boost the Mail Response Rate but Public
Cooperation Remains Problematic:
Flexible Human Capital Strategies Helped the Bureau Meet Its
Recruitment Goals:
Local Census Offices Planned in Advance for Specific Enumeration
Challenges:
The Bureau's Stretch Goals to Complete Nonresponse Follow-up May Have
Produced Mixed Results:
Questions Surround Whether Certain Reinterview Procedures Were
Implemented as Intended:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendixes:
Appendix I: Local Census Offices Included in This Review:
Appendix II: Comments from the Secretary of Commerce:
Appendix III: GAO Contacts and Staff Acknowledgments:
Related GAO Products on the Results of the 2000 Census and Lessons
Learned for a More Cost-Effective Census in 2010:
Figures:
Figure 1: Local Census Offices Generally Completed Nonresponse Follow-
up Ahead of Schedule:
Figure 2: Nonresponse Follow-up Workload Completion Rates for the 1990
and 2000 Censuses:
Figure 3: Public Cooperation with the Census Has Steadily Declined:
Figure 4: 2000 Census Return Rates Declined in Most States Compared to
1990:
Figure 5: Local Managers' Perceptions of Recruiting and Hiring:
Figure 6: Local Managers' Perceptions of the Accuracy of Nonresponse
Follow-up Address Lists:
Figure 7: Local Managers' Perceptions of the Accuracy of Maps:
Figure 8: Local Managers' Views on the Impact of Scheduling Pressures
on the Quality of Nonresponse Follow-up:
Figure 9: Collection of Partial Interview and Closeout Data Remained
Relatively Constant Throughout Nonresponse Follow-up:
Figure 10: Percentage of Local Census Offices Collecting Less Complete
Data:
[End of section]
United States General Accounting Office:
Washington, D.C. 20548:
February 11, 2002:
The Honorable Joseph I. Lieberman:
Chairman:
The Honorable Fred Thompson:
Ranking Minority Member:
Committee on Governmental Affairs:
United States Senate:
The Honorable Dan Burton:
Chairman:
The Honorable Henry A. Waxman:
Ranking Minority Member:
Committee on Government Reform:
House of Representatives:
The Honorable Dave Weldon:
Chairman:
The Honorable Danny K. Davis:
Ranking Minority Member:
Subcommittee on Civil Service and Agency Organization:
Committee on Government Reform:
House of Representatives:
Nonresponse follow-up--where enumerators from the Bureau of the Census
went door-to-door to count those individuals who did not mail back
their questionnaires--was the most costly and labor intensive of all
2000 Census operations. According to bureau data, labor, mileage, and
certain administrative costs alone amounted to about $1.4 billion, or
about 22 percent of the total $6.5 billion allocated for the 2000
Census from fiscal year 1991 through fiscal year 2003. In terms of
employment, the bureau hired about half a million enumerators, which
temporarily made it one of the nation's largest employers, surpassed
by only a handful of big organizations like Wal-Mart and the U.S.
Postal Service. Moreover, the workload and schedule of nonresponse
follow-up--the need to collect data from about 42 million nonresponding
households within a 10-week time frame--made the conduct of this
operation extraordinarily difficult and complex.
In our prior work we noted that the success of nonresponse follow-up
would depend in large part on the bureau's ability to maintain data
quality while completing the operation on schedule, before error rates
increased as people moved or had trouble recalling who was living at
their homes on Census Day--April 1. Timeliness was also important for
keeping subsequent census operations on-track. In particular, this
included the Accuracy and Coverage Evaluation (A.C.E.), which was a
separate sample survey designed to assess the quality of the
population data collected in the 2000 Census. For methodological
reasons, the bureau needed to complete its field data collection
workload for nonresponse follow-up before A.C.E. field data collection
could begin.
To its credit, the bureau generally completed nonresponse follow-up
consistent with its operational plan. Nationwide, according to bureau
data, the 511 local census offices located in the 50 states generally
completed nonresponse follow-up in slightly less time than the
bureau's planned 10-week schedule. This was a noteworthy
accomplishment given the operational uncertainties the bureau faced,
and stands in sharp contrast to the bureau's 1990 experience when
nonresponse follow-up was hampered by unanticipated workload and
staffing problems and was completed 6 weeks behind schedule.
This report is the latest in our series of reviews that examine the
results of key census-taking operations and highlight opportunities
for reform (see the last page of this report for a list of products
issued to date). Our objectives were to identify (1) practices that
contributed to the timely completion of nonresponse follow-up and (2)
lessons learned in implementing these practices that the bureau may
want to consider as it plans for nonresponse follow-up during the next
census in 2010.
Several practices were critical to the bureau's timely completion of
nonresponse follow-up. The bureau:
* had an aggressive outreach and promotion campaign, simplified
questionnaire, and other efforts to boost the mail response rate and
thus reduce the bureau's nonresponse follow-up workload;
* used a flexible human capital strategy that enabled it to meet its
national recruiting and hiring goals and position enumerators where
they were most needed;
* called on local census offices to identify local enumeration
challenges, such as locked apartment buildings and gated communities,
and to develop action plans to address them; and
* applied ambitious interim "stretch" goals that encouraged local census
offices to finish 80 percent of their nonresponse follow-up workload
within the first 4 weeks and be completely finished by the end of the
8th week, as opposed to the 10-week time frame specified in the
bureau's master schedule.
Although these initiatives were key to meeting nonresponse follow-up's
tight time frames, the bureau's experience in implementing them
highlights several significant challenges that lie ahead for the next
census in 2010. First, maintaining the response rate is becoming
increasingly expensive. While the bureau achieved similar response
rates in 1990 and 2000 (65 percent in 1990 and 64 percent in 2000),
the bureau spent far more money on outreach and promotion in 2000:
about $3.19 per household in 2000 compared to $0.88 in 1990 (in
constant fiscal year 2000 dollars). Moreover, given a variety of
social, demographic, and attitudinal trends, such as changes in
household makeup and stability, concerns over privacy, and an
increasing non-English-speaking population, achieving comparable
results in 2010 will likely require an even larger investment of
bureau resources.
Second, public participation in the census remains problematic.
Indeed, the mail return rate--a more precise indicator of public
cooperation with the census than the mail response rate--declined from
74 percent in 1990 to 72 percent in 2000, according to preliminary
data.[Footnote 1] Also, there still appears to be a large gap between the
relatively large number of people who were aware of the 2000 Census
and those that actually responded. Bridging this gap has been a
longstanding difficulty for the bureau.
Third, the address lists used for nonresponse follow-up did not always
contain the latest available information, in part because the bureau
found it was infeasible to remove many late-responding households. As
a result, enumerators needed to visit over 773,000 households that had
already mailed back their questionnaires--an effort that, based on our
estimate, added nearly $22 million to the cost of nonresponse follow-up
and confused respondents. An additional challenge was that
some of the maps enumerators used to help them find addresses during
nonresponse follow-up contained inaccuracies.
Fourth, the bureau's stretch goals appeared to produce mixed results.
On the one hand, on the basis of our survey of local census office
managers, we estimate that about 41 percent of managers believed
scheduling pressures had little or no impact on the quality of the
nonresponse follow-up operation. Another 17 percent of managers
believed that such pressure had a positive or significantly positive
impact. On the other hand, about 40 percent of the local census office
managers believed that scheduling pressures during nonresponse follow-
up had a negative or significantly negative impact on the quality of
the operation. A common concern appeared to be that scheduling
pressures created a culture that emphasized quantity over quality.
One indicator of the quality of nonresponse follow-up is the
completeness of the data collected by enumerators. During nonresponse
follow-up, a small number of local census offices--in some highly
publicized incidents--improperly collected less complete data and took
other shortcuts (which the bureau took steps to rectify). Nationally,
however, our analysis of bureau data found that those offices that
completed their follow-up workloads faster than the others did not
collect larger quantities of less complete data, such as partial
interviews.
Finally, questions surround the extent to which certain reinterview
procedures aimed at detecting enumerator fraud and other quality
problems were implemented throughout the entire nonresponse follow-up
operation as intended. The decision to subject enumerators' work to
these procedures was at the discretion of local census personnel.
Fifty-two local census offices (about 10 percent of all local offices)
did not conduct any reinterviews after a random check of enumerators'
initial work. A senior bureau quality assurance official expressed
concerns about the adequacy of quality assurance coverage toward the
end of nonresponse follow-up at these offices because once random
reinterviews were completed at those offices, there were no additional
checks specifically designed to detect fabricated data.
In light of these challenges, as the bureau plans for the next
national head count in 2010, we recommend that the Secretary of
Commerce ensure that the bureau:
* develop and refine the lessons learned from the nonresponse follow-
up effort and apply them to the planning efforts for 2010;
* assess, to the extent practicable, why people who were aware of the
census did not participate, and develop appropriate marketing
strategies;
* develop and test options that could generate more current
nonresponse follow-up address lists and maps;
* ensure that the bureau's procedures and incentives for the timely
completion of nonresponse follow-up emphasize the collection of
quality data and proper enumeration techniques as much as speed; and
* ensure that the bureau's reinterview procedures, as implemented, are
sufficient for consistently and reliably detecting potential quality
problems throughout the full duration of enumerators' employment on
nonresponse follow-up.
The Secretary of Commerce forwarded written comments from the Bureau
of the Census on a draft of this report. The bureau concurred with all
five of our recommendations. The bureau also clarified several key
points and provided additional information and perspective, which we
incorporated in our report as appropriate.
Background:
In conducting nonresponse follow-up, the bureau has historically faced
the twin challenges of (1) collecting quality data (by obtaining
complete and accurate information directly from household members)
while (2) finishing the operation on schedule, before error rates can
increase as people move or have trouble recalling who was living at
their homes on Census Day (April 1), as well as keeping subsequent
operations on-track. Nonresponse follow-up was scheduled to begin on
April 27, 2000, and end 10 weeks later, on July 7, 2000.
Local census offices generally finished their nonresponse follow-up
workloads ahead of the bureau's 10-week schedule.[Footnote 2] As shown
in figure 1, of the bureau's 511 local offices in the 50 states, 463
(91 percent) finished nonresponse follow-up by the end of the eighth
week of the operation, consistent with the bureau's internal stretch
goals. Moreover, nine local offices completed their workloads in 5
weeks or less.
Figure 1: Local Census Offices Generally Completed Nonresponse Follow-
up Ahead of Schedule:
[Refer to PDF for image: vertical bar graph]
Week that nonresponse follow-up was finished: 4;
Number of local census offices: 2.
Week that nonresponse follow-up was finished: 5;
Number of local census offices: 7.
Week that nonresponse follow-up was finished: 6;
Number of local census offices: 59.
Week that nonresponse follow-up was finished: 7;
Number of local census offices: 222.
Week that nonresponse follow-up was finished: 8;
Number of local census offices: 173.
Week that nonresponse follow-up was finished: 9;
Number of local census offices: 48.
Source: GAO analysis of Census Bureau data.
[End of figure]
The timely completion of nonresponse follow-up in 2000 stands in sharp
contrast to the bureau's experience during the 1990 Census. As shown
in figure 2, at the end of the 6-week scheduled time frame for
nonresponse follow-up during the 1990 Census, the bureau had not
completed the operation. In fact, as of two days prior to the
scheduled end date, just two local census offices had completed the
operation and the bureau had only completed about 72 percent of its 34
million household follow-up workload. It took the bureau a total of 14
weeks to complete the entire operation. By comparison, as noted above,
the bureau completed nonresponse follow-up in less than 10 weeks
during the 2000 Census.
Figure 2 also highlights the drop-off in production that occurs during
the later weeks of nonresponse follow-up. According to the bureau, the
decline occurs because unresolved cases at the end of nonresponse
follow-up are typically the most difficult to reach, either because
they are uncooperative or are rarely at home and are unknown to
neighbors.
Figure 2: Nonresponse Follow-up Workload Completion Rates for the 1990
and 2000 Censuses:
[Refer to PDF for image: multiple line graph]
Depicted on the graph are the following:
2000 completion rate;
1990 completion rate;
As a percentage of nonresponse follow-up workload.
Also depicted are the following specific items:
Scheduled end of nonresponse follow-up, 1990 (week 6);
Scheduled end of nonresponse follow-up, 2000 (week 10).
Source: GAO analysis of Census Bureau data.
[End of figure]
Scope and Methodology:
To meet our objectives, we used a combination of approaches and
methods to examine the conduct of nonresponse follow-up. These
included statistical analyses; interviews with key bureau headquarters
officials, regional census center officials, and local census office
managers and staff; observations of local census offices' nonresponse
follow-up operations; and reviews of relevant documentation.
To examine the factors that contributed to the timely completion of
nonresponse follow-up, we interviewed local census office managers and
other supervisory staff at 60 local census offices we visited across
the country. These offices generally faced specific enumeration
challenges when nonresponse follow-up began in late April, and were
thus prone to operational problems that could affect data quality (see
appendix I for a complete list of the offices we visited).
Specifically, these offices had (1) a larger nonresponse follow-up
workload than initially planned; (2) multiple areas that were
relatively hard-to-enumerate, such as non-English-speaking groups; and
(3) difficulties meeting their enumerator recruiting goals.
During these visits, which took place in June and July 2000, we also
observed office operations to see how office staff were processing
questionnaires; at 12 of these offices we attended enumerator
training; and at 31 offices we reviewed key reinterview documents in a
given week during nonresponse follow-up. The local census offices we
visited represent a mix of urban, suburban, and rural locations.
However, because they were judgmentally selected, our findings from
these visits cannot be projected to the universe of local census
offices.
To obtain a broader perspective of the conduct of nonresponse follow-
up, we used the results of our survey of a stratified random sample of
managers at 250 local census offices. The survey--which asked these
managers about the implementation of a number of key field operations--
is generalizable to the 511 local census offices located in the 50
states.[Footnote 3] We obtained responses from managers at 236 local
census offices (about a 94 percent overall response rate). All
reported percentages are estimates based on the sample and are subject
to some sampling error as well as nonsampling error. In general,
percentage estimates in this report for the entire sample have
confidence intervals ranging from about ± 4 to ± 5 percentage points
at the 95 percent confidence level. In other words, if all managers
in our local census office population had been surveyed, the chances
are 95 out of 100 that the result obtained would not differ from our
sample estimate in the more extreme cases by more than ± 5 percentage
points.
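To illustrate how intervals of this size follow from the sample design,
the short Python sketch below computes the half-width of a 95 percent
confidence interval for a sampled proportion, applying a finite
population correction for sampling 236 of 511 offices. It is a minimal
sketch that assumes simple random sampling and the worst-case
proportion of 0.5; because the actual survey was stratified, the exact
figures differ somewhat.

import math

def ci_half_width(p, n=236, N=511, z=1.96):
    # Half-width of a 95 percent confidence interval for a proportion,
    # with a finite population correction because n offices are sampled
    # from a population of N without replacement.
    fpc = (N - n) / (N - 1)
    return z * math.sqrt(p * (1 - p) / n * fpc)

# The interval is widest at p = 0.5:
print(round(100 * ci_half_width(0.5), 1))  # about 4.7 percentage points

The result, roughly ± 4.7 percentage points, is consistent with the
± 4 to ± 5 point range reported above.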
To examine whether the pace of nonresponse follow-up was associated
with the collection of less complete data, in addition to the efforts
described above, we analyzed bureau data on the weekly progress of
nonresponse follow-up. Specific measures we analyzed included the time
it took local census offices to finish nonresponse follow-up and the
proportion of their cases completed by (1) "close-out" interviews,
where questionnaires only contain basic information on the status of
the housing unit (e.g., whether it was occupied), or (2) "partial"
interviews, which contain more information than a close-out interview
but are still less than complete. The completeness of the data
collected by enumerators is one measure of the quality of nonresponse
follow-up, and these two measures were the best indicators of
completeness available from the database. We included data from the
511 offices located in the 50 states and controlled for enumeration
difficulty using an index measure developed by the bureau.[Footnote 4]
We did not include any outliers that the bureau identified as
erroneous (for example, outliers resulting from coding errors).
[Footnote 5]
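The sketch below illustrates the general form of such an analysis in
Python. It is illustrative only: the file name, column names, and use
of ordinary least squares via the statsmodels package are assumptions
made for the sketch, not a description of the bureau's data files or
of our exact statistical procedures.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data set: one row per local census office, with
#   pct_partial - partial interviews as a share of finished workload
#   finish_week - week the office finished nonresponse follow-up
#   htc_index   - the bureau's enumeration-difficulty (hard-to-count) index
offices = pd.read_csv("nrfu_offices.csv")

# Regress data completeness on finishing week, controlling for
# enumeration difficulty. A statistically insignificant coefficient on
# finish_week corresponds to finding no association between the pace of
# the operation and the collection of less complete data.
model = smf.ols("pct_partial ~ finish_week + htc_index", data=offices).fit()
print(model.summary())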
We did our audit work at the local census offices identified in
appendix I and their respective regional census centers; bureau
headquarters in Suitland, Maryland; and Washington, DC, from March
2000 through September 2001. Our work was done in accordance with
generally accepted government auditing standards.
We requested comments on a draft of this report from the Secretary of
Commerce. On January 10, 2002, the Secretary forwarded the bureau's
written comments on the draft (see app. II) which we address at the
end of this report.
The Bureau Used an Aggressive Outreach and Promotion Campaign and
Other Strategies to Boost the Mail Response Rate but Public
Cooperation Remains Problematic:
Key to the bureau's timely completion of nonresponse follow-up in 2000
was a higher than expected initial mail response rate that decreased
the bureau's follow-up workload. A smaller workload reduced the staff,
time, and money the bureau needed to complete the census count;
moreover, the bureau's past experience and evaluations suggest that
data obtained from questionnaires returned by mail are of better
quality than data collected by enumerators.
To help raise the mail response rate, the bureau (1) hired a
consortium of private-sector advertising agencies, led by Young &
Rubicam, to develop a national, multimedia paid advertising program,
and (2) partnered with local governments, community groups,
businesses, nongovernmental organizations, and other entities to
promote the census on a grassroots basis (we discuss the bureau's
partnership program in more detail in our August 2001 report).
[Footnote 6] The outreach and promotion campaign encouraged people to
complete their census questionnaires by conveying the message that
census participation helped their communities. The bureau also helped
boost the mail response rate by using simplified questionnaires, which
was consistent with our past suggestions,[Footnote 7] and by
developing more ways to respond to the census, such as using the
Internet.
The bureau achieved an initial mail response rate of about 64 percent,
which was about 3 percentage points higher than the 61 percent
response rate the bureau expected when planning for nonresponse follow-
up.[Footnote 8] This, in turn, resulted in a nonresponse follow-up
workload of about 42 million housing units, which was about 4 million
fewer housing units than the bureau would have faced under its
planning assumption of a 61 percent mail response rate.
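The arithmetic linking the response rate to the follow-up workload can
be illustrated with a simplified model in which the workload is simply
the nonresponding share of a fixed mail-back universe. In the Python
sketch below, the universe size is implied by that simplification
rather than taken from bureau records, and the bureau's actual workload
accounting is more involved:

# Simplified model: workload = (1 - response_rate) * universe.
universe = 42e6 / (1 - 0.64)       # ~117 million addresses implied
planned = (1 - 0.61) * universe    # workload under the planning assumption
actual = (1 - 0.64) * universe     # ~42 million households
print(f"{(planned - actual) / 1e6:.1f} million fewer households")  # ~3.5

Under this simplification, each percentage point of mail response
removes roughly 1.2 million households from the workload, broadly
consistent with the reduction of about 4 million households noted
above.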
In addition to surpassing its national response rate goals, the bureau
exceeded its own expectations at the local level. Of the 511 local
census offices, 378 (74 percent) met or exceeded the bureau's expected
response rate. In so doing, these offices reduced their nonresponse
follow-up workloads from the expected levels by between 54 and 58,329
housing units. The remaining 133 offices (26 percent) did not meet
their expected response rate and the workload at these offices
increased from their expected levels by between 279 and 33,402 housing
units.
Securing Public Participation While Controlling Costs Remains a
Considerable Challenge for the 2010 Census:
The bureau's success in surpassing its response rate goals was
noteworthy given the formidable societal challenges it faced. These
challenges included attitudinal factors such as concerns over privacy,
and demographic trends such as more complex living arrangements.
However, as the bureau plans for the next census in 2010, it faces the
difficulty of boosting public participation while keeping costs
manageable.
As we noted in our December 2001 report, although the bureau achieved
similar response rates in 1990 and 2000 (65 percent in 1990 and 64
percent in 2000), the bureau spent far more money on outreach and
promotion in 2000: about $3.19 per household in 2000 compared to $0.88
in 1990 (in constant fiscal year 2000 dollars), an increase of 260
percent.[Footnote 9] Moreover, the societal challenges the bureau
encountered in 1990 and 2000 will probably be more complex in 2010,
and simply staying on par with the 2000 response rate will likely
require an even greater investment of bureau resources.
Further, while the mail response rate provides a direct indication of
the nonresponse workload, it is an imperfect measure of public
cooperation with the census as it is calculated as a percentage of all
forms in the mail-back universe from which the bureau received a
questionnaire. Because the mail-back universe includes housing units
that the bureau determines are nonexistent or vacant during
nonresponse follow-up, a more precise measure of public cooperation is
the mail return rate, which excludes vacant and nonexistent housing
units. According to preliminary bureau data, the mail return rate for
the 2000 Census was 72 percent, a decline of 2 percentage points from
the 74 percent mail return rate the bureau achieved in 1990. As shown
in figure 3, in 2000, the bureau reduced, but did not reverse, the
steady decline in public cooperation that has occurred with each
decennial census since the bureau first initiated a national
mailout/mail-back approach in 1970. Bureau officials said they would
further examine the reasons for the decline in the return rate as part
of the bureau's Census 2000 evaluations.
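The difference between the two rates comes down to the denominator, as
the short Python sketch below illustrates. The counts are hypothetical,
chosen only to make the arithmetic visible, and are not bureau figures:

# Hypothetical counts for illustration only.
mailback_universe = 100.0      # all addresses sent a questionnaire
vacant_or_nonexistent = 10.0   # units later found vacant or nonexistent
returned = 64.0                # questionnaires mailed back

response_rate = returned / mailback_universe                          # 64%
return_rate = returned / (mailback_universe - vacant_or_nonexistent)  # ~71%
print(f"response {response_rate:.0%}, return {return_rate:.0%}")

Because vacant and nonexistent units cannot respond, the return rate
isolates the behavior of occupied households, which is why it is the
sharper gauge of public cooperation.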
Figure 3: Public Cooperation with the Census Has Steadily Declined:
[Refer to PDF for image: vertical bar graph]
Decennial year: 1970;
Mail return rate: 87%.
Decennial year: 1980;
Mail return rate: 83%.
Decennial year: 1990;
Mail return rate: 74%.
Decennial year: 2000;
Mail return rate: 72%.
Source: GAO analysis of Census Bureau data.
[End of figure]
In addition, as shown in figure 4, the results to date show that just
three states increased their mail return rates compared to the 1990
Census. Overall, preliminary bureau data shows the change in mail
return rates from 1990 through 2000 ranged from an increase of about 1
percentage point in Massachusetts and California to a decline of about
9 percentage points in Kentucky.
Figure 4: 2000 Census Return Rates Declined in Most States Compared to
1990:
[Refer to PDF for image: U.S. map and associated data]
Alabama: 67.6% (-4.4%);
Alaska: 61% (-4%);
Arizona: 70% (-4%);
Arkansas: 69.9% (-7.1%);
California: 72.6% (+0.6%);
Colorado: 73.9% (-3.1%);
Connecticut: 73.4% (+0.4%);
Delaware: 71.1% (-4.9%);
Florida: 71.2% (-2.8%);
Georgia: 70.4% (-2.6%);
Hawaii: 67.5% (-2.5%);
Idaho: 72.5% (-4.5%);
Illinois: 72.8% (-4.2%);
Indiana: 74% (-7%);
Iowa: 78.9% (-5.1%);
Kansas: 74.8% (-6.2%);
Kentucky: 70.4% (-8.6%);
Louisiana: 66.8% (-4.2%);
Maine: 70% (-3%);
Maryland: 72.8% (-4.2%);
Massachusetts: 72.7% (+0.7%);
Michigan: 78.1% (-1.9%);
Minnesota: 78.4% (-5.6%);
Mississippi: 67.5% (-4.5%);
Missouri: 75.6% (-4.4%);
Montana: 74% (-1%);
Nebraska: 78.7% (-2.3%);
Nevada: 68.2% (-0.8%);
New Hampshire: 72.4% (-2.6%);
New Jersey: 72.7% (-2.3%);
New Mexico: 67.3% (-4.7%);
New York: 68.4% (-3.6%);
North Carolina: 68.1% (-4.9%);
North Dakota: 78.4% (-2.6%);
Ohio: 76.6% (-5.4%);
Oklahoma: 69.9% (-7.1%);
Oregon: 71.7% (-2.3%);
Pennsylvania: 75.8% (-5.2%);
Rhode Island: 71.3% (-0.7%);
South Carolina: 66.2% (-3.8%);
South Dakota: 78.9% (-2.1%);
Tennessee: 69.4% (-8.6%);
Texas: 67.7% (-6.3%);
Utah: 72.2% (-2.8%);
Vermont: 68.6% (-1.4%);
Virginia: 74.1% (-3.9%);
Washington: 69.8% (-5.2%);
West Virginia: 71.4% (-5.6%);
Wisconsin: 80% (-5%);
Wyoming: 73.2% (-0.8%).
Source: GAO analysis of preliminary Census Bureau data.
[End of figure]
The bureau's outreach and promotion efforts will also face the
historical hurdle of bridging the gap that exists between the public's
awareness of the census on the one hand, and its motivation to respond
on the other. Various polls conducted for the 2000 Census suggested
that the public's awareness of the census was over 90 percent; and
yet, as noted earlier, the actual return rate was much lower--72
percent of the nation's households. The bureau faced a similar issue
in 1990 when 93 percent of the public reported being aware of the
census, but the return rate was 74 percent. In our previous work, we
noted that closing this gap would be a significant challenge for the
bureau, and as the bureau plans for the 2010 Census, it will be
important for it to explore approaches that more effectively convert
the public's awareness of the census into a willingness to respond.
[Footnote 10]
Flexible Human Capital Strategies Helped the Bureau Meet Its
Recruitment Goals:
A second factor that was instrumental to the operational success of
nonresponse follow-up was an ample and sufficiently skilled enumerator
workforce. Based on anticipated turnover and the expected workload to
carry out its four largest field data collection operations”of which
nonresponse follow-up was the largest”the bureau set a recruitment
goal of 2.4 million qualified applicants.[Footnote 11] In addition to
the sheer volume of recruits needed, the bureau's efforts were
complicated by the fact that it was competing for employees in a
historically tight national labor market. Nevertheless, when
nonresponse follow-up began on April 27, the bureau had recruited over
2.5 million qualified applicants.
The bureau surmounted its human capital challenge with an aggressive
recruitment strategy that helped make the bureau a more attractive
employer to prospective candidates and ensured a steady stream of
applicants. Key ingredients of the bureau's recruitment efforts
included the following:
1. A geographic pay scale with wages set at 65 to 75 percent of local
prevailing wages (from about $8.25 to $18.50 per hour for
enumerators). The bureau also used its flexibility to raise pay rates
for those census offices that were encountering recruitment
difficulties.
For example, a manager at one of the Charlotte region's local census
offices told us that the office was having difficulty obtaining needed
staff in part because census wages were uncompetitive. According to
this manager, the region approved a pay increase for the office's
enumerators and office clerks, which helped the office obtain staff.
In all, by the time nonresponse follow-up began, the bureau had raised
pay rates for field staff at eight local offices to address those
offices' recruiting challenges.
2. Partnerships with state, local, and tribal governments, community
groups, and other organizations to help recruit employees and provide
free facilities to test applicants. For example, Clergy United, an
organization representing churches in the Detroit metropolitan area,
provided space for testing census job applicants in December 1998. The
organization even conducted pre-tests several days before each bureau-
administered test so those applicants could familiarize themselves
with the testing format.
3. A recruitment advertising campaign, totaling over $2.3 million,
that variously emphasized the ability to earn good pay, work
flexible hours, learn new skills, and do something important for one's
community. Moreover, the advertisements were in a variety of languages
to attract different ethnic groups, and were also targeted to
different races, senior citizens, retirees, and people seeking part-
time employment. The bureau advertised using traditional outlets such
as newspaper classified sections, as well as more novel media
including Internet banners and messages on utility and credit card
bills.
4. Exemptions from the majority of state governments so that
individuals receiving Temporary Assistance for Needy Families,
Medicaid, and selected other types of public assistance would not have
their benefits reduced when earning census income, thus making census
jobs more attractive. At the start of nonresponse follow-up, 44 states
and the Virgin Islands had granted an exemption for one or more of
these programs.
5. Continued recruiting throughout nonresponse follow-up, regardless
of whether local offices had met their recruiting goals, to ensure a
steady stream of available applicants.
The bureau matched these initiatives with an ongoing monitoring effort
that enabled bureau officials to rapidly respond to recruiting
difficulties. For example, during the last 2 weeks of April, the
bureau mailed over 5 million recruiting postcards to Boston,
Charlotte, and other locations where it found recruitment efforts were
lagging.
Based on the results of our local census office visits, it is clear
that the bureau's human capital strategy had positive outcomes. Of the
60 local census offices we visited, officials at 59 offices provided
useable responses to our question about whether their offices had the
type of staff they needed to conduct nonresponse follow-up, including
staff with particular language skills to enumerate in targeted
areas.[Footnote 12] Officials at 54 of the 59 offices said they had
the type of staff they needed to conduct nonresponse follow-up. For
example, officials in the Boston North office said they hired
enumerators who spoke Japanese, Vietnamese, Portuguese, Spanish,
French, Russian, and Chinese, while Pittsburgh office officials said
they had enumerators who knew sign language to communicate with deaf
residents.
Managers at local census offices we surveyed provided additional
perspective on recruiting needed field staff. As shown in figure 5, 30
percent of the respondents believed that the bureau's ability to
recruit and hire high-quality field staff needed no improvement.
While managers at 52 percent of the local offices commented that some
improvement to the recruiting and hiring process was needed and
another 17 percent commented that a significant amount of improvement
was needed, their suggestions varied. Managers' suggestions generally
related to various hiring practices, such as a greater use of face-to-
face interviews to select managers at local census offices and earlier
recruitment advertising.
Figure 5: Local Managers' Perceptions of Recruiting and Hiring:
[Refer to PDF for image: vertical bar graph]
Extent of improvement needed: No improvement needed;
Percentage of local census offices: 30%.
Extent of improvement needed: Some improvement needed;
Percentage of local census offices: 52%.
Extent of improvement needed: Significant improvement needed;
Percentage of local census offices: 17%.
Extent of improvement needed: No basis to judge;
Percentage of local census offices: 1%.
Source: GAO survey of local census office managers.
[End of figure]
Once nonresponse follow-up began, bureau officials tracked production
rates as the primary measure of whether local offices had met their
staffing goals. For example, bureau officials said that both bureau
headquarters and regional census center staff monitored local census
offices' production daily. If an office was not meeting its production
goals, bureau headquarters officials said they worked with regional
census personnel, who in turn worked with the local census office
manager, to determine the reasons for the shortfall and the actions
necessary to increase production. Possible actions included bringing
in enumerators from neighboring local census offices.
Overall, preliminary bureau data shows that about 500,000 enumerators
worked on nonresponse follow-up. Nationally, the bureau established a
hiring goal of 292,000 enumerator positions for nonresponse follow-up,
which represented two people working approximately 25 hours per week
for each position and assumed 100 percent turnover, according to
bureau officials. The bureau has not yet analyzed how many enumerators
charged at least 25 hours per week during nonresponse follow-up.
Moreover, according to a senior bureau official, the bureau has not
decided whether it will do such an analysis for 2010 planning
purposes. According to this official, because the bureau hired about
500,000 enumerators and completed the operation a week ahead of
schedule, the bureau believes it generally met its hiring goal.
Local Census Offices Planned in Advance for Specific Enumeration
Challenges:
A third factor that contributed to the timely completion of
nonresponse follow-up was preparing in advance for probable
enumeration challenges. To do this, the bureau called on local census
offices and their respective regional census centers to develop action
plans that, among other things, identified hard-to-enumerate areas
within their jurisdictions, such as immigrant neighborhoods, and
proposed strategies for dealing with those challenges. These strategies
included such methods as paired/team enumeration for high-crime areas,
and hiring bilingual enumerators. While this early planning effort
helped local census offices react to a variety of enumeration
challenges, the currency and accuracy of the nonresponse follow-up
address lists and maps remained problematic for a number of local
census offices.
Most Local Offices Used Action Plans to Address Enumeration Challenges:
Of the 60 local census offices we visited, officials at 55 offices
provided useable responses to our question about how, if at all, their
offices used their action plan for hard-to-enumerate areas during
nonresponse follow-up.[Footnote 13] Officials at 51 of 55 offices said
their offices used the strategies in their action plan to address the
enumeration challenges they faced.
At the offices we visited, a frequently cited enumeration challenge
was gaining access to gated communities or secure apartment buildings.
Officials at 42 of the 60 offices we visited identified this as a
problem. To address it, officials said they developed partnerships
with building management and community leaders, among other
strategies. In an Atlanta office, for example, local officials said
they sent letters to managers of gated communities that stressed the
importance of the census. Similarly, officials in a Chicago office
said they personally phoned managers of secure apartment buildings.
When enumerators from a Milwaukee local census office encountered
problems accessing locked apartment buildings, local census officials
told us that the City of Milwaukee sent aldermen to visit the building
managers and encourage them to participate in the census.
Another common enumeration challenge appeared to be obtaining
cooperation from residents--cited as a difficulty by officials at 34 of
the 60 offices we visited. One problem they noted was obtaining
responses to the long-form questionnaire--either in its entirety or to
specific items, such as income-related questions--which, according to
local census officials, some residents found to be intrusive.
Enumerators also encountered residents who were unwilling to
participate in the census because of language and cultural
differences, or their fears of government. The bureau's standardized
training for enumerators included procedures for handling refusals.
Local census officials encouraged public participation with a variety
of approaches as well. For example, census officials in Cleveland and
Cincinnati said they provided additional training for enumerators on
how to handle refusals and practiced what was taught in mock
interviews. Officials in other census offices said they partnered with
local community leaders who subsequently helped reach out to hard-to-
enumerate groups, hired people who were bilingual or otherwise trusted
and known by residents, and held media campaigns. Overall, according
to bureau data, close to 470,000 of the approximately 42 million
households making up the nonresponse follow-up workload (about 1
percent) refused to participate in the census.
The Accuracy and Currency of Nonresponse Follow-up Address Lists and
Maps Appeared to Be Problematic:
Of the 60 local census offices we visited, officials at 52 offices
provided useable responses to our question about whether their offices'
nonresponse follow-up address list reflected the most accurate and
current information.[Footnote 14] Officials at 21 of the 52 offices
said that their lists generally were not accurate and current.
Nationwide, as shown in figure 6, based on our survey of local census
office managers, we estimate that managers at approximately 50 percent
of local census offices believed that some improvement was needed in
the accuracy of address lists for nonresponse follow-up. We estimated
that managers at about 22 percent of local census offices believed
that a significant amount of improvement was needed.
Figure 6: Local Managers' Perceptions of the Accuracy of Nonresponse
Follow-up Address Lists:
[Refer to PDF for image: vertical bar graph]
Extent of improvement needed: No improvement needed;
Percentage of local census offices: 25%.
Extent of improvement needed: Some improvement needed;
Percentage of local census offices: 50%.
Extent of improvement needed: Significant improvement needed;
Percentage of local census offices: 22%.
Extent of improvement needed: No basis to judge;
Percentage of local census offices: 3%.
Source: GAO survey of local census office managers.
[End of figure]
Among the more frequent problems managers cited were duplicate
addresses and changes not being made from prior operations. For
example, at a local census office in the Seattle region, managers said
that some addresses were residences or businesses that had been gone
for 10-15 years and should have been deleted in previous census
operations but were not.
Local census officials we visited cited problems with the accuracy of
the census maps as well. Of the 60 local census offices we visited,
officials at 58 offices provided useable responses to our question
about whether the most accurate and current information was reflected
on the nonresponse follow-up maps.[Footnote 15] Officials at about a
third of local census offices--21 of 58 offices--said the nonresponse
follow-up maps did not reflect the most accurate and current
information.
Further, as shown in figure 7, based on our survey of local census
office managers, at about 41 percent of the offices, managers believed
that some improvement was needed in maps for nonresponse follow-up. At
about 23 percent of the offices, managers believed that a significant
amount of improvement was needed in these maps.
Figure 7: Local Managers' Perceptions of the Accuracy of Maps:
[Refer to PDF for image: vertical bar graph]
Extent of improvement needed: No improvement needed;
Percentage of local census offices: 34%.
Extent of improvement needed: Some improvement needed;
Percentage of local census offices: 41%.
Extent of improvement needed: Significant improvement needed;
Percentage of local census offices: 23%.
Extent of improvement needed: No basis to judge;
Percentage of local census offices: 2%.
Source: GAO survey of local census office managers.
[End of figure]
Managers who commented that improvements were needed to the
nonresponse follow-up maps said the maps were difficult to use, not
updated from prior operations, and contained errors. For example, an
official at a local census office in the Atlanta region said that some
roads on the map did not exist or were not oriented correctly on the
census maps. To address this difficulty, local office staff purchased
commercial maps or used the Internet to help them locate some housing
units.
The bureau developed its master address list and maps using a series
of operations that made incremental updates designed to continuously
improve the completeness and accuracy of the master address file and
maps. A number of these updates occurred during nonresponse follow-up
when enumerators encountered, for example, nonexistent or duplicate
housing units, or units that needed to be added to the address list.
As a result, the bureau was expecting some discrepancies between the
nonresponse follow-up address list and what enumerators found in the
field when they went door-to-door, which could account for some of the
local census officials' perceptions.
Another factor that affected the currency of the nonresponse follow-up
address list was the cut-off date for mail-back responses. The bureau
set April 11, 2000, as the deadline for mail-back responses for
purposes of generating the address list for nonresponse follow-up. In
a subsequent late mail return operation, the bureau updated its field
follow-up workload by removing those households for which
questionnaires were received from April 11 through April 18. However,
according to bureau officials, the bureau continued to receive
questionnaires, in part because of an unexpected boost from its
outreach and promotion campaign. For example, by April 30--less than 2
weeks after the bureau removed the late mail returns it had checked in
as of April 18--the bureau had received 773,784 additional
questionnaires. Bureau headquarters officials told us it was
infeasible to remove these late returns from the nonresponse follow-up
address lists, and thus enumerators needed to visit these households.
The cost of these visits approached $22 million, based on our earlier
estimate that a 1-percentage point increase in workload could add at
least $34 million in direct salary, benefits, and travel costs to the
price tag of nonresponse follow-up.[Footnote 16] In addition, the
bureau‘s data processing centers then had to reconcile the duplicate
questionnaires. According to officials at some local offices we
visited, the visits to households that had already responded confused
residents who questioned why enumerators came to collect information
from them after they had mailed back their census forms.
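Our $22 million figure can be roughly reconstructed in Python under a
simplified model in which one percentage point of workload corresponds
to about 1 percent of an implied mail-back universe of roughly 117
million addresses; that households-per-point value is our simplifying
assumption, not a bureau figure:

# Rough reconstruction of the estimated cost of visiting late responders.
households_per_point = 1.17e6   # assumed households per workload point
cost_per_point = 34e6           # GAO estimate: at least $34 million per point
late_returns = 773_784          # households visited despite having responded

cost = late_returns / households_per_point * cost_per_point
print(f"~${cost / 1e6:.0f} million")  # ~$22 million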
The Bureau's Stretch Goals to Complete Nonresponse Follow-up May Have
Produced Mixed Results:
To help ensure that local census offices completed nonresponse follow-up
on schedule, the bureau developed ambitious interim stretch goals.
These goals called on local census offices to finish 80 percent of
their nonresponse follow-up workload within the first 4 weeks of the
operation and be completely finished by the end of the eighth week.
Under the bureau's master schedule, local census offices had 10 weeks
to complete the operation.
Local Census Office Managers Cited Both Positive and Negative Effects
of the Nonresponse Follow-up Schedule on the Quality of the Operation:
Our survey of local census office managers asked what impact, if any,
scheduling pressures to complete nonresponse follow-up had on the
quality of the operation. On the one hand, as shown in figure 8, about
41 percent of the local census office managers believed that
scheduling pressures had little or no impact on the quality of the
operation, while about 17 percent believed that such pressure had a
positive or significantly positive impact. At a local census office in
the New York region, for example, the local census office manager
stated that "pressuring people a little gave them the motivation to
produce." Managers in local census offices located in the Dallas
region commented that the schedule "kept people on their toes and
caused them to put forth their best effort" and that it "had a
positive impact, particularly on quality."
On the other hand, managers at a substantial number of local census
offices had the opposite view. As shown in figure 8, about 40 percent
of the respondents believed that scheduling pressure during
nonresponse follow-up had a negative or significantly negative impact
on the quality of the operation.
Figure 8: Local Managers' Views on the Impact of Scheduling Pressures
on the Quality of Nonresponse Follow-up:
[Refer to PDF for image: vertical bar graph]
Type of impact: Positive to significantly positive;
Percentage of local census offices: 17%.
Type of impact: Little or no impact;
Percentage of local census offices: 41%.
Type of impact: Negative to significantly negative;
Percentage of local census offices: 40%.
Type of impact: No basis to judge;
Percentage of local census offices: 1%.
Note: Due to rounding, percentages do not equal 100 percent.
Source: GAO survey of local census office managers.
[End of figure]
Of those managers who believed that the pressure to complete
nonresponse follow-up adversely affected the quality of the operation,
a common perception appeared to be that production was emphasized more
than accuracy and that the schedule required local census offices to
curtail procedures that could have improved data quality. For example,
managers at some local census offices told us that the bureau's
regional census centers encouraged competition between local census
offices by, among other actions, ranking local census offices by their
progress and distributing the results to local managers. Managers at
some local census offices believed that such competition fostered a
culture where quantity was more important than quality. As one manager
told us, the bureau's ambitious nonresponse follow-up schedule led the
manager "to put enormous pressure on people in the field to complete
the operation quickly, and this affected the quality of data."
However, none of the managers we surveyed cited specific examples of
where corners were cut or quality was compromised.
The Pace of Nonresponse Follow-up Was Not Associated with the
Collection of Less Complete Data:
One measure of the quality of nonresponse follow-up is the
completeness of the data collected by enumerators. The bureau went to
great lengths to obtain complete data directly from household members.
Bureau procedures generally called for enumerators to make up to three
personal visits and three telephone calls to each household on
different days of the week at different times until they obtained
needed information on that household.
However, in cases where household members could not be contacted or
refused to answer all or part of the census questionnaire, enumerators
were permitted to obtain data via proxy (a neighbor, building manager,
or other nonhousehold member presumed to know about its residents), or
collect less complete data than called for by the census
questionnaire. Such data include (1) "closeout" interviews, where
questionnaires contain only basic information on the status of the
housing unit (e.g., whether or not it was occupied) and the number of
residents, and (2) "partial" interviews, which contain more information
than a closeout interview but less than a completed questionnaire.
There were several well-publicized breakdowns in these enumeration
procedures at a small number of local census offices that took
shortcuts to complete their work (which the bureau later took steps to
rectify). Nationally, however, our analysis of bureau data found no
statistically significant association between the week individual
local census offices finished their nonresponse follow-up workload and
the percentage of partial[Footnote 17] or closeout[Footnote 18]
interviews they reported, after controlling for the enumeration
difficulty level of each local office's area[Footnote 19] (at the time
of our review, the bureau did not have information on data collected
via proxy interviews).
Neither did we find a statistically significant relationship between
the week that local census offices finished their nonresponse follow-
up workload and the amount of residual workload[Footnote 20] they
had, if any. The residual workload consisted of households that were
part of the original follow-up workload but for which the bureau had
not received a questionnaire from the local census offices, and thus
had not been processed through data capture. According to bureau data,
519 local offices had to conduct residual nonresponse follow-up on
121,792 households.
Similarly, we did not find an association between week-to-week "spikes"
in local census offices' production and the percentage of either
partial or closeout interview data reported. Spikes or surges in
production could indicate that local census offices were cutting
corners to complete their workloads by a specific deadline.
Nationally, we found no relationship between the number of
questionnaires finished each week and either the percentage of those
finished that were closeout interviews[Footnote 21] or partial
interviews.[Footnote 22]
Overall, as shown in figure 9, as nonresponse follow-up progressed,
the proportion of closeout and partial interview data collected
relative to the number of questionnaires finished remained relatively
constant.
Figure 9: Collection of Partial Interview and Closeout Data Remained
Relatively Constant Throughout Nonresponse Follow-up:
[Refer to PDF for image: vertical bar graph]
Ending week that nonresponse follow-up completed: 2;
Cumulative nonresponse follow-up workload finished: 23%;
Cumulative partial interview data collected: 0%;
Cumulative closeout data collected: 0%.
Ending week that nonresponse follow-up completed: 3;
Cumulative nonresponse follow-up workload finished: 39%;
Cumulative partial interview data collected: 1%;
Cumulative closeout data collected: 0%.
Ending week that nonresponse follow-up completed: 4;
Cumulative nonresponse follow-up workload finished: 64%;
Cumulative partial interview data collected: 1%;
Cumulative closeout data collected: 2%.
Ending week that nonresponse follow-up completed: 5;
Cumulative nonresponse follow-up workload finished: 80%;
Cumulative partial interview data collected: 2%;
Cumulative closeout data collected: 2%.
Ending week that nonresponse follow-up completed: 6;
Cumulative nonresponse follow-up workload finished: 91%;
Cumulative partial interview data collected: 2%;
Cumulative closeout data collected: 2%.
Ending week that nonresponse follow-up completed: 7;
Cumulative nonresponse follow-up workload finished: 98%;
Cumulative partial interview data collected: 2%;
Cumulative closeout data collected: 2%.
Ending week that nonresponse follow-up completed: 8;
Cumulative nonresponse follow-up workload finished: 100%;
Cumulative partial interview data collected: 2%;
Cumulative closeout data collected: 2%.
Ending week that nonresponse follow-up completed: 9;
Cumulative nonresponse follow-up workload finished: 100%;
Cumulative partial interview data collected: 2%;
Cumulative closeout data collected: 2%.
Note: There were no bureau data available for weeks 1 and 10.
Comparable data for 1990 were not available for comparison to 2000
results. Percentage of workload finished is out of the total workload;
percentages of partial interviews and closeouts are out of the
workload completed.
Source: GAO analysis of Census Bureau data.
[End of figure]
Moreover, only a small percentage of most local census offices'
nonresponse follow-up workload was finished using closeout and partial
interviews. As shown in figure 10, of the 499 local offices for which
reliable closeout data were available,[Footnote 23] 413 (83 percent)
reported that less than 2 percent of their questionnaires were
finished in this manner, while 19 offices (4 percent) reported 5
percent or more of their finished nonresponse follow-up work as
closeout interviews. For partial interviews, of the 508 offices for
which reliable data were available, 267 (53 percent) reported
collecting less than 2 percent of such data, while 47 offices (9
percent) reported 5 percent or more of their finished work as partial
interviews. The median percentages of closeout and partial interviews
were 0.8 percent and 1.9 percent, respectively.
Figure 10: Percentage of Local Census Offices Collecting Less Complete
Data:
[Refer to PDF for image: vertical bar graph. The data are presented in
the table below.]

Percentage of nonresponse           Partial          Closeout
follow-up workload                  interviews       interviews
Less than 2 percent                 53 percent       83 percent
2 percent to less than 5 percent    38 percent       13 percent
5 percent or more                   9 percent        4 percent
Note: Comparable data for 1990 were not available for comparison to
2000 results.
Source: GAO analysis of Census Bureau data.
[End of figure]
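The binning behind figure 10 can be sketched as follows; the
office-level rates here are hypothetical placeholders, not bureau
data.

    # Sketch of the figure 10 binning: classify each office by the share
    # of its finished workload collected as partial (or closeout)
    # interviews, then report each bin's share of offices and the median.
    import statistics

    partial_rates = [0.5, 1.9, 2.3, 4.1, 0.8, 6.2, 1.1, 3.0]  # percent, per office (hypothetical)

    bins = {"less than 2 percent": 0,
            "2 to less than 5 percent": 0,
            "5 percent or more": 0}
    for rate in partial_rates:
        if rate < 2:
            bins["less than 2 percent"] += 1
        elif rate < 5:
            bins["2 to less than 5 percent"] += 1
        else:
            bins["5 percent or more"] += 1

    for label, count in bins.items():
        print(f"{label}: {100 * count / len(partial_rates):.0f} percent of offices")
    print(f"median rate: {statistics.median(partial_rates):.1f} percent")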
At those local census offices that had substantially higher levels of
closeout and partial interview data than other offices, the bureau
said that some of this was understandable given the enumeration
challenges these offices faced. For example, according to the bureau,
the relatively high partial interview rate at a New York local office
(3.8 percent of that office's finished nonresponse follow-up workload)
was in line with the regional average of 2.2 percent, partly due to
the difficulty that staff had in gaining access to apartment
buildings. Once building managers gave enumerators access and they
were able to obtain information from proxies, the number of refusals
may have decreased, but the number of partial interviews increased
because the proxies could not provide complete information.
Still, as noted above, some local census offices inappropriately used
certain enumeration techniques. For example, the Hialeah, Florida,
office reported finishing its nonresponse follow-up workload in 5
weeks, well ahead of the 8-week stretch goal and the 10 weeks allotted
for the operation. The Homestead, Florida, office, where Hialeah-
trained enumerators were later transferred to help complete
nonresponse follow-up, reported finishing its workload in 7 weeks. The
Commerce Department's Office of Inspector General later found that
Hialeah-trained enumerators did not make the required number of visits
and telephone calls before contacting a proxy for information, and did
not properly implement quality control procedures designed to detect
data falsification.[Footnote 24] The bureau responded to these
findings by, among other actions, reworking over 64,000 questionnaires
from the Hialeah and Homestead offices.
Questions Surround Whether Certain Reinterview Procedures Were
Implemented as Intended:
To help ensure that enumerators followed proper enumeration procedures
and were not falsifying data, the bureau "reinterviewed" households
under certain circumstances to check enumerators' work. As such,
reinterviews were a critical component of the bureau's quality
assurance program for nonresponse follow-up. If falsification was
detected during a reinterview, the local office was to terminate the
enumerator and redo all of the enumerator's work. Enumerators making
inadvertent errors were to correct their mistakes and be retrained.
The bureau conducted three types of reinterviews:
1. Random reinterviews were to be performed on a sample of enumerators'
work during the early weeks of their employment. Seven randomly
selected questionnaires from each enumerator's first 70 cases were to
have been reinterviewed (the selection rule is sketched after this
list).
2. Administrative reinterviews checked the work of enumerators whose
performance in certain dimensions (e.g., the number of partial
interviews conducted) differed significantly, without justification,
from that of other enumerators employed in the same area. In such
cases, enumerators could be fabricating data. According to the bureau,
administrative tests were designed to identify enumerators who were
making errors that were more likely to occur toward the end of the
operation, after the random check of enumerators' initial work. They
were conducted at the discretion of local census officials.
3. Supplemental reinterviews were to be conducted at the discretion of
local census officials when they had some basis for concern about the
quality of an enumerator's work.
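A minimal sketch of the random selection rule referenced in item 1
follows; the case identifiers are hypothetical.

    # Sketch of the random reinterview selection rule: seven
    # questionnaires drawn at random from an enumerator's first 70 cases.
    import random

    first_70_cases = [f"case-{i:03d}" for i in range(1, 71)]  # hypothetical identifiers
    reinterview_sample = random.sample(first_70_cases, k=7)
    print(sorted(reinterview_sample))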
On the basis of our work and that of the bureau, we found that local
census office officials often used their discretion to forgo
administrative and supplemental reinterviews; as a result, a number of
local offices conducted none. At those offices, once the random check
of enumerators' initial work was completed, there were no additional
checks specifically designed to catch enumerators suspected of
falsifying data. This raises questions about the reinterview program's
ability to ensure the quality of enumerators' work over the full
duration of their employment on nonresponse follow-up.
Local Managers Often Decided Against Conducting Administrative
Reinterviews:
Of the 520 local census offices, 52 offices (10 percent) conducted no
administrative and no supplemental reinterviews, according to bureau
data.[Footnote 25] An additional 14 offices (3 percent) conducted no
administrative reinterviews, and an additional 231 offices (44
percent) conducted no supplemental reinterviews.
A chief in the bureau's Quality Assurance Office expressed concern
about the adequacy of quality assurance coverage toward the end of
nonresponse follow-up for offices that did not conduct administrative
and supplemental reinterviews. According to this official, this meant
that once random reinterviews were completed at those offices, there
were no additional checks specifically designed to detect fabricated
data. Although enumerators' immediate supervisors were to check
enumerators' work daily, these reviews were generally designed to
identify enumerators who were completing questionnaires incorrectly
(e.g., not following the proper question sequence or writing
illegibly), whereas administrative and supplemental reinterviews were
aimed at identifying enumerators who were intentionally falsifying
data.
Bureau officials said that at those local census offices that did not
conduct any administrative reinterviews, local census office managers
could conduct supplemental reinterviews if warranted. However,
managers employed this option infrequently. Of the 66 local offices
that did not conduct any administrative reinterviews, just 14
conducted supplemental reinterviews.
Reasons that local census managers could use, as specified by the
bureau, for not conducting an administrative reinterview included (1)
the enumerator no longer worked in the area for which the
administrative test was conducted; (2) the enumerator's work was
characteristic of the area (e.g., the enumerator reported a large
number of vacant housing units and the area had a large number of
seasonal housing units); or (3) some other reason, with an
accompanying explanation. Managers were to document their decision on
the bureau's administrative reinterview trouble reports listing the
suspect enumerators.
Our analysis of a week's worth of administrative reinterview trouble
reports at 31 local census offices found that while a number of
enumerators were flagged for administrative reinterviews, local census
office officials typically decided against conducting them.
Specifically, of the 3,784 enumerators identified for possible
reinterview, local officials subjected the work of 154 enumerators (4
percent) to reinterviews and passed on 3,392 enumerators (90 percent).
For 306 of the 3,784 enumerators (8 percent) listed on the
administrative trouble reports we reviewed, there was no indication of
a final decision on whether to subject their future work to
administrative reinterview.
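A minimal sketch of this kind of disposition tally follows, assuming a
hypothetical set of trouble-report records rather than the actual
reports we reviewed.

    # Each trouble-report entry carries a disposition: "reinterviewed",
    # "passed", or None when no final decision was recorded. The sample
    # below is hypothetical; our actual counts were 154, 3,392, and 306
    # of 3,784 flagged enumerators.
    from collections import Counter

    dispositions = ["reinterviewed", "passed", "passed", None, "passed",
                    "reinterviewed", "passed", None, "passed", "passed"]

    tally = Counter("no decision recorded" if d is None else d
                    for d in dispositions)
    for outcome, count in tally.items():
        print(f"{outcome}: {count} ({100 * count / len(dispositions):.0f} percent)")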
Overall, local census offices conducted far fewer administrative
reinterviews than the bureau had anticipated. Local census offices
conducted 276,832 administrative reinterviews, 146,993 (35 percent)
fewer than the 423,825 administrative reinterviews the bureau had
expected based on a number of factors, including the number of cases
completed per hour during the 1990 Census and the estimated workload
in 2000. Whether this was due to better quality work on the part of
enumerators or to local managers deciding against subjecting
enumerators' work to reinterviews is unknown. However, as
administrative reinterviews were designed to detect fabrication and
other quality problems more likely to occur toward the end of
nonresponse follow-up, after the random check of enumerators' initial
work, it will be important for the bureau to examine whether local
census offices properly conducted administrative reinterviews and thus
ensured the quality of nonresponse follow-up data throughout the
duration of the operation.
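The 35 percent shortfall follows directly from the figures cited
above; this check simply reproduces the arithmetic.

    # Reproducing the reinterview shortfall arithmetic.
    expected = 423_825    # administrative reinterviews the bureau expected
    conducted = 276_832   # administrative reinterviews actually conducted

    shortfall = expected - conducted  # 146,993
    print(f"{shortfall:,} fewer ({100 * shortfall / expected:.0f} percent)")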
Conclusions:
Although nonresponse follow-up was fraught with extraordinary
managerial and logistical challenges, the bureau generally completed
nonresponse follow-up consistent with its operational plan, a
remarkable accomplishment given the scope and complexity of the
effort. Our review highlighted several strategies that were key to the
bureau's success, including (1) an aggressive outreach and promotion
campaign and other efforts aimed at boosting the mail response rate
and lowering the bureau's nonresponse follow-up workload; (2) a
flexible recruiting strategy that made the bureau a competitive
employer in a tight national labor market; (3) advance planning for
addressing location-specific enumeration challenges; and (4) ambitious
stretch goals that encouraged local managers to accelerate the pace of
the operation. It will be important for the bureau to document the
lessons learned from these initiatives and use them to help inform
planning efforts for the next decennial census in 2010.
It will also be important for the bureau to address the continuing
significant challenges that were revealed by the conduct of
nonresponse follow-up in 2000, including:
* achieving an acceptable response rate (and thus lowering the
bureau's follow-up workload) while controlling costs;
* reversing the downward trend in public participation in the census,
in part by converting the relatively large number of people who are
aware of the census into census respondents;
* keeping the address list and maps used for nonresponse follow-up
accurate and up-to-date;
* finding the right mix of incentives to motivate local census offices
to complete nonresponse follow-up on schedule without compromising
data quality; and
* ensuring that reinterview procedures provide sufficient quality
assurance coverage through the full duration of enumerators'
employment on nonresponse follow-up.
Recommendations for Executive Action:
As the bureau plans for the next national head count in 2010, we
recommend that the Secretary of Commerce ensure that the bureau take
the following actions to help ensure that nonresponse follow-up is
conducted as cost effectively as possible:
* Identify and refine lessons learned from the 2000 nonresponse follow-
up operation and apply them to the bureau's plans for the 2010 Census.
* Assess, to the extent practicable, why people who were aware of the
census did not return their census questionnaires, and develop
appropriate marketing countermeasures to bridge the gap between their
awareness of the census on the one hand and their motivation to
respond on the other.
* Develop and test procedural and technological options that have the
potential to generate a more accurate and up-to-date address list and
set of maps for nonresponse follow-up. As part of this effort, the
bureau should explore how to refresh the nonresponse follow-up address
list more frequently, even as nonresponse follow-up is underway, so
that enumerators would not have to make costly visits to late-
responding households. The bureau also needs to examine the methods it
uses in activities that precede nonresponse follow-up to develop and
update the nonresponse address list and associated maps. Specifically,
the bureau should determine the extent to which updates that should
have been made were properly reflected in the nonresponse follow-up
list and maps, and take appropriate corrective actions to address any
problems it identifies.
* Ensure that the bureau‘s procedures and incentives for the timely
completion of nonresponse follow-up emphasize the collection of
quality data and proper enumeration techniques as much as speed.
* Examine the bureau's reinterview procedures, particularly as they
relate to the discretion given to local census officials, to help
ensure that the procedures are sufficient for consistently and
reliably detecting potential problems throughout the duration of
enumerators' employment on nonresponse follow-up.
Agency Comments and Our Evaluation:
The Secretary of Commerce forwarded written comments from the Bureau
of the Census on a draft of this report. The bureau concurred with all
five of our recommendations and had no specific comments on them. The
bureau also clarified several key points and provided additional
information and perspective, which we incorporated in our report as
appropriate.
The bureau noted that, in addition to the locked apartment buildings
that we cited in the Results in Brief section of our report, gated
communities were also an enumeration challenge. While the body of the
report already contained this information, we added it to the Results
in Brief section as well.
Our draft report stated: "One reason for the errors in the nonresponse
follow-up address lists was that the bureau found it was infeasible to
remove late-responding households. As a result, enumerators needed to
visit over 773,000 households that had already mailed back their
questionnaires...." The bureau commented that it made a conscious
decision to conduct these visits based on logistical concerns and, as
a result, believes that our use of the terms "errors" and "needlessly"
does not take this into consideration and is misleading. Because the
bureau could not refresh its nonresponse follow-up address list to
reflect households that responded after April 18, it had no choice but
to send enumerators to those households and collect the information in
person. We agreed that the term "needed to" better characterizes the
bureau's lack of options and revised the text accordingly. We also
deleted the term "errors."
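For context, the figures cited in this report imply an average cost
per late-return visit; the back-of-the-envelope check below uses only
those figures, and the result is an approximation, since the $22
million is itself an estimate.

    # Implied average cost per visit to a late-responding household.
    late_returns = 773_784    # questionnaires received after the April 18 list was printed
    added_cost = 22_000_000   # approximate additional follow-up cost, in dollars

    print(f"about ${added_cost / late_returns:.2f} per household visited")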
In response to our finding that 52 local census offices did not
conduct any reinterviews after an initial random check of enumerators'
work, the bureau commented that the initial random check was not a
minimal activity in that it involved reinterviewing up to seven cases
per enumerator. The bureau also noted that there were no operational
requirements to conduct a specific number of administrative or
supplemental reinterviews. We agree with the bureau's comments.
Indeed, the draft report already included information on the number of
initial random reinterviews the bureau conducted and the discretionary
nature of administrative and supplemental reinterviews. Nevertheless,
it is also true, as we note in our report, that once those 52 local
census offices completed the seven random reinterviews, there were no
additional checks specifically designed to catch enumerators suspected
of falsifying data. Moreover, we reported that nationwide, local
census offices conducted far fewer administrative reinterviews than
the bureau had expected. As we note in the report, whether this was
due to the quality of enumerators' work or to local managers using
their discretion and opting not to subject enumerators' work to
reinterviews is unknown.
With respect to the bureau's monitoring of local census offices'
productivity, the bureau noted that headquarters officials did not
work directly with local census office staff as stated in the draft;
rather, headquarters personnel worked with the bureau's regional
census centers, which in turn worked with the local offices. We
revised the text to reflect this information.
With respect to our observation that several local census offices had
to quickly respond to unanticipated challenges, such as working with
nonresponse follow-up address lists and maps that were not accurate or
current, the bureau commented that there were standard procedures in
the nonresponse follow-up enumerator manual on how to deal with
map/register discrepancies. We verified this and revised the text
accordingly.
In describing the steps that local census officials took to encourage
public participation in the census, we noted that census officials in
Cleveland and Cincinnati said they provided additional training for
enumerators on how to handle refusals. The bureau noted that
standardized training was provided, across the nation, on options for
handling refusals, and information was also provided in the
nonresponse follow-up enumerator manual. We verified this information
and added it to the report.
The bureau commented that the address list and map difficulties that
enumerators encountered were not nonresponse problems because, as we
note in the report, and the bureau agrees, they should have been dealt
with in earlier census operations. Nevertheless, the problems did not
surface until nonresponse follow-up when enumerators encountered
duplicate and nonexistent addresses, and were less productive as a
result. For this reason, the report recommends that the bureau examine
the methods it uses in activities that precede nonresponse follow-up
to ensure the address lists and maps used for nonresponse follow-up
are accurate and up-to-date.
In response to our statement that nonresponse follow-up was to help
verify changes to the address list from earlier address list
development operations, the bureau commented that nonresponse follow-
up was conducted to enumerate households from which it did not receive
a completed questionnaire; map and address updates were incidental. We
agree with the bureau on the primary purpose of nonresponse follow-up
and revised the text to better reflect this point. However, the
bureau's program master plan for the master address file includes
nonresponse
follow-up as one of a number of address list development and
maintenance operations, and the bureau expected enumerators to update
maps and address registers as needed as part of their field visits.
The bureau said it could not confirm data in our draft report on the
number of vacant and deleted units identified during nonresponse
follow-up and suggested removing this information. Although we
obtained the data directly from the bureau, given the bureau's
concerns, we deleted the section.
In commenting on the fact that we did not find a statistically
significant relationship between the week that local census offices
finished their follow-up workload and the amount of their residual
workload, the bureau stated that the report needed to reflect the fact
that residual nonresponse consisted of housing units for which
completed questionnaires had not been processed through data capture.
We revised the draft accordingly.
The bureau noted that assistant managers for field operations, among
other local census officials, could request supplemental reinterviews,
and not just field operations supervisors as we stated in our report.
We revised our draft to include this information.
With respect to our findings concerning the reinterview program's
ability to detect problems, particularly at the end of nonresponse
follow-up, the bureau commented that there was turnover in the
enumerator workforce; consequently, with new hires, random
reinterviews were conducted during all stages of the operation. As we
note in the report, 52 local census offices (about 10 percent of all
local offices) did not conduct any administrative and supplemental
reinterviews. Thus, once these offices completed the random
reinterviews on the initial work of newly hired enumerators, there
were no additional checks specifically designed to catch enumerators
suspected of falsifying data. We added language to better clarify this
point.
The bureau said that it was uncertain about the methodology and
documentation we used to derive figures on the number of reinterviews
the bureau conducted. We obtained the data from the bureau's cost and
progress system.
The bureau stated that there was no evidence that data quality was
compromised to motivate on-time completion of nonresponse follow-up.
Our research suggests that the impact of the bureau's incentives to
motivate timeliness was less clear-cut given that, as we note in our
report, (1) about 40 percent of the local census office managers
believed that scheduling pressures had a negative or significantly
negative impact on the quality of nonresponse follow-up, and (2) a
small number of local census offices took shortcuts to complete their
work (which the bureau later took steps to rectify). Thus, while we
agree with the bureau that maintaining data quality should be a given
in determining motivational elements, the extent to which the bureau
accomplished this goal for nonresponse follow-up appeared mixed.
In commenting on our conclusion that it will be important for the
bureau to ensure that reinterview procedures provide sufficient
quality assurance through the full duration of nonresponse follow-up,
the bureau noted that the reinterview operation must be designed to
provide sufficient quality assurance coverage. We revised the text
accordingly.
We are sending copies of this report to the Honorable Dan Miller and
the Honorable Carolyn B. Maloney, House of Representatives, and to
other interested congressional committees; the Secretary of Commerce;
and the Acting Director of the Bureau of the Census. Copies will be
made available to others on request. Major contributors to this report
are included in appendix III. If you have any questions concerning
this report, please call me on (202) 512-6806.
Signed by:
Patricia A. Dalton:
Director:
Strategic Issues:
[End of section]
Appendix I: Local Census Offices Included in This Review:
Local Census Offices in the Census Bureau's Atlanta Region:
Atlanta East;
Bradenton;
Fort Myers.
Local Census Offices in the Census Bureau's Boston Region:
Boston North;
Burlington;
Hartford;
Providence.
Local Census Offices in the Census Bureau's Charlotte Region:
Ashland-Hanover;
Beaufort;
Conway;
Greenville, North Carolina, East;
Greenville, North Carolina, West;
Wilmington.
Local Census Offices in the Census Bureau's Chicago Region:
Chicago Central;
Chicago Far North;
Chicago Near North;
Chicago Near South;
Chicago Near Southwest;
Chicago West;
Indianapolis;
Midland;
Milwaukee;
Superior.
Local Census Offices in the Census Bureau's Dallas Region:
Corpus Christi;
Dallas Central;
Greenville, Mississippi;
Harris County, Northeast;
Laredo;
McAllen;
New Orleans Central;
Orleans Parish.
Local Census Offices in the Census Bureau's Denver Region:
Flagstaff;
Las Cruces;
Las Vegas;
Phoenix South;
Santa Fe;
Yuma.
Local Census Offices in the Census Bureau's Detroit Region:
Cincinnati;
Cleveland;
Marquette.
Local Census Offices in the Census Bureau's Kansas City Region:
Kansas City;
Moorhead;
St. Louis City.
Local Census Offices in the Census Bureau's Philadelphia Region:
Baltimore West;
Philadelphia North;
Philadelphia South;
Pittsburgh.
Local Census Offices in the Census Bureau's Los Angeles Region:
Hollywood/Mid-Wilshire;
Los Angeles Downtown;
Santa Monica.
Local Census Offices in the Census Bureau's New York Region:
Bronx Northeast;
Brooklyn Central;
Brooklyn East;
Brooklyn Northeast;
New York East;
New York North;
New York Northeast.
Local Census Offices in the Census Bureau's Seattle Region:
Portland;
San Francisco Northeast;
San Francisco Southeast.
[End of section]
Appendix II: Comments from the Secretary of Commerce:
The Secretary Of Commerce:
Washington, D.C. 20230:
January 10, 2002:
Mr. J. Christopher Mihm:
Director, Strategic Issues:
General Accounting Office:
Washington, DC 20548:
Dear Mr. Mihm:
The Department of Commerce appreciates the opportunity to comment on
the General Accounting Office draft report entitled "2000 Census: Best
Practices and Lessons Learned for More Cost-Effective Nonresponse
Follow-up." The Department's comments on this report are enclosed.
Warm regards:
Signed by:
Donald L. Evans:
Enclosure:
[End of letter]
Comments from the U.S. Department of Commerce:
U.S. Census Bureau:
U.S. General Accounting Office draft report entitled 2000 Census: Best
Practices and Lessons Learned for More Cost-Effective Nonresponse
Follow-up:
Comments on the Text of the Report:
1. Section: Page 3, Bullet 3 - [The Bureau] "called on local census
offices to identify local enumeration challenges, such as locked
apartment buildings, and to develop action plans to address them...."
Comment: Gated communities also were identified as an enumeration
challenge.
2. Section: Page 4, Paragraph 3, continued on Page 5 - "Third, the
address lists used for nonresponse follow-up did not always contain
the latest available information, and associated maps used by census
enumerators during nonresponse follow-up contained inaccuracies. One
reason for the errors in the nonresponse follow-up address lists was
that the Bureau found it was infeasible to remove late-responding
households. As a result, enumerators needlessly visited over 773,000
households that had already mailed back their questionnaires, an effort
that approached $22 million in additional costs for nonresponse follow-
up, based on our estimate, and confused respondents."
Comment: The determination that it was infeasible to remove late-
responding households was a conscious decision based on logistical
concerns. Use of the terms "errors" and "needlessly" do not take this
into consideration and are misleading.
3. Section: Page 6, Paragraph 1 - "Finally, questions surround the
extent to which certain reinterview procedures were implemented
throughout the entire nonresponse follow-up operation as intended, as
local census office managers often exercised their discretion and
opted against conducting this key quality assurance procedure aimed at
detecting enumerator fraud. For example, 52 local census offices
(about 10 percent of all local offices) did not conduct any
reinterviews after an initial random check of enumerators' work. A
senior Bureau quality assurance official expressed concerns about the
adequacy of quality assurance coverage toward the end of nonresponse
follow-up at these offices."
Comment: The initial random check was not a minimal activity. The
check involved reinterview of up to seven cases per enumerator. There
were no operational requirements to conduct a specific number of
administrative or supplemental reinterviews.
4. Section: Page 22, Paragraph 1 - "Once nonresponse follow-up began,
Bureau officials tracked production rates as the primary measure of
whether local offices had met their staffing goals. For example,
Bureau officials said that both Bureau headquarters and regional
census center staff monitored local census offices' production daily.
If an office was not meeting its production goals, Bureau headquarters
officials worked with managers of that office to determine the reasons
for the shortfall and the actions necessary to increase production."
Comment: Census Bureau headquarters and regional census center (RCC)
staff did monitor local census offices (LCOs); however, headquarters'
officials worked with the RCCs, who worked with the LCOs. Headquarters
personnel did not work directly with LCO staff.
5. Section: Page 23, Paragraph 2 - "A third factor that contributed to
the timely completion of nonresponse follow-up was preparing in
advance for probable enumeration challenges. To do this, the Bureau
called on local census offices and their respective regional census
centers to develop action plans that, among other things, identified
hard-to-enumerate areas within their jurisdictions, such as immigrant
neighborhoods, and propose strategies for dealing with those
challenges. These strategies included such methods as paired/team
enumeration for high-crime areas, and hiring bilingual enumerators.
While this early planning effort helped local census offices react to
anticipated enumeration challenges, several local census offices also
had to quickly respond to unanticipated challenges, such as working
with nonresponse follow-up address lists and maps that were not
accurate or current."
Comment: Most LCOs had to quickly meet unanticipated challenges;
however, there were standard procedures in the nonresponse follow-up
(NRFU) enumerator manual on how to deal with map/register
discrepancies.
6. Section: Page 25, Paragraph 1 - "Local census officials encouraged
public participation with a variety of approaches. For example, census
officials in Cleveland and Cincinnati said they provided additional
training for enumerators on how to handle refusals and practiced what
was taught in mock interviews. Officials in other census offices said
they partnered with local community leaders who subsequently helped
reach out to hard-to-enumerate groups, hired people who were bilingual
or otherwise trusted and known by residents, and held media campaigns.
Overall, according to Bureau data, close to 470,000 households of the
approximately 42 million making up the nonresponse follow-up workload
(about 1 percent), refused to participate in the census."
Comment: Standardized training was provided, across the Nation, on
options for handling refusals. This information also was provided in
written form in the NRFU enumerator manual.
7. Section: Page 26, Paragraph 2, continued on Page 27 - "Among the
more frequent problems managers cited were duplicate addresses and
changes not being made from prior operations. For example, at a local
census office in the Seattle region, managers said that some addresses
were residences or businesses that had been gone for 10-15 years and
should have been deleted in previous census operations but were not."
Comment: These address list and map "problems" are not really NRFU
problems. The end of the paragraph indicates clearly that some
addresses "...should have been deleted in previous census operations."
8. Section: Page 28, Paragraph 2, continued on Page 29 - "The Bureau
developed its master address list and maps using a series of
operations throughout the decade, each designed to add incremental
improvements. Nonresponse follow-up was to help verify changes to the
address list from some of these earlier operations. As a result, the
Bureau was expecting some discrepancies between the nonresponse follow-
up address list and what enumerators found in the field when they went
door-to-door, which could account for some of the local census
officials' perceptions. Of the approximately 119 million
questionnaires delivered, 3.1 million were to units subsequently found
during nonresponse follow-up to be vacant and 1.9 million were deleted
(e.g., because they were found to be nonexistent units), according to
Bureau data."
Comment: The NRFU was conducted to enumerate households from which we
had not received a completed questionnaire. It was not conducted to
"...help verify changes to the address list...."; map and address
updates were incidental and "problems" were remnants of earlier
operations as indicated in the previous item. Furthermore, we cannot
confirm the numbers cited for vacant and deleted units identified
during NRFU in 2000. However, we can confirm that these numbers are
too low, based on the fact that about 8 million vacant and deleted
units were identified as such for the first time during NRFU. We
recommend that this paragraph be deleted, given the concerns noted.
9. Section: Page 29, Paragraphs 2 and 3 - "Another factor that
affected the currency of the nonresponse follow-up address list was
the cut-off date for mail-back responses. The Bureau set April 11,
2000, as the deadline for mail-back responses for purposes of
generating the address list for nonresponse follow-up. However,
according to Bureau officials, the Bureau got an unexpected boost from
its outreach and promotion campaign, which stressed the importance of
cooperating with census enumerators. As a result, by April 30, almost 2
weeks after the April 18 printing of the nonresponse follow-up address
list for late mail returns, the Bureau had received an additional
773,784 questionnaires. Bureau headquarters officials told us it was
not feasible to remove these from the address lists and thus,
enumerators visited these households. The cost to the Bureau of these
otherwise needless visits approached $22 million ...."
Comment: Some addresses of late returns were removed from the D-166
report form; however, even after the new report was generated, we
continued to receive late mail returns. As mentioned earlier, the
reference to "needless visits" implies arbitrary inaction in allowing
enumerators to visit these households as opposed to a conscious
decision based on logistical concerns.
10. Section: Page 34, (full) Paragraph 2 - "Neither did we find a
statistically significant relationship between the week that local
census offices finished their nonresponse follow-up workload and the
amount of residual workload [footnote], they had, if any."
Comment: The second sentence needs to reflect the fact that residual
nonresponse consisted of units for which completed questionnaires had
not been processed through data capture.
11. Section: Page 39, Item (3) - "Supplemental reinterviews were to be
conducted, also at local census managers' discretion, when local
census personnel, called field operations supervisors, requested that
such reinterviews be done because they had some basis for concern
about the quality of an enumerator's work."
Comment: Quality reinterviews could be requested by assistant managers
for field operations or others in addition to field operations
supervisors.
12. Section: Page 39, Last Paragraph - "On the basis of our work and
that of the Bureau, we found that local census office managers often
used their discretion to not conduct administrative and supplemental
reinterviews. As a result, a number of offices did not conduct any
administrative or supplemental reinterviews, which raises questions
about the ability of the reinterview program to detect problems,
particularly towards the end of the nonresponse follow-up operation."
Comment: A statement needs to be added indicating that there was
turnover in the enumerator workforce; hence, with new hires, random
reinterview was conducted during all stages of the operation.
13. Section: Page 42, Paragraph 1 - "Overall, local census offices
conducted far fewer administrative reinterviews than the Bureau had
anticipated. Local census offices conducted 276,832 administrative
reinterviews, 146,993 (35 percent) fewer than the 423,825
administrative reinterviews the Bureau had expected based on a number
of factors, including the number of cases completed per hour during
the 1990 Census, and the estimated workload in 2000. Whether this was
due to better quality work on the part of enumerators, or local
managers deciding against subjecting enumerators' work to
reinterviews, is unknown. However, as administrative reinterviews were
designed to detect problems more likely to occur at the end of
nonresponse follow-up, it will be important for the Bureau to examine
whether local census offices properly conducted administrative
reinterviews, and thus ensured the quality of nonresponse follow-up
data throughout the duration of the operation."
Comment: We are uncertain as to the methodology for deriving this
estimate and the documentation from which it was obtained.
14. Section: Page 43, Bullet 4 - "[It will also be important for the
Bureau to address..., including] finding the right mix of incentives
to motivate local census offices to complete nonresponse follow-up on
schedule without compromising data quality...."
Comment: There is no evidence that data quality was compromised to
motivate on-time completion of NRFU. Not compromising the quality of
data should be a given in determining motivational elements.
15. Section: Page 43, Bullet 5 - "[It will also be important for the
Bureau to address..., including] ensuring that reinterview procedures
provide sufficient quality assurance through the full duration of
nonresponse follow-up."
Comment: The reinterview operation must be designed to provide
sufficient quality assurance coverage.
Responses to GAO Recommendations:
Census Bureau Response: The Census Bureau concurs with the
recommendations and has no specific comments on them.
Appendix III: GAO Contacts and Staff Acknowledgments:
GAO Contacts:
Patricia A. Dalton, (202) 512-6806:
Robert Goldenkoff, (202) 512-2757:
Acknowledgments:
In addition to those named above, the following headquarters staff
made key contributions to this report: Wendy Ahmed; Tom Bean; James
Fields; Rich Hung; Lily Kim; J. Christopher Mihm; Victoria E. Miller;
Vicky L. Miller; Ty Mitchell; Anne Rhodes-Kline; Lynn Wasielewski;
Susan Wallace.
The following staff from the Western Regional Office also contributed
to this report: James Bancroft; Robert Bresky; Arthur Davis; Julian
Fogle; Araceli Hutsell; RoJeanne Liu; Elizabeth Dolan; Thomas Schulz;
Nico Sloss; Cornelius Williams.
The following staff from the Central Regional Office also contributed
to this report: Richard Burrell; Michael De La Garza; Maria Durant;
Donald Ficklin; Ron Haun; Arturo Holguin, Jr.; Reid Jones; Stefani
Jonkman; Roger Kolar; Tom Laetz; Miguel Salas; Enemencio Sanchez;
Jeremy Schupbach; Melvin Thomas; Richard Tsuhara; Theresa Wagner;
Patrick Ward; Linda Kay Willard; Cleofas Zapata, Jr.
The following staff from the Eastern Regional Office also contributed
to this report: Carnmillia Campbell; Lara Carreon; Betty Clark;
Johnetta Gatlin-Brown; Marshall Hamlett; Carlean Jones; Janet Keller;
Cameron Killough; Jean Lee; Christopher Miller; S. Monty Peters;
Sharon Reid; Matthew Smith.
[End of section]
Related GAO Products on the Results of the 2000 Census and Lessons
Learned for a More Cost-Effective Census in 2010:
2000 Census: Coverage Evaluation Interviewing Overcame Challenges, but
Further Research Needed (GAO-02-26, December 31, 2001).
2000 Census: Analysis of Fiscal Year 2000 Budget and Internal Control
Weaknesses at the U.S. Census Bureau (GAO-02-30, December 28, 2001).
2000 Census: Significant Increase in Cost Per Housing Unit Compared to
1990 Census (GAO-02-31, December 11, 2001).
2000 Census: Better Productivity Data Needed for Future Planning and
Budgeting (GAO-02-4, October 4, 2001).
2000 Census: Review of Partnership Program Highlights Best Practices
for Future Operations (GAO-01-579, August 20, 2001).
Decennial Censuses: Historical Data on Enumerator Productivity Are
Limited (GAO-01-208R, January 5, 2001).
2000 Census: Information on Short- and Long-Form Response Rates
(GAO/GGD-00-127R, June 7, 2000).
[End of section]
Footnotes:
[1] The initial mail response rate is calculated as a percentage of
all forms in the mail-back universe from which the bureau received a
questionnaire. It factors in housing units that are discovered to be
nonexistent or unoccupied during nonresponse follow-up. The bureau
uses this percentage as an indicator of its nonresponse follow-up
workload. This differs from the mail return rate which the bureau uses
as a measure of public cooperation. It is the percentage of forms the
bureau receives from occupied housing units in the mail-back universe
and is calculated after the bureau completes the enumeration process.
[2] The completion time excludes certain follow-up activities
conducted after the bureau finished its initial workload.
[3] Our analysis did not include nine local census offices located in
Puerto Rico.
[4] The index measure, or "hard-to-count score," was based on
variables contained in the 1990 Data for Census 2000 Planning
Database, such as the percent of households with no adult who speaks
English well.
[5] Of the 511 local offices, 3 were not included in the analysis of
partial interviews and 12 were not included in the analysis of
closeout interviews because the bureau identified their values for
these variables as erroneous due to coding errors.
[6] 2000 Census: Review of Partnership Program Highlights Best
Practices for Future Operations (GAO-01-579, Aug. 20, 2001).
[7] See for example, Decennial Census: 1990 Results Show Need for
Fundamental Reform (GAO/GGD-92-94, June 9, 1992).
[8] For the 2000 Census, the bureau used what it refers to as an
"initial response rate" to provide a measure of the scope of the
nonresponse follow-up operation. This initial response rate is defined
as the percentage of all questionnaires that are completed and
returned by April 18, 2000. The rate includes the number of
questionnaires that are mailed back, transmitted via the Internet, or
completed over the telephone through the bureau's Telephone
Questionnaire Assistance program. It also includes Be Counted Forms
that have census identification numbers. On September 19, 2000, the
bureau announced that it had achieved a final mail-back response rate
of 67 percent.
[9] 2000 Census: Significant Increase in Cost Per Housing Unit
Compared to 1990 Census (GAO-02-31, Dec. 11, 2001).
[10] 2000 Census: Preparations for Dress Rehearsal Leave Many
Unanswered Questions (GAO/GGD-98-74, Mar. 26, 1998).
[11] The bureau later adjusted its qualified applicant goal to 2.1
million based on the actual nonresponse follow-up workload.
[12] At one of the local census offices we visited, we were unable to
obtain a usable response to this question because the local census
office's managers were unavailable during the time of our review.
[13] At five of the local census offices we visited, we were unable to
obtain a usable response to this question, generally because local
census office managers were either unavailable or did not know.
[14] At eight local census offices we visited, we were unable to
obtain a usable response to this question, generally because local
census office managers were either unavailable or did not know.
[15] At two of the local census offices we visited, we were unable to
obtain a usable response to this question, generally because local
census office managers were either unavailable or did not know.
[16] GAO/GGD-00-6, December 14, 1999.
[17] Results of regression: t = -1.65; p = 0.10.
[18] Results of regression: t = -0.44; p = 0.66.
[19] We used an index measure (hard-to-count score) developed by the
bureau.
[20] Results of regression: t = -0.04; p = 0.97.
[21] Results of correlation: r = -0.08.
[22] Results of correlation: r = -0.15.
[23] We excluded data for those local census offices that, according
to the bureau, were not reliable because of various anomalies, such as
inaccurate coding of questionnaires by local office staff.
[24] For more information on this incident, see U.S. Department of
Commerce, Office of Inspector General, Bureau of the Census: Re-
enumeration at Three Local Census Offices in Florida: Hialeah, Broward
South, and Homestead (ESD-13215-0-0001, Sept. 29, 2000).
[25] In addition to the 511 local census offices located in the United
States, there were 9 offices in Puerto Rico.
[End of section]
GAO's Mission:
The General Accounting Office, the investigative arm of Congress,
exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and
accountability of the federal government for the American people. GAO
examines the use of public funds; evaluates federal programs and
policies; and provides analyses, recommendations, and other assistance
to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its
core values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through the Internet. GAO's Web site [hyperlink,
http://www.gao.gov] contains abstracts and full text files of current
reports and testimony and an expanding archive of older products. The
Web site features a search engine to help you locate documents using
key words and phrases. You can print these documents in their
entirety, including charts and other graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on
its Web site daily. The list contains links to the full-text document
files. To have GAO e-mail this list to you every afternoon, go to
[hyperlink, http://www.gao.gov] and select "Subscribe to daily E-mail
alert for newly released products" under the GAO Reports heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are
$2 each. A check or money order should be made out to the
Superintendent of Documents. GAO also accepts VISA and MasterCard.
Orders for 100 or more copies mailed to a single address are
discounted 25 percent. Orders should be sent to:
U.S. General Accounting Office: 441 G Street NW, Room LM:
Washington, D.C. 20548:
To order by Phone:
Voice: (202) 512-6000:
TDD: (202) 512-2537:
Fax: (202) 512-6061:
To Report Fraud, Waste, and Abuse in Federal Programs Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Public Affairs:
Jeff Nelligan, managing director,
NelliganJ@gao.gov:
(202) 512-4800:
U.S. General Accounting Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548: