This is the accessible text file for GAO report number GAO-03-1173
entitled 'Airport Passenger Screening: Preliminary Observations on
Progress Made and Challenges Remaining' which was released on September
25, 2003.
This text file was formatted by the U.S. General Accounting Office
(GAO) to be accessible to users with visual impairments, as part of a
longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to the Chairman, Subcommittee on Aviation, Committee on
Transportation and Infrastructure, House of Representatives:
United States General Accounting Office:
GAO:
September 2003:
Airport Passenger Screening:
Preliminary Observations on Progress Made and Challenges Remaining:
GAO-03-1173:
GAO Highlights:
Highlights of GAO-03-1173, a report to the Chairman, Subcommittee on
Aviation, Committee on Transportation and Infrastructure, House of
Representatives
Why GAO Did This Study:
Passenger screening is critical to the security of our nation's
aviation system, particularly in the aftermath of the September 11,
2001, terrorist attacks. The Transportation Security Administration
(TSA) is tasked with securing all modes of transportation, including
the screening of airline passengers. TSA has met numerous requirements
in this regard, such as deploying more than 50,000 federal screeners
at over 440 commercial airports nationwide. To determine whether TSA's
passenger screening program is achieving its intended results, GAO is
conducting an ongoing evaluation of TSA's efforts to (1) ensure that
passenger screeners are effectively trained and supervised, (2)
measure screener performance in detecting threat objects, and (3)
implement and evaluate the contract screening pilot program.
What GAO Found:
The Transportation Security Administration (TSA) was tasked with the
tremendous challenge of building a large federal agency responsible
for securing all modes of transportation, while simultaneously meeting
ambitious deadlines to enhance the security of the nation's aviation
system. Although TSA has made significant progress related to its
passenger screening program, challenges remain.
TSA recognized that frequent, ongoing training of screeners and
effective supervisory training are critical to maintaining and
enhancing skills. However, TSA has not fully developed or deployed
recurrent or supervisory training programs. Although TSA has not yet
deployed these programs, it has taken steps in establishing recurrent
and supervisory training, including developing six recurrent training
modules that will soon be deployed to all airports, as well as working
with the U.S. Department of Agriculture (USDA) Graduate School to
tailor its off-the-shelf supervisory course to the specific training
needs of TSA's screening supervisors.
TSA currently collects little information regarding screener
performance in detecting threat objects. The primary source of
information collected on screeners' ability to detect threat objects
is covert testing conducted by TSA's Office of Internal Affairs and
Program Review. However, TSA does not consider the results of these
tests as a measure of screener performance, but rather a "snapshot" of
a screener's ability to detect threat objects at a particular point in
time. Additionally, TSA does not currently use the Threat Image
Projection system, which places images of threat objects on x-ray
screens during actual operations and records whether screeners
identify the threat. However, TSA plans to fully activate the Threat
Image Projection system with significantly more threat images than
previously used, as well as implement an annual screener certification
program in October 2003. TSA also recently completed a screener
performance improvement study and is taking steps to address the
deficiencies identified during the study.
As required by the Aviation and Transportation Security Act, TSA
implemented a pilot program using contract screeners in lieu of
federal screeners at 5 commercial airports. However, TSA has not yet
determined how to evaluate and measure the performance of the pilot
program airports, or prepare for airports potentially applying to opt-
out of using federal screeners, as allowed by the act, beginning in
November 2004. Although TSA has not begun evaluating the performance
of the pilot program airports, it plans to award a contract by October
1, 2003, to compare the performance of pilot screeners with federal
screeners and determine the reasons for any differences. Numerous
airport operators have contacted TSA to express an interest in
obtaining more information to assist in their decision regarding
opting-out of using federal screeners.
What GAO Recommends:
Because our evaluation is ongoing and our results are preliminary, we
are not making any recommendations.
[End of section]
Contents:
Letter:
Results in Brief:
Background:
Scope and Methodology:
Recurrent and Supervisory Training Programs Not Fully Developed:
Little Information Exists to Measure Screeners' Performance in
Detecting Threat Objects:
An Assessment of the Contract Screening Pilot Program Has Not Yet
Begun:
TSA Continuing to Work to Identify Appropriate Staffing Levels at the
Nation's Airports:
Appendix I: Examples of Information Collected and Maintained in the
Transportation Security Administration's Performance Management
Information System:
Appendix II: GAO Contacts and Staff Acknowledgments:
GAO Contacts:
Staff Acknowledgments:
Abbreviations:
AAAE: American Association of Airport Executives:
ACI: Airports Council International:
ATSA: Aviation and Transportation Security Act:
DOT: Department of Transportation:
FAA: Federal Aviation Administration:
FSD: Federal Security Director:
LMS: On-Line Learning Management System:
OIAPR: Office of Internal Affairs and Program Review:
OIG: Office of Inspector General:
OJT: on-the-job training:
PMIS: Performance Management Information System:
SOP: standard operating procedure:
TIP: Threat Image Projection:
TSA: Transportation Security Administration:
USDA: U.S. Department of Agriculture:
United States General Accounting Office:
Washington, DC 20548:
September 24, 2003:
The Honorable John Mica
Chairman,
Subcommittee on Aviation
Committee on Transportation and Infrastructure
House of Representatives:
Dear Mr. Chairman:
Passenger screening is a critical component of the security of our
nation's aviation system. Passenger screeners use metal detectors, X-
ray machines, explosive trace detection machines, and physical searches
to examine passengers and their baggage to identify threat objects. On
November 19, 2001, prompted by the terrorist attacks of September 11,
2001, the President signed the Aviation and Transportation Security Act
(ATSA), with a primary goal of strengthening the security of the
nation's aviation system. ATSA created the Transportation Security
Administration (TSA) and mandated specific improvements to aviation
security, including the federalization of passenger screening at over
440 commercial airports in the United States by November 19, 2002.
TSA was tasked with the tremendous challenge of building a large
federal agency responsible for securing all modes of transportation,
while simultaneously meeting ambitious deadlines to federalize aviation
security as mandated by ATSA. TSA has met numerous requirements related
to its passenger screening program, including deploying more than
50,000 federal screeners at over 440 commercial airports nationwide,
developing and implementing a basic screener training program, and
establishing a pilot program at 5 airports where screening of
passengers and property would be conducted by private screening
companies and overseen by TSA.
To determine whether TSA's passenger screening program is achieving its
intended results, the Subcommittee on Aviation, House Committee on
Transportation and Infrastructure, requested that we review various
aspects of the program. Specifically, the Subcommittee asked that we
evaluate TSA's efforts to (1) ensure that passenger screeners are
effectively trained and supervised, (2) measure screener performance in
detecting threat objects, (3) implement and evaluate the contract
screening pilot program, and (4) address airport-specific staffing
needs, while reducing the screener workforce. On September 5, 2003, we
briefed the Subcommittee staff on our preliminary observations of TSA's
passenger screening program based on our work to date.
This report summarizes and updates the information presented at that
briefing. Because our work is still ongoing, the observations
discussed in this report are preliminary.
In conducting our work, we obtained and reviewed TSA documentation
related to screener training, testing and supervision; the contract
screening pilot program; screener staffing levels; and airport security
concerns. We also interviewed relevant officials at TSA headquarters
and field offices, airports, and several aviation associations. A more
detailed description of our scope and methodology is contained later in
this report.
Results in Brief:
TSA has deployed basic and remedial screener training programs, but has
not fully developed or deployed a recurrent or supervisory training
program to ensure that screeners are effectively trained and
supervised. However, recognizing that training of screeners on a
frequent basis and effective supervision are critical to screener
performance, TSA has taken some positive steps in this direction. These
steps include designing an On-Line Learning Management System (LMS)
that will be fielded in October 2003, and working with the U.S.
Department of Agriculture's (USDA) Graduate School to tailor its off-
the-shelf supervisory course to the specific training needs of TSA's
screening supervisors.
TSA currently collects little information to measure screener
performance in detecting threat objects. The primary source of
information collected on screeners' ability to detect threat objects is
operational testing conducted by TSA's Office of Internal Affairs and
Program Review (OIAPR).[Footnote 1] However, TSA does not consider the
results of OIAPR's covert tests as a measure of screener performance,
but rather a "snapshot" of a screener's ability to detect threat
objects at a particular point in time, and as a system-wide performance
indicator. In addition, the Threat Image Projection (TIP) system, which
the Federal Aviation Administration (FAA) deployed in late 1999 to
measure and improve screener performance in detecting threat objects,
was shut down immediately following the September 11th terrorist
attacks for fear that it would result in screening delays and
panic.[Footnote 2] However, TSA officials reported that they have
recently begun fielding TIP to airports, with significantly more threat
images than used by the FAA. Further, TSA has not yet implemented an
ATSA requirement for an annual proficiency review for all screeners,
but plans to begin implementing an annual screener certification
program in October 2003. TSA also developed a Performance Management
Information System (PMIS) to collect and maintain information on the
performance of TSA's passenger and baggage screening operations.
However, PMIS contains little information on screener performance in
detecting threat objects.[Footnote 3]
Consistent with ATSA, TSA implemented a pilot program using contract
screeners at 5 commercial airports, but has not yet determined how to
evaluate and measure the performance of the pilot program airports.
However, TSA plans to award a contract by October 1, 2003, to compare
the performance of pilot screeners with federal screeners and determine
the reasons for any differences. While the purpose of the screener
pilot program is to determine the feasibility of using private
screening companies rather than federal screeners, TSA initially
required private screening companies to adhere to all of the procedures
and protocols used by federal screeners. However, TSA recently provided
the contractors with some flexibility, such as allowing them to
determine and maintain their own staffing levels and to make
independent hiring decisions. ATSA also gives airport operators the
option of applying to transition from using federal screeners to
private screeners beginning in November 2004; however, TSA has not
begun to plan for the possible transition of airports from a federal
system to a private screening company. Numerous airport operators have
contacted TSA to express an interest in obtaining more information to
assist in their decision regarding using private screeners.
To address airport-specific staffing needs and accomplish workforce
reduction goals, TSA developed a staffing model to determine staffing
levels at each airport, and recently hired an outside consultant to
assist the agency in determining whether identified staffing levels are
appropriate. Federal Security Directors (FSDs), who are responsible for
overseeing security at each of the nation's commercial airports, have
expressed concern that they have had limited authority to respond to
airport-specific staffing needs, such as reacting to fluctuations in
daily and/or seasonal passenger flow. TSA headquarters officials
acknowledged that their initial staffing efforts created imbalances in
the screener workforce and have taken steps to correct identified
imbalances, such as authorizing the hiring of part-time
screeners at over 200 airports--the first of which began working on
September 15, 2003.
Because our observations are preliminary and our evaluation is ongoing,
we are not making recommendations at this time.
TSA officials reviewed a draft of this report and provided technical
comments, which we incorporated as appropriate.
Background:
ATSA created TSA as an agency within the Department of Transportation
(DOT) to ensure security for all modes of transportation, to include
aviation.[Footnote 4] ATSA set forth specific enhancements to aviation
security for TSA to implement and established deadlines for completing
many of them. These enhancements included federalizing passenger
screeners at more than 440 commercial airports by November 19,
2002;[Footnote 5] screening checked baggage for explosives by December
31, 2002; enhancing screener training standards; and establishing and
managing a 2-year pilot program at five airports--one in each airport
category--where screening of passengers and property would be conducted
by a private screening company and overseen by TSA. Additionally, ATSA
included a provision that allows airport operators to apply to opt-out
of using federal screeners in favor of private screeners beginning
November 19, 2004.
Prior to the passage of ATSA, air carriers were responsible for
screening passengers and most used private security firms to perform
this function. Longstanding concerns existed regarding screener
performance in detecting threat objects. Inadequate training and poor
supervision, along with rapid turnover and inadequate attention to
human factors, were historically identified as key contributors to poor
screener performance.[Footnote 6] As early as 1987, we reported that
too little attention had been paid to (1) individual aptitudes for
effectively performing screening duties; (2) the sufficiency of
screener training and screeners' ability to comprehend training; and
(3) the monotony of the job and distractions that reduced screeners'
vigilance.[Footnote 7] Additional studies have shown that effective
training can lead to more effective performance and lower turnover
rates for passenger screeners.
Concerns have long existed over screeners' inability to detect threat
objects during covert tests at passenger screening checkpoints. In
1978, screeners failed to detect 13 percent of the potentially
dangerous objects FAA agents carried through checkpoints during
tests--a level that was considered "significant and alarming."[Footnote 8] In
1987, screeners did not detect 20 percent of the objects during the
same types of tests.[Footnote 9] In addition, we reported that FAA
tests conducted between 1991 and 1999 showed that screeners' ability to
detect objects was not improving, and in some cases was worsening. In
tests conducted in the late 1990s, as the testing objects became more
realistic and more closely approximated how a terrorist might attempt
to penetrate a checkpoint, screeners' ability to detect dangerous
objects declined even further.[Footnote 10]
Scope and Methodology:
Our preliminary observations are based on our review of TSA
documentation related to screener training, testing, and supervision;
the contract screening pilot program; screener staffing levels; and
airport security concerns. We interviewed TSA headquarters officials
in Arlington, Virginia; and interviewed FSDs, their staffs, and
screeners at 12 commercial airports throughout the nation;[Footnote 11]
10 airport operators; officials at 5 air carriers; and officials from 4
aviation associations--American Association of Airport Executives
(AAAE), Airports Council International (ACI), Air Transport
Association, and Regional Airline Association. We also reviewed our
prior reports that addressed issues related to the performance of
airport passenger screeners. We conducted our work from May through
September 2003 in accordance with generally accepted government
auditing standards. Because our review is still ongoing, the results
presented in this report are preliminary.
To complete our work, we will continue to collect and review TSA
documentation related to each of our four objectives, including
obtaining and analyzing the results of TSA's operational tests. We will
also administer a survey to all 158 FSDs to obtain their perspectives
on general and airport-specific information related to each of our four
objectives. Additionally, we will visit at least 8 additional airports
to conduct interviews with FSDs, their staffs, members of the screener
workforce, and airport operators. We will also interview
representatives of all 5 pilot program airports, as well as airport
operators at all category X airports, to obtain information on their
coordination with TSA and their plans, if any, to apply to opt-out of
the federal screening program beginning November 19, 2004. Finally, we
will continue to meet with TSA headquarters officials to obtain current
information related to the issues addressed in this report. We
anticipate issuing a final report in April 2004.
Recurrent and Supervisory Training Programs Not Fully Developed:
TSA developed basic and remedial screener training programs, but has
not fully developed or deployed a recurrent or supervisory training
program to ensure that screeners are effectively trained and
supervised. Comprehensive and frequent training is key to passenger
screeners' ability to detect threat objects. Studies have shown that
ongoing training can lead to more effective performance and lower
turnover rates for passenger screeners. According to TSA, there are
three key elements of passenger screener training: (1) basic training,
(2) remedial training, and (3) recurrent training. As required by ATSA,
TSA established a basic screener training program comprised of 40 hours
of classroom instruction and 60 hours of on-the-job training (OJT). TSA
reported that all of its screeners who work independently have
completed basic screener training and that those who failed an
operational test received required remedial training.[Footnote 12]
Basic Training:
TSA requires screeners to complete a minimum of 40 hours of classroom
instruction and 60 hours of OJT prior to making independent screening
decisions. This requirement is an increase over FAA's basic training
requirements when it oversaw passenger screening, which called for 12
hours of classroom instruction and 40 hours of OJT. According to TSA
officials, all screeners who work independently have met the basic
screener training requirements.[Footnote 13] TSA contractors are
responsible for delivering and tracking basic screener classroom
training, while OJT is tracked locally at each airport. TSA encourages,
but does not require, screening managers, who are responsible for
overseeing screening functions, to participate in classroom training,
even if they do not have prior screening experience. Nevertheless, 2 of
the 12 FSDs we interviewed said that they require their screening
managers to observe basic screener training.
Remedial Training:
Consistent with ATSA, TSA requires remedial training for any screener
who fails an operational test and prohibits screeners from performing
the screening function related to the test they failed until they
successfully complete the training.[Footnote 14] FSDs must certify that
screeners identified as requiring remedial training complete the
training before they can perform the screening function identified as a
performance weakness. TSA's Aviation Operations Division is responsible
for tracking the completion of remedial training following the failure
of covert tests. The tracking of remedial training initiated for
reasons other than failing a covert test is the responsibility of the
FSDs or their designees. TSA reported that all screeners requiring
remedial training have received the training.[Footnote 15]
Recurrent Training:
TSA has not fully developed or deployed a recurrent training program,
but has recognized that frequent, ongoing training of screeners is
critical to maintaining and enhancing screener skills.
According to agency officials, TSA established a training task force
comprised of airport Training Coordinators, screeners, and headquarters
officials to conduct an assessment of training needs. As a result of
the task force's suggestions, TSA is developing six recurrent training
modules--the first of which TSA plans to deploy to all airports
beginning in October 2003. TSA plans to release each of the remaining
five modules as they are finalized, which they anticipate will occur
throughout 2004. TSA officials also said that they designed and are
currently pilot testing an On-Line Learning Management System (LMS)
comprised of 366 training courses, which they expect to field
in October 2003. Officials said that they were not further along in
implementing their recurrent training modules or LMS due to budget
considerations.
Fourteen of the 22 passenger screeners and supervisors we interviewed
expressed the need for recurrent training.[Footnote 16] They were
particularly interested in receiving additional training related to
recognizing x-ray images of threat objects. In addition, 10 of the 12
FSDs we interviewed reported implementing their own locally developed
recurrent training courses rather than waiting for the training modules
to be deployed by headquarters. TSA's OIAPR found that screeners at
airports that conducted frequent, ongoing training performed better
during covert tests--TSA's form of operational testing--than screeners
who did not receive recurrent training.
Supervisory Training:
TSA describes its screening supervisors as the key to a strong defense
in detecting threat objects. In September 2001, we reported on the
results of our survey of aviation stakeholders and aviation and
terrorism experts concerning options for conducting screening. The
respondents identified better supervision as one of the factors
necessary for improving screener performance.[Footnote 17]
Additionally, DOT's Office of Inspector General (OIG) recently reported
that screener supervisors are the key to effective screening,[Footnote
18] and TSA's OIAPR identified a lack of supervisory training as a
cause for screener testing failures. FSDs and TSA headquarters
officials recognize the need to enhance the skills of screening
supervisors through supervisory training. TSA is currently working with
USDA to tailor its off-the-shelf supervisory course to the specific
needs of TSA's screening supervisors. TSA recently reported that it is
sending supervisors to the basic USDA supervisor's course until the
customized course is fielded, which it expects to occur in April 2004.
To supplement the classroom training, TSA also plans to establish a
supervisory training module for recurrent training. We plan to review
TSA's training initiatives further during the remainder of our
evaluation.
Little Information Exists to Measure Screeners' Performance in
Detecting Threat Objects:
Currently, the results of operational, or covert, testing by TSA's
OIAPR are the only indication of screener performance in detecting threat
objects. However, TSA does not view the results of OIAPR's covert
testing as a measure of screener performance, but rather as a
"snapshot" of a screener's ability to detect threat objects at a
particular point in time. Although OIAPR conducts fewer covert tests of
passenger screeners than FAA previously conducted, TSA considers
its tests more rigorous than FAA's tests because they more closely
approximate techniques terrorists might use. In addition to conducting
operational testing, TSA plans to fully activate the Threat Image
Projection system and implement a screener certification program in
October 2003 to collect additional information on screener performance.
TSA also developed a Performance Management Information System to
collect and maintain information on the performance of its passenger
and baggage screening operations. However, PMIS contains little data on
screener performance in detecting threat objects. TSA officials said
that they plan to expand PMIS to collect some performance information,
but did not identify a timeframe for when the data will be collected.
Operational Testing:
TSA defines an operational screening test as any covert test of a
screener, conducted by TSA, on any screener function to assess the
screener's threat item detection ability and/or adherence to TSA-
approved procedures. When a screener fails a test, he or she is
required to receive immediate remedial training, and is prohibited from
performing the function related to the failed test until he or she
satisfactorily completes the training. Currently, OIAPR's covert testing
is the only source of operational testing conducted of passenger
screeners. These tests are designed to identify systematic problems
affecting the performance of screeners in the areas of training,
policy, and equipment. TSA does not view the results of OIAPR's covert
testing as a measure of screener performance, but rather a "snapshot"
of a screener's ability to detect threat objects at a particular point
in time and as an indicator of systemwide screener performance. OIAPR
testing to date has shown weaknesses in screeners' ability to detect
threat objects. Testing conducted by DOT's OIG, the Department of
Homeland Security's OIG, and GAO has also identified screener
performance weaknesses.
Prior to the creation of TSA, FAA conducted thousands of covert tests
annually of passenger screeners. Most of these tests were compliance
tests in which FAA agents attempted to get nine test objects, such as
guns and grenades, past screeners conducting x-ray, metal detector, and
physical searches at airport checkpoints. The DOT OIG described these
tests as unlike the techniques that terrorists would employ.[Footnote
19] In 1997, FAA incorporated simulated improvised explosive devices
into its compliance testing and performed, on average, more than 2,000
of these tests each year. In addition to compliance tests, FAA's special
headquarters-based testing unit, often called the Red Team, conducted
more realistic tests, using harder-to-detect threat objects and agents
not known to screeners.[Footnote 20]
TSA's OIAPR has conducted fewer covert tests than FAA did, but
considers its testing methods more rigorous than either FAA's
compliance tests or its Red Team tests because they more closely approximate
techniques terrorists might use. OIAPR officials further said that
their tests are intentionally designed to have a high probability of
failure in an effort to identify vulnerabilities and areas needing
improvement. Additionally, unlike testing conducted under FAA, OIAPR
staff who perform the tests reported that they provide immediate
feedback to screeners, their managers, and the FSDs to explain how they
beat the system and provide instant remedial training. We plan to
review OIAPR's operational testing in more detail during the remainder
of our evaluation.
Based on an anticipated increase in staff from about 100 in fiscal year
2003 to 200 in fiscal year 2004, OIAPR plans to conduct twice as many
covert tests next year. In addition, TSA recently established 5 mission
support centers located throughout the country, which, according to TSA,
will be staffed with OIAPR personnel available to conduct additional
covert tests.[Footnote 21] These centers will also be staffed with
mobile testing teams that will work with FSDs in their region to
conduct screener training using some of the test objects OIAPR uses in
its covert tests.
Threat Image Projection (TIP) System:
In late 1999, to help screeners remain alert, train them to become more
adept at detecting harder-to-spot threat objects, and continuously
measure screener performance, FAA began deploying TIP. TIP places
images of threat objects on x-ray screens during actual operations and
records whether screeners identify the threat object.[Footnote 22] By
frequently exposing screeners to a variety of images of dangerous
objects on the x-ray screens, the system provides continuous OJT and
allows for immediate supervisory feedback, on-the-spot training, and
remedial training.
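To make concrete how TIP-style data can yield a performance measure,
the short Python sketch below logs hypothetical projected-image events
and computes a detection rate, either for one screener or systemwide.
The record fields and function names are our own illustrative
assumptions, not TSA's or FAA's actual implementation.

    from dataclasses import dataclass

    @dataclass
    class TipEvent:
        """One projected threat image shown to a screener (hypothetical record)."""
        screener_id: str
        image_type: str    # e.g., "gun", "explosive"
        identified: bool   # True if the screener flagged the bag for search

    def detection_rate(events, screener_id=None):
        """Fraction of projected threat images correctly identified.

        With a screener_id this scores one screener (a "snapshot");
        without one it serves as a systemwide performance indicator.
        """
        relevant = [e for e in events
                    if screener_id is None or e.screener_id == screener_id]
        if not relevant:
            return None
        return sum(e.identified for e in relevant) / len(relevant)

    # Example log: three projected images across two screeners.
    log = [TipEvent("S-101", "gun", True),
           TipEvent("S-101", "explosive", False),
           TipEvent("S-202", "gun", True)]
    print(detection_rate(log))           # systemwide: about 0.67
    print(detection_rate(log, "S-101"))  # individual: 0.5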
According to TSA officials, TIP was shut down immediately following the
September 11th terrorist attacks due to concerns that it would result
in screening delays and panic, as screeners might think that they were
actually viewing a threat object. TSA officials recognize that TIP is a
key tool in maintaining and enhancing screener performance, and said
that they had begun reactivating TIP with significantly more images
than FAA had in place. TSA officials said that TIP had not been
reactivated sooner due to a lack of automated data collection via
cellular modems; competing priorities; a lack of training; and a lack
of resources needed to deploy TIP activation teams.
Annual Screener Certification:
ATSA requires that each passenger screener receive an annual
proficiency review to ensure he or she continues to meet all
qualifications and standards required to perform the screening
function. Although TSA has not yet implemented this requirement, it
plans to develop an annual screener certification program comprised of
three components: (1) an image recognition test; (2) a test of
knowledge of standard operating procedures (SOPs); and (3) a practical
demonstration of skills, to be administered by a contractor. TSA has
not yet determined the level of performance that screeners must achieve
to be certified, but officials said that they plan to require
performance at a high, but reasonable level. Officials also said that
they plan to remediate and retest screeners who fail any portion of the
test, but have not yet determined the number of times a screener may
retake the test before termination. Certification is scheduled to begin
in October 2003 and to be completed at all 442 airports by January
2004, in the order in which the airports began federal screening
operations. TSA officials recently reported that they awarded a
contract to conduct the practical demonstration component of the test;
however, TSA has not developed a schedule for when the program will be
fielded to the airports. We plan to review TSA's annual screener
certification program during the remainder of our evaluation.
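The pass/remediate/retest flow described above can be summarized in a
short Python sketch. This is purely illustrative; because TSA had not
yet set the passing level or the retake limit, both appear here as
parameters rather than fixed values.

    COMPONENTS = ("image recognition", "SOP knowledge", "practical demonstration")

    def certify(scores, passing_score, max_retakes, retest):
        """Illustrative certification logic (assumed, not TSA's actual rules).

        scores: dict mapping each component to an initial score.
        retest: callback that remediates the screener and returns a new
                score for the failed component.
        Returns True if all three components are passed, False if any
        component is still failed after max_retakes attempts.
        """
        for component in COMPONENTS:
            retakes = 0
            while scores[component] < passing_score:
                if retakes >= max_retakes:
                    return False  # the report leaves this outcome undecided
                scores[component] = retest(component)
                retakes += 1
        return True

    # Example: a screener who fails image recognition once, then passes.
    initial = {"image recognition": 60, "SOP knowledge": 85,
               "practical demonstration": 90}
    print(certify(initial, passing_score=80, max_retakes=2,
                  retest=lambda component: 82))  # True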
Performance Management Information System:
TSA's Performance Management Information System--PMIS--for passenger
and baggage screening operations contains little data on screener
performance in detecting threat objects. PMIS collects information on
workload, staffing, and equipment and is used to identify some
performance and policy issues, such as the level of absenteeism,
average time for equipment repairs, and status of TSA's efforts to
meet goals for 100 percent baggage screening.[Footnote 23] (See app. I
for examples of information collected and contained in PMIS.)
Additionally, TSA uses PMIS data to identify needed changes to
SOPs.[Footnote 24] Officials further reported that PMIS has the ability
to generate reports that enable TSA to track its progress toward
meeting its performance goals as well as to generate reports by region,
FSD, airport, and/or individual screening checkpoint. PMIS has been
deployed to all airports with federal screeners. FSDs are responsible
for designating a staff person to enter performance data into PMIS on a
daily basis.
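As a rough illustration of the roll-up reporting described above (by
region, FSD, airport, or checkpoint), the Python sketch below
aggregates hypothetical daily PMIS entries into a percent-absenteeism
report. The field names and figures are invented for the example;
appendix I lists the categories PMIS actually collects.

    from collections import defaultdict

    # Hypothetical daily PMIS entries; field names are illustrative only.
    entries = [
        {"airport": "BWI", "region": "East", "screeners_absent": 4,
         "screeners_scheduled": 120},
        {"airport": "SEA", "region": "West", "screeners_absent": 6,
         "screeners_scheduled": 150},
        {"airport": "IAD", "region": "East", "screeners_absent": 3,
         "screeners_scheduled": 110},
    ]

    def absenteeism_by(entries, level):
        """Percent absenteeism rolled up to a report level, e.g., 'region'."""
        absent = defaultdict(int)
        scheduled = defaultdict(int)
        for e in entries:
            absent[e[level]] += e["screeners_absent"]
            scheduled[e[level]] += e["screeners_scheduled"]
        return {k: 100.0 * absent[k] / scheduled[k] for k in absent}

    print(absenteeism_by(entries, "region"))  # {'East': ~3.0, 'West': 4.0}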
TSA officials reported that they are planning to integrate performance
information from various systems into PMIS to assist the agency in
making strategic decisions. TSA also recently reported that it is
developing a screener performance index, which is supposed to include
information such as the results of TIP tests, training tests, and
certification tests. We plan to review these plans in more detail
during the remainder of our evaluation.
Screener Performance Improvement Study:
TSA is taking steps to improve screener performance. In July 2003, TSA
completed a Screener Performance Improvement Study, which was designed
to identify root causes for gaps between current screener performance
and TSA's desired performance--defined as 100 percent interception of
prohibited items coming through the passenger screening checkpoints. As
part of its study, TSA identified four significant screener performance
deficiencies. TSA concluded that four key factors contributed to the
identified deficiencies: (1) lack of skills, knowledge, or information;
(2) low motivation; (3) ineffective work environment; and (4) incorrect
or missing incentives. To address the screener performance deficiencies
identified in the study, TSA developed several key solutions, including
the need to establish adequate training facilities at airports; staff
airports adequately to allow time for training; reconfigure checkpoints
to eliminate distractions; implement TIP at all airports; and enhance
supervisory skills. According to TSA officials, the appropriate TSA
components are currently developing action plans for each of the
deficiencies identified in the Performance Improvement Study. The plans
are to include action steps, timelines, required resources, and
anticipated outcomes. We plan to review these plans during the
remainder of our evaluation.
An Assessment of the Contract Screening Pilot Program Has Not Yet
Begun:
TSA has implemented a pilot program using contract screeners at 5
airports, but has not determined how to evaluate and measure the
performance of the pilot program airports. The purpose of the 2-year
pilot program is to determine the feasibility of using private
screening companies rather than federal screeners. Initially, TSA
required private screening companies to adhere to all of the procedures
and protocols used for federal screeners. However, TSA recently
provided the pilot contractors with some flexibility, such as allowing
them to determine and maintain their own staffing levels and make
independent hiring decisions. While TSA has not yet determined how to
evaluate and measure the performance of the pilot program airports, it
plans to award a contract by October 1, 2003, to compare the
performance of pilot screeners with federal screeners and determine the
reasons for any differences. TSA officials said that the Office of
Management and Budget requested that they include in their evaluation
ways to allow more innovation by contract screening companies.
Although ATSA allows airports to apply to opt-out of using federal
screeners beginning in November 2004, TSA has not begun to plan for the
possible transition of airports from a federal system to a private
screening company. Airports Council International officials said that
numerous airports have contacted them expressing an interest in
obtaining more information to assist in their decision regarding
opting-out. Six of the 10 airport operators we interviewed said that
they had not made any decisions regarding opting-out, and all 10 said
they had not received any information from TSA regarding the
option.[Footnote 25] However, the airport operators said that they
would like information to assist them in deciding whether to opt-out,
such as who bears responsibility for funding the screening
contract; airport liability in the event of an incident linked to a
screener failure; how well the current pilot program airports are
performing; performance standards to which contract screeners will be
held; and TSA's role in overseeing contracted screening.
TSA Continuing to Work to Identify Appropriate Staffing Levels at the
Nation's Airports:
Initially, TSA headquarters determined screener staffing levels for all
airports without actively seeking input from FSDs. Eight of the 12 FSDs
we interviewed said that they had limited authority to respond to
airport-specific staffing needs, such as reacting to fluctuations in
daily and/or seasonal passenger flow. However, TSA headquarters
officials said that during the second stage of their workforce
reduction process, they solicited input from FSDs, airport officials,
and air carriers. TSA headquarters officials acknowledged that their
initial staffing efforts created imbalances in the screener workforce
and have taken steps to correct identified imbalances, such as
authorizing the hiring of part-time screeners at over 200 airports--the
first of which began working on September 15, 2003.
TSA determined the current screener staffing levels using a computer-
based modeling process that took into account the number of screening
checkpoints and lanes at an airport; originating passengers; the number
of airport workers requiring screening; projected air carrier service
increases and decreases during calendar year 2003; and hours needed to
accommodate screener training, leave, and breaks.[Footnote 26] TSA
recently hired an outside consultant to conduct a study of screener
staffing levels at various airports. TSA officials stated that they
will continue to review the staffing allocation provided through the
modeling efforts to assess air carrier and airport growth patterns, and
adjustments will be made as appropriate. We plan to review TSA's
efforts to determine appropriate staffing levels for passenger
screeners during the remainder of our evaluation.
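As a simplified illustration of the kinds of inputs such a model
weighs, the Python sketch below estimates screener positions from
lanes, operating hours, and passenger volume. The formula and every
number in it are hypothetical; TSA's actual model also factored in
airport workers requiring screening, projected air carrier service
changes, and hours for training, leave, and breaks, which the overhead
factor below only roughly stands in for.

    def estimated_screener_positions(lanes, hours_open, daily_passengers,
                                     passengers_per_lane_hour=150,
                                     screeners_per_lane=4,
                                     shift_hours=8,
                                     nonscreening_factor=1.25):
        """Very rough staffing estimate (hypothetical, illustration only)."""
        # Lane-hours needed: enough to keep the lanes open, or to clear
        # the day's passenger volume, whichever is larger.
        lane_hours = max(lanes * hours_open,
                         daily_passengers / passengers_per_lane_hour)
        # Convert to staff-hours, pad for training/leave/breaks, and
        # divide into shifts to get positions.
        staff_hours = lane_hours * screeners_per_lane * nonscreening_factor
        return round(staff_hours / shift_hours)

    # Example: 6 lanes open 18 hours a day, 20,000 originating passengers.
    print(estimated_screener_positions(6, 18, 20000))  # about 83 positions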
As agreed with your office, unless you publicly announce its contents
earlier, we plan no further distribution of this report until 5 days
after its issue date. At that time, we will send copies of this report
to the Secretary of the Department of Homeland Security and interested
congressional committees. We will also make copies available to others
upon request. In addition, the report will be available at no charge on
GAO's Web site at http://www.gao.gov.
If you have any questions about this report, or wish to discuss it
further, please contact me at (202) 512-8777 or Jack Schulze, Assistant
Director, at (202) 512-4390. Key contributors to this report are listed
in appendix II.
Sincerely yours,
Cathleen A. Berrick
Acting Director, Homeland Security and Justice Issues:
Signed by Cathleen A. Berrick:
[End of section]
Appendix I: Examples of Information Maintained in TSA's Performance
Management Information System:
Category of information collected: Checkpoint;
Examples of information collected[A]: Number of prohibited items;
Number of weapons surrendered at sweep screening[B]; Number of cleared
Explosive Trace Detection (ETD) alarms; Percent of absenteeism.

Category of information collected: Incidents;
Examples of information collected[A]: Number of incidents; Number of
arrests; Number of evacuations; Number of disruptive passengers.

Category of information collected: Feedback;
Examples of information collected[A]: Customer complaints; Discourteous
treatment; Nonstandard screening; Lost, stolen, or damaged items.

Category of information collected: Human Resources--Employee Census;
Examples of information collected[A]: Total active authorized
screeners; Number of screeners on light duty; Number of screening
managers; FSD staff; Number of screeners trained on baggage only/
passenger only/cross-trained; Screener retention.

Category of information collected: TSA-wide;
Examples of information collected[A]: Federalization progress; Number
of airports complete; Machines not in use; Percent of airports using
the CAPPS II system; Average wait time at passenger screening
checkpoints for federalized airports.

Category of information collected: Sizing;
Examples of information collected[A]: Number of gates in use; Number of
checkpoints; Number of lanes; Number of ETD, x-ray, and explosive
detection systems (EDS) machines; Number of enplanements.

Category of information collected: Baggage status;
Examples of information collected[A]: EDS/ETD shortage; EDS/ETD
inoperable; Training shortage; Staffing shortage; Staff absent.

Category of information collected: Baggage metrics;
Examples of information collected[A]: Explosive materials; Drugs;
Number of bags opened; Number of screeners on duty.

Category of information collected: Attainment;
Examples of information collected[A]: Individual airport measures to
achieve change in threat level by date and time.
Source: TSA.
[A] For each of the data elements for which data are reported, the
Performance Management Information System also contains several subsets
of information. For example, the number of prohibited items includes
information on the number of weapons (by category of weapon, such as
deadly/dangerous weapon) surrendered at the checkpoint, at a gate, at a
secondary screening point, etc.
[B] TSA officials described sweep screening as a method of screening in
which screeners randomly stop passengers in the airport concourse for
additional screening.
[End of table]
[End of section]
Appendix II: GAO Contacts and Staff Acknowledgments:
GAO Contacts:
Cathy A. Berrick (202) 512-8777:
Jack Schulze (202) 512-4390:
Staff Acknowledgments:
In addition to those named above, David Alexander, Lisa Brown,
Christopher Jones, Stuart Kaufmann, Thomas Lombardi, Jan Montgomery,
Edward Stephenson, Maria Strudwick, and Susan Zimmerman were key
contributors to this report.
FOOTNOTES
[1] TSA defines an operational screening test as any covert test of a
screener, conducted by TSA, on any screener function to assess the
screener's threat item detection ability and/or adherence to TSA-
approved procedures.
[2] TIP places images of threat objects on x-ray screens during actual
operations and records whether screeners identify the threat. TIP was
designed by FAA to help screeners remain alert, train them to become
more adept at detecting harder-to-spot threat objects, and continuously
measure screener performance.
[3] TSA officials recently reported that they plan to modify PMIS to
collect data on screener performance in the future.
[4] The Homeland Security Act, signed into law on November 25, 2002,
transferred TSA to the new Department of Homeland Security.
[5] The December 31, 2002, deadline was extended to December 31, 2003,
in some cases by the Homeland Security Act.
[6] U.S. General Accounting Office, Aviation Security: Long-Standing
Problems Impair Airport Screeners' Performance, GAO/RCED-00-75
(Washington, D.C.: June 28, 2000). "Human factors" refers to the
demands a job places on the capabilities of, and the constraints it
imposes on, the individuals performing the function. Reports on the
human factors involved in checkpoint screening date back more than 20
years and include repetitive tasks screeners perform, the close and
constant monitoring required to detect threat objects, and the stress
involved in dealing with the public, who may dislike being screened or
demand faster action to avoid missing their flights.
[7] U.S. General Accounting Office, Aviation Security: Slow Progress in
Addressing Long-Standing Screener Performance Problems, GAO/
T-RCED-00-125 (Washington, D.C.: March 16, 2000).
[8] U.S. General Accounting Office, Aviation Security: Vulnerabilities
Still Exist in the Aviation Security System, GAO/T-RCED/AIMD-00-142
(Washington, D.C.: Apr. 6, 2000).
[9] See footnote 8.
[10] U.S. General Accounting Office, Aviation Security: Terrorist Acts
Demonstrate Urgent Need to Improve Security at the Nation's Airports,
GAO-01-1162T (Washington, D.C.: Sept. 20, 2001).
[11] As of September 19, 2003, we have visited the following 12
commercial airports: Baltimore-Washington International; Dallas-Ft.
Worth International; Dallas Love-Field; Kansas City International;
Little Rock National; Orlando International; Orlando Sanford; Portland
International; Seattle-Tacoma International; Tampa International;
Washington-Dulles International; and Washington Reagan National.
[12] The PMIS currently reports the breakdown of those screeners
trained for passenger and baggage screening as well as the number of
cross-trained screeners by airport.
[13] We plan to verify whether passenger screeners received basic
training as required during the remainder of our evaluation.
[14] Screening supervisors and managers may also require screeners to
participate in corrective action training based on their observations
of performance deficiencies, such as failure to follow a standard
operating procedure.
[15] We plan to verify whether identified passenger screeners received
3 hours of remedial training as required by TSA during the remainder of
our evaluation.
[16] As we did not select statistical samples of passenger screeners
and supervisors to interview, the views of those we interviewed should
not be considered representative of the views of all screeners and
supervisors at the airports we visited.
[17] U.S. General Accounting Office, Aviation Security: Vulnerabilities
in, and Alternatives for, Preboard Screening Security Operations,
GAO-01-1171T (Washington, D.C.: Sept. 25, 2001). The survey respondents
identified compensation and improved training as the highest priorities
of improving screener performance. In addition to identifying a need
for better supervision, they also believed that the implementation of
performance standards, team and image building, awards for exemplary
work, and certification of individual screeners would improve screener
performance.
[18] Statement of the Honorable Kenneth M. Mead, Inspector General,
U.S. Department of Transportation, before the National Commission on
Terrorist Attacks Upon the United States, May 22, 2003.
[19] At the May 22, 2003, hearing of the National Commission on
Terrorist Attacks Upon the United States, DOT's IG described FAA's
standard protocols for testing how well screeners performed, which used
uncluttered carry-on bags with a firearm or simulated bomb inside. The
IG said that it would be difficult for a screener to miss a test object
when undergoing such a covert test.
[20] U.S. General Accounting Office, Aviation Security: Screeners
Continue to Have Serious Problems Detecting Dangerous Objects,
GAO/RCED-00-159 (Washington, D.C.: June 2000). The tests performed by
FAA's Red Team, a special headquarters-based unit, were considered
FAA's most realistic tests because they used weapons and improvised
devices, a wider variety of bags with more clutter in them, and
headquarters-based agents who were not likely to be recognized by the
screeners.
[21] The mission support centers are located in Atlanta, Dallas,
Detroit, Philadelphia, and San Francisco.
[22] TIP is designed to test screeners' detection capabilities by
projecting threat images, including guns and explosives, into bags as
they are screened, or projecting images of bags containing threat
objects onto the x-ray screen as live baggage is screened. Screeners
are responsible for positively identifying the threat image and calling
for the bag to be searched. Once prompted, TIP identifies to the
screener whether the threat is real and then records the screener's
performance in a database that FAA could access to analyze performance
trends. TIP exposes screeners to threat images on a routine basis to
enable them to become more adept at recognizing threat objects. The
system records the screeners' responses to the projected images and
provides a measure of their performance while assisting in keeping them
alert.
[23] TSA officials said that PMIS also contains other metrics,
including human resources, sizing, checkpoint, feedback, and incidents.
[24] For example, using PMIS data, TSA determined that passengers were
unintentionally leaving money at the screening checkpoints when they
were divesting themselves of all objects that could possibly cause the
walkthrough metal detectors to alarm. In response to this finding, TSA
established a protocol instructing screeners on how to address this
issue.
[25] Three of the remaining four airport operators we interviewed said
they were not currently considering opting out of using federal
screeners. At the pilot program airport we visited, the airport
operator said that the airport plans to continue using contract
screeners.
[26] TSA's screener workforce totaled 55,600 on March 31, 2003. Due
primarily to budget constraints, the agency was directed to cut 3,000
positions, resulting in a screener workforce of 52,600 on June 1, 2003.
An additional 3,000 positions were cut, for a workforce of 49,600 full-
time equivalents on September 30, 2003, the end of the fiscal year. TSA
officials predicted that, based on the fiscal year 2004 budget, the
screener staffing level will be down to 45,000 full-time equivalents by
the end of fiscal year 2004.
GAO's Mission:
The General Accounting Office, the investigative arm of Congress,
exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability
of the federal government for the American people. GAO examines the use
of public funds; evaluates federal programs and policies; and provides
analyses, recommendations, and other assistance to help Congress make
informed oversight, policy, and funding decisions. GAO's commitment to
good government is reflected in its core values of accountability,
integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through the Internet. GAO's Web site (www.gao.gov) contains
abstracts and full-text files of current reports and testimony and an
expanding archive of older products. The Web site features a search
engine to help you locate documents using key words and phrases. You
can print these documents in their entirety, including charts and other
graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document
files. To have GAO e-mail this list to you every afternoon, go to
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order
GAO Products" heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. General Accounting Office
441 G Street NW, Room LM
Washington, D.C. 20548:
To order by Phone:
Voice: (202) 512-6000:
TDD: (202) 512-2537:
Fax: (202) 512-6061:
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Public Affairs:
Jeff Nelligan, Managing Director, NelliganJ@gao.gov, (202) 512-4800:
U.S. General Accounting Office, 441 G Street NW, Room 7149,
Washington, D.C. 20548: