This is the accessible text file for GAO report number GAO-07-634
entitled 'Aviation Security: Risk, Experience, and Customer Concerns
Drive Changes to Airline Passenger Screening Procedures, but Evaluation
and Documentation of Proposed Changes Could Be Improved' which was
released on May 7, 2007.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to Congressional Requesters:
United States Government Accountability Office:
GAO:
April 2007:
Aviation Security:
Risk, Experience, and Customer Concerns Drive Changes to Airline
Passenger Screening Procedures, but Evaluation and Documentation of
Proposed Changes Could Be Improved:
GAO-07-634:
GAO Highlights:
Highlights of GAO-07-634, a report to congressional requesters
Why GAO Did This Study:
The Transportation Security Administration's (TSA) most visible layer
of commercial aviation security is the screening of airline passengers
at airport checkpoints, where travelers and their carry-on items are
screened for explosives and other dangerous items by transportation
security officers (TSO). Several revisions made to checkpoint screening
procedures have been scrutinized and questioned by the traveling public
and Congress in recent years.
For this review, GAO evaluated
(1) TSA's decisions to modify passenger screening procedures between
April 2005 and December 2005 and in response to the alleged August 2006
liquid explosives terrorist plot, and (2) how TSA monitored TSO
compliance with passenger screening procedures. To conduct this work,
GAO reviewed TSA documents, interviewed TSA officials and aviation
security experts, and visited 25 airports of varying sizes and
locations.
What GAO Found:
Between April 2005 and December 2005, proposed modifications to
passenger checkpoint screening standard operating procedures (SOP) were
made for a variety of reasons, and while a majority of the proposed
modifications--48 of 92--were ultimately implemented at airports, TSA's
methods for evaluating and documenting them could be improved. SOP
modifications were proposed based on the professional judgment of TSA
senior-level officials and program-level staff. TSA considered the
daily experiences of airport staff, complaints and concerns raised by
the traveling public, and analysis of risks to the aviation system when
proposing SOP modifications. TSA also made efforts to balance the
impact on security, efficiency, and customer service when deciding
which proposed modifications to implement, as in the case of the SOP
changes made in response to the alleged August 2006 liquid explosives
terrorist plot. In some cases, TSA tested proposed modifications at
selected airports to help determine whether the changes would achieve
their intended purpose. However, TSA's data collection and analyses
could be improved to help TSA determine whether proposed procedures
that are operationally tested would achieve their intended purpose. For
example, TSA officials decided to allow passengers to carry small
scissors and tools onto aircraft based on their review of threat
information, which indicated that these items do not pose a high risk
to the aviation system. However, TSA did not conduct the necessary
analysis of data it collected to assess whether this screening change
would free up TSOs to focus on screening for high-risk threats, as
intended. TSA officials acknowledged the importance of evaluating
whether proposed screening procedures would achieve their intended
purpose, but cited difficulties in doing so, including time pressures
to implement needed security measures quickly. Finally, TSA's
documentation on proposed modifications to screening procedures was not
complete. TSA documented the basis--that is, the information,
experience, or event that encouraged TSA officials to propose the
modifications--for 72 of the 92 proposed modifications. In addition, TSA
documented the reasoning behind its decisions for about half (26 of 44) of
the proposed modifications that were not implemented. Without more
complete documentation, TSA may not be able to justify key
modifications to passenger screening procedures to Congress and the
traveling public. TSA monitors TSO compliance with passenger checkpoint
screening procedures through its performance accountability and
standards system and through covert testing. Compliance assessments
include quarterly observations of TSOs' ability to perform particular
screening functions in the operating environment, quarterly quizzes to
assess TSOs' knowledge of procedures, and an annual knowledge and
skills assessment. TSA uses covert tests to evaluate, in part, the
extent to which TSOs' noncompliance with procedures affects their
ability to detect simulated threat items hidden in accessible property
or concealed on a person. TSA airport officials have experienced
resource challenges in implementing these compliance monitoring
methods. TSA headquarters officials stated that they are taking steps
to address these challenges.
What GAO Recommends:
In the March 2007 report that contained sensitive security information,
GAO recommended, and the Department of Homeland Security concurred,
that TSA develop sound methods to assess whether proposed screening
changes would achieve their intended purpose and generate complete
documentation on proposed screening changes that are deemed
significant.
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-634].
To view the full product, including the scope and methodology, click on
the link above. For more information, contact Cathleen Berrick at
(202) 512-3404 or berrickc@gao.gov.
[End of section]
Contents:
Letter:
Results in Brief:
Background:
TSA Considered Risk, Experience, and Customer Concerns when Modifying
Passenger Screening Procedures, but Could Improve Its Evaluation and
Documentation of Proposed Procedures:
TSA Has Several Methods in Place to Monitor TSO Compliance with
Passenger Checkpoint Screening SOPs:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Objectives, Scope and Methodology:
Appendix II: Sources of SOP Changes:
Appendix III: Comments from the Department of Homeland Security:
Appendix IV: GAO Contact and Staff Acknowledgments:
GAO Related Products:
Tables:
Table 1: Categories of Proposed and Implemented Passenger Checkpoint
Screening Changes Considered between April 2005 and December 2005:
Table 2: Factors Considered by TSA When Deciding How to Modify
Passenger Checkpoint Screening SOPs in Response to the Alleged August
2006 Terrorist Plot to Detonate Liquid Explosives on U.S.-Bound
Aircraft:
Table 3: Proposed Procedures Operationally Tested by the Explosives
Detection Improvement Task Force, October 2005-January 2006:
Table 4: TSA Evaluation of Documentation of Agency Decisions Made
between August 7 and August 13, 2006, Regarding the Alleged Liquid
Explosives Terrorist Plot:
Table 5: Modules Included in Recertification Knowledge and Skills
Assessment:
Figures:
Figure 1: Passenger Checkpoint Screening Functions:
Figure 2: TSA Airport Screening Positions:
Abbreviations:
ATSA: Aviation and Transportation Security Act:
CAPPS: computer-assisted passenger prescreening system:
CBP: Customs and Border Protection:
COMPEX: Compliance Examination:
DHS: Department of Homeland Security:
ETD: explosive trace detection:
ETP: explosives trace portal:
FBI: Federal Bureau of Investigation:
FEMA: Federal Emergency Management Agency:
FFDO: Federal Flight Deck Officer:
FSD: Federal Security Director:
HHMD: hand-held metal detector:
IED: improvised explosive device:
PASS: Performance Accountability and Standards System:
PMIS: Performance Management Information System:
PWD: person with disabilities:
SOP: standard operating procedure:
SPOT: Screening Passengers by Observation Technique:
STEA: Screener Training Exercises and Assessments:
TIP: Threat Image Projection:
TSA: Transportation Security Administration:
TSO: transportation security officer:
USP: Unpredictable Screening Process:
WTMD: walk-through metal detector:
[End of section]
United States Government Accountability Office:
Washington, DC 20548:
April 16, 2007:
The Honorable Bennie Thompson:
Chairman,
Committee on Homeland Security:
House of Representatives:
The Honorable John Mica:
Ranking Republican Member:
Committee on Transportation and Infrastructure:
House of Representatives:
The alleged August 2006 terrorist plot to detonate liquid explosives
onboard multiple commercial aircraft bound for the United States from
the United Kingdom has highlighted the continued importance of securing
the commercial aviation system. The Transportation Security
Administration (TSA) is responsible for, among other things, securing
the nation's commercial aviation system while also facilitating the
movement of passengers. To protect this system after the September 11,
2001, terrorist attacks, TSA implemented a multilayered system of
security--the most publicly visible layer being the physical screening
of passengers and their carry-on bags at airport screening checkpoints,
which all passengers must pass through prior to entering an airport's
sterile area, the area within the terminal that provides passengers
access to boarding aircraft.[Footnote 1]
The passenger checkpoint screening system is composed of three
elements: (1) the people responsible for conducting the screening of
airline passengers and their carry-on items--Transportation Security
Officers (TSO) (formerly known as screeners), (2) the procedures TSOs
are to follow to conduct screening, and (3) the technology used in the
screening process. Collectively, these elements help to determine the
effectiveness and efficiency of passenger checkpoint screening. TSA has
made efforts to enhance each of the three elements of the passenger
checkpoint screening system.
Since its inception, TSA has issued 25 versions of the passenger
checkpoint screening standard operating procedures (SOP), to include
new screening procedures as well as changes to existing screening
procedures. Several of these revisions have been criticized and
scrutinized by the traveling public and Congress. For example, in
September 2004, TSA modified its passenger screening procedures in
response to the August 2004 midair explosions of two Russian airliners,
believed to have been caused by Chechen women who concealed explosive
devices under their clothing. Specifically, the revision entailed a
more invasive technique for patting down the torso area of passengers.
According to TSA officials, in response to complaints raised by airline
passengers and TSA's review of additional threat information, TSA
further modified the pat-down procedure in December 2004 to make it
more targeted and less intrusive. In December 2005, TSA
allowed passengers to carry small scissors and small tools onto
aircraft, prompting concern among some industry representatives that
allowing sharp objects onto planes would put flight crews at risk of
attack.[Footnote 2] This procedural change also resulted in the TSA
Assistant Secretary being asked to testify before Congress on the
agency's rationale for allowing passengers to carry small scissors and
small tools onto planes and a legislative mandate for us to assess the
impact of the prohibited items list change on public safety and
screening operations.[Footnote 3]
In light of the potential impact of changes to passenger checkpoint
screening procedures, you asked that we assess TSA's process for
determining whether and how screening procedures should be modified, to
include the creation of new screening procedures and changes to
existing screening procedures. Specifically, this report addresses the
following questions: (1) How and on what basis did TSA modify passenger
screening procedures and what factors guided the decisions to do so?
(2) How does TSA determine whether TSOs are complying with the standard
procedures for screening passengers and their carry-on items? In March
2007, we issued a report that contained sensitive security information
regarding TSA's passenger checkpoint screening procedures, including
the factors TSA considered when modifying passenger screening
procedures and TSA's efforts to monitor TSO compliance with standard
passenger screening procedures. This report provides the results of our
March 2007 report with sensitive security information removed.
To obtain information on the process used to modify passenger
checkpoint screening procedures, we reviewed and analyzed available TSA
documentation on proposed procedure modifications that were considered
between April 2005 and December 2005, as well as threat assessments and
operational studies that supported SOP modifications.[Footnote 4] We
also reviewed and analyzed similar documentation for proposed
modifications considered between August 2006 and November 2006 in
response to the alleged terrorist plot to detonate liquid explosives
onboard multiple aircraft en route from the United Kingdom to the
United States. We included modifications to passenger checkpoint
screening procedures related to this particular event because they
provided the most recent information available of TSA's approach to
modifying screening procedures in response to an immediate perceived
threat to civil aviation. To assess TSO compliance with standard
operating procedures, our work also involved a review of available
documentation, including guidance, checklists, and other evaluation
tools used by TSA. In addition, we met with TSA headquarters officials
who were involved in the process of determining whether proposed
changes to passenger checkpoint screening procedures should be
implemented, and who were responsible for overseeing efforts to monitor
TSO compliance with screening procedures. We also visited or conducted
phone interviews with staff at 25 airports, which we selected based on
variation in size, geographic location, and level of performance on
compliance-related assessments. At each airport, we interviewed Federal
Security Directors (FSD),[Footnote 5] members of their management
teams, and TSOs with passenger screening responsibilities. Information
from these interviews cannot be generalized to all airports nationwide.
Two of the airports we visited were also participants in TSA's
Screening Partnership Program.[Footnote 6] We also met with officials
from the Department of Homeland Security (DHS) Science and Technology
Directorate as well as the Federal Bureau of Investigation (FBI) to
discuss the methodology and results of their liquid explosives tests,
which were used to support TSA's decisions to modify the SOP in
September 2006. We also interviewed five experts in the field of
aviation security to obtain their perspectives on TSA's approach for
deciding whether to implement proposed checkpoint screening
procedures.[Footnote 7] We compared TSA's approach for implementing and
revising passenger checkpoint screening procedures, and for monitoring
TSO compliance, with the Comptroller General's standards for internal
control in the federal government[Footnote 8] and with risk management
guidance. We assessed the reliability of the data we acquired from TSA
regarding TSO compliance and found the data to be sufficiently reliable
for our purposes.
We conducted our work from March 2005 through January 2007 in
accordance with generally accepted government auditing standards. More
details about the scope and methodology of our work are presented in
appendix I.
Results in Brief:
During our 9-month review period, proposed modifications to passenger
checkpoint screening procedures were made in various ways and for a
variety of reasons, and while a majority of the proposed modifications-
-48 of 92--were ultimately implemented at airports, TSA's methods for
evaluating and documenting them could be improved. Some SOP
modifications were proposed based on the professional judgment of TSA
senior-level officials and program-level staff at headquarters and at
airports nationwide, while other modifications were proposed by members
of a TSA task force charged with enhancing TSA's ability to detect
improvised explosive devices at checkpoints. TSA officials proposed SOP
modifications based on risk information (threat and vulnerability
information), daily experiences of staff working at airports, and
complaints and concerns raised by the traveling public. In addition to
these factors, TSA senior leadership made efforts to balance the impact
that proposed SOP modifications--such as the changes to the liquids,
gels, and aerosols screening procedures--would have on security,
efficiency, and customer service when deciding whether proposed SOP
modifications should be implemented. In some cases, TSA first tested
proposed modifications to screening procedures at selected airports to
help determine whether the changes would achieve their intended
purpose, such as to enhance detection of prohibited items or to free up
TSO resources to perform screening activities focused on threats
considered to pose a high risk, such as explosives. However, TSA's data
collection and analyses could be improved to help TSA determine whether
proposed procedures that are operationally tested would achieve their
intended purpose. Specifically, for the seven tests of proposed
screening procedures TSA conducted during our review period, although
TSA collected some data on the efficiency of and customer response to
the procedures at selected airports, the agency generally did not
collect the type of data or conduct the necessary analysis that would
yield information on whether proposed procedures would achieve their
intended purpose. TSA officials acknowledged that they could have made
some improvements in the various analyses they conducted related to the
decision to allow small scissors and tools onboard aircraft, but cited
several difficulties in doing so. Nevertheless, until TSA takes steps
to improve its ability to evaluate the potential impact of screening
changes on security and TSO resource availability, it may be difficult
for TSA to determine how best to allocate limited TSO resources, help
ensure the screeners' ability to detect explosives and other high-
threat objects, and evaluate whether proposed modifications to
screening procedures would have the intended effect. Finally, TSA's
documentation on proposed modifications to screening procedures was not
always complete. TSA documented the basis--that is, the information,
experience, or event that encouraged TSA officials to propose the
modifications--for 72 of the 92 proposed modifications. In addition,
TSA only documented the reasoning behind its decisions for about half
(26 of 44) of the proposed modifications that were not implemented. Our
standards for governmental internal controls and associated guidance
suggest that agencies should document key decisions in a way that is
complete and accurate. Without such information, TSA cannot always
justify significant SOP modifications to Congress and the traveling
public. TSA officials acknowledged that it is beneficial to maintain
documentation on the reasoning behind decisions to implement or reject
proposed SOP modifications deemed significant, particularly given the
organizational restructuring and staff turnover within TSA.
TSA monitors TSO compliance with passenger checkpoint screening SOPs
through its performance accountability and standards system and through
local and national covert testing.[Footnote 9] According to TSA
officials, the agency developed the performance accountability and
standards system in response to our 2003 report that recommended that
TSA establish a performance management system that makes meaningful
distinctions in employee performance,[Footnote 10] and in response to
input from TSA airport staff on how to improve passenger and checked
baggage screening measures. This system is used by TSA to measure TSO
compliance with passenger checkpoint screening procedures. Of the 24
FSDs we interviewed about compliance assessments, 9 cited difficulties
in implementing the performance accountability and standards system
because of a lack of available staff to conduct observations and
administer SOP quizzes. When asked whether they planned to address
FSDs' concerns regarding a lack of available staff to evaluate TSO
compliance with SOPs, TSA headquarters officials said that they have
automated many of the data entry functions of the performance
accountability and standards system to relieve the field of the burden
of manually entering this information into the online system.
Furthermore, the TSA Assistant Secretary stated that FSDs were given
the option of delaying implementation of the performance accountability
and standards system if they were experiencing resource challenges. In
addition to implementing the performance accountability and standards
system, TSA conducts local and national covert tests to evaluate, in
part, the extent to which TSOs' noncompliance with the SOPs affects
their ability to detect simulated threat items hidden in accessible
property or concealed on a person.[Footnote 11] Even though all 24 FSDs
said that they have conducted local covert tests, 10 of these FSDs said
that lack of available staff made it difficult to conduct these tests.
TSA officials told us that they are considering resource alternatives
for implementing these tests, but did not provide us with specific
details of these plans. Based on the results of national covert tests
conducted between September 2005 and July 2006, which showed that some
TSOs did not identify threat objects, in part because they did not
comply with SOPs, TSA's Office of Inspection recommended, among other
things, that the Office of Security Operations ensure that TSOs adhere
to the current passenger checkpoint screening SOPs. However, until the
resource limitations that have restricted TSA's use of its compliance
monitoring tools have been fully addressed, TSA may not have assurance
that TSOs are screening passengers according to standard procedures.
To help improve TSA's ability to evaluate proposed SOP modifications
and to justify its decisions regarding whether proposed SOP
modifications should be implemented, in the March 2007 report that
contained sensitive security information, we recommended that the
Secretary of the Department of Homeland Security direct the Assistant
Secretary of Homeland Security for TSA to (1) develop sound evaluation
methods, when possible, that can help TSA determine whether proposed
procedures that are operationally tested would achieve their intended
purpose, and (2) generate and maintain complete documentation of
proposed modifications deemed significant by TSA. DHS generally
concurred with our findings and recommendations and outlined actions
TSA plans to take to implement the recommendations. For example, TSA
intends to improve its methods for evaluating proposed SOP
modifications, which may entail randomly selecting the airports that
will participate in a study to better isolate the impact of proposed
SOP modifications on passenger screening. DHS also stated that TSA is
in the process of developing protocols that will require documentation
of the source and intent of proposed SOP modifications, as well as
documentation of TSA officials' reasoning for implementing or rejecting
proposed modifications. The full text of DHS's comments is included in
appendix III.
Background:
Passenger Checkpoint Screening System:
Passenger screening is a process by which personnel authorized by TSA
inspect individuals and property to deter and prevent the carriage of
any unauthorized explosive, incendiary, weapon, or other dangerous item
onboard an aircraft or into a sterile area.[Footnote 12] Passenger
screening personnel must inspect individuals for prohibited items at
designated screening locations.[Footnote 13] As shown in figure 1, the
four passenger screening functions are:
* X-ray screening of property,
* walk-through metal detector screening of individuals,
* hand-wand or pat-down screening of individuals, and:
* physical search of property and trace detection for explosives.
Typically, passengers are only subjected to X-ray screening of their
carry-on items and screening by the walk-through metal detector.
Passengers whose carry-on baggage alarms the X-ray machine, who alarm
the walk-through metal detector, or who are designated as selectees--
that is, passengers selected by the Computer-Assisted Passenger
Prescreening System (CAPPS[Footnote 14]) or other TSA-approved
processes to receive additional screening--are screened by hand-wand or
pat-down and have their carry-on items screened for explosives traces
or physically searched.
Figure 1: Passenger Checkpoint Screening Functions:
[See PDF for image]
Source: GAO and Nova Development Corporation.
Note: Explosive trace detection (ETD) works by detecting vapors and
residues of explosives. Human operators collect samples by rubbing
swabs along the interior and exterior of an object that TSOs determine
to be suspicious, and place the swabs in the ETD machine, which then
chemically analyzes the swab to identify any traces of explosive
materials.
Bomb Appraisal Officers (BAO) are available to respond to unresolved
alarms at the checkpoint that involve possible explosive devices. The
BAO may contact appropriate law enforcement or bomb squad officials if
review indicates possible or imminent danger, in which case the BAO
ensures that the security checkpoint is cleared. The BAO approves
reopening of security lane(s) if no threat is posed.
[A] Behavior detection officers (BDO) are TSOs specially trained to detect suspicious behavior in
individuals approaching the checkpoint. Should the BDO observe such
behavior, he or she may refer the individual for additional screening
or to a law enforcement officer.
[B] The hand-wand or pat-down is conducted if a passenger is identified
or randomly selected for additional screening because he or she met
certain criteria or alarmed the walk-through metal detector.
[C] Manual or ETD searches of accessible property occur if the
passenger is identified or randomly selected for additional screening
or if the screener identified a potential prohibited item on X-ray.
[End of figure]
The passenger checkpoint screening system is composed of three
elements: the people responsible for conducting the screening of
airline passengers and their carry-on items--TSOs, the technology used
during the screening process, and the procedures TSOs are to follow to
conduct screening. Collectively, these elements help to determine the
effectiveness and efficiency of passenger checkpoint screening.
Transportation Security Officers:
TSOs screen all passengers and their carry-on baggage prior to allowing
passengers access to their departure gates. There are several positions
within TSA that perform and directly supervise passenger screening
functions. Figure 2 provides a description of these positions.
Figure 2: TSA Airport Screening Positions:
[See PDF for image]
Source: GAO analysis of TSA data.
[A] Number of annualized TSA screening positions for fiscal year 2006.
These positions do not include private screener positions at the six
airports that participated in the Screening Partnership Program during
fiscal year 2006.
[End of figure]
In May 2005, we reported on TSA's efforts to train TSOs and to measure
and enhance TSO performance.[Footnote 15] We found that TSA had
initiated a number of actions designed to enhance passenger TSO,
checked baggage TSO, and supervisory TSO training. However, at some
airports TSOs encountered difficulty accessing and completing recurrent
(refresher) training because of technological and staffing constraints.
We also found that TSA lacked adequate internal controls to provide
reasonable assurance that TSOs were receiving legislatively mandated
basic and remedial training, and to monitor the status of its recurrent
training program. Further, we reported that TSA had implemented and
strengthened efforts to collect TSO performance data as part of its
overall effort to enhance TSO performance. We recommended that TSA
develop a plan for completing the deployment of high-speed Internet/
intranet connectivity to all TSA airport training facilities, and
establish appropriate responsibilities and other internal controls for
monitoring and documenting TSO compliance with training requirements.
DHS generally concurred with our recommendations and stated that TSA
has taken steps to implement them.
Screening Technology:
There are typically four types of technology used to screen airline
passengers and their carry-on baggage at the checkpoint:
* walk-through metal detectors,
* X-ray machines,
* hand-held metal detectors, and:
* explosive trace detection (ETD) equipment.
The President's fiscal year 2007 budget request noted that emerging
checkpoint technology will enhance the detection of prohibited items,
especially firearms and explosives, on passengers. As of December 2006,
TSA plans to conduct operational tests of three types of passenger
screening technologies within the next year. TSA has conducted other
tests in the past; for example, during fiscal year 2005, TSA
operationally tested document scanners, which use explosive trace
detection technology to detect explosives residue on passengers'
boarding passes or identification cards. TSA decided not to expand the
use of the document scanner, in part because of the extent to which
explosives traces had to be sampled manually. TSA also plans to begin
operational tests of technology that would screen bottles for liquid
explosives. We are currently evaluating the Department of Homeland
Security's and TSA's progress in planning for, managing, and deploying
research and development programs in support of airport checkpoint
screening operations. We expect to report our results in August 2007.
Standard Operating Procedures:
TSA has developed checkpoint screening standard operating procedures,
which are the focus of this report, that establish the process and
standards by which TSOs are to screen passengers and their carry-on
items at screening checkpoints.[Footnote 16] Between April 2005 and
December 2005, based on available documentation, TSA deliberated 189
proposed changes to passenger checkpoint screening SOPs, 92 of which
were intended to modify the way in which passengers and their carry-on
items are screened.[Footnote 17] TSA issued six versions of the
passenger checkpoint screening SOPs during this period.[Footnote 18]
TSA modified passenger checkpoint screening SOPs to enhance the
traveling public's perception of the screening process, improve the
efficiency of the screening process, and enhance the detection of
prohibited items and suspicious persons. As shown in table 1, 48 of the
92 proposed modifications to passenger checkpoint screening SOPs were
implemented, and the types of modifications made or proposed generally
fell into one of three categories--customer satisfaction, screening
efficiency, and security.
Table 1: Categories of Proposed and Implemented Passenger Checkpoint
Screening Changes Considered between April 2005 and December 2005:
Category of proposed changes: Customer satisfaction;
Description of category: Changes that will improve the traveling
public's perception of the screening process or reduce or exempt
categories of authorized individuals from certain aspects of the
screening process;
Proposed SOP changes: 42;
Implemented SOP changes: 22.
Category of proposed changes: Screening efficiency;
Description of category: Changes that will improve screening flow,
clarify screener duties, update equipment procedures, or enhance the
working environment of screening locations;
Proposed SOP changes: 31;
Implemented SOP changes: 17.
Category of proposed changes: Security;
Description of category: Changes that will improve TSA's ability to
detect prohibited items and suspicious persons;
Proposed SOP changes: 19;
Implemented SOP changes: 9.
Category of proposed changes: Total;
Description of category: [Empty];
Proposed SOP changes: 92;
Implemented SOP changes: 48.
Source: GAO analysis of TSA data.
[End of table]
TSA Considered Risk, Experience, and Customer Concerns when Modifying
Passenger Screening Procedures, but Could Improve Its Evaluation and
Documentation of Proposed Procedures:
TSA used various processes between April 2005 and December 2005 to
modify passenger checkpoint screening SOPs, and a variety of factors
guided TSA's decisions to modify SOPs. TSA's processes for modifying
SOPs generally involved TSA staff recommending proposed modifications,
reviewing and commenting on proposed modifications, and TSA senior
leadership making final decisions as to whether proposed modifications
should be implemented. During our 9-month review period, TSA officials
considered 92 proposed modifications to the way in which passengers and
their carry-on items were screened, and 48 were implemented.[Footnote
19] TSA officials proposed SOP modifications based on risk factors
(threat and vulnerability information), day-to-day experiences of
airport staff, and concerns and complaints raised by passengers. TSA
then made efforts to balance security, efficiency, and customer service
when deciding which proposed SOP modifications to implement. Consistent
with our prior work that has shown the importance of data collection
and analyses to support agency decision making, TSA conducted data
collection and analysis for certain proposed SOP modifications that
were tested before they were implemented at all airports. Nevertheless,
we found that TSA could improve its data collection and analysis to
assist the agency in determining whether the proposed procedures would
enhance detection or free up TSO resources, when intended. In addition,
TSA did not maintain complete documentation of proposed SOP
modifications; therefore, we could not fully assess the basis for
proposed SOP modifications or the reasons why certain proposed
modifications were not implemented. TSA officials acknowledged that it
is beneficial to maintain documentation on the reasoning behind
decisions to implement or reject SOP modifications deemed significant.
TSA's Processes for Modifying SOPs Were Driven by Input from TSA Field
and Headquarters Staff:
Proposed SOP modifications were submitted and reviewed under two
processes during our 9-month review period, and for each process, TSA
senior leadership made the final decision as to whether the proposed
modifications would be implemented. One of the processes TSA used to
modify passenger checkpoint screening SOPs involved TSA field staff or
headquarters officials, and, to a lesser extent, TSA senior leadership,
suggesting ways in which passenger checkpoint screening SOPs could be
modified. These suggestions were submitted through various mechanisms,
including electronic mail and an SOP panel review conducted by TSA
airport personnel. (These methods are described in more detail in app.
II.) Eighty-two of the 92 proposed modifications were considered under
this process.
If TSA officials determined, based on their professional judgment, that
the recommended SOP modifications--whether from headquarters or the
field--merited further consideration, or if a specific modification was
proposed by TSA senior leadership, the following chain of events
occurred:
* First, the procedures branch of the Office of Security Operations
drafted SOP language for each of the proposed modifications.[Footnote
20]
* Second, the draft language for each proposed modification was
disseminated to representatives of various TSA divisions for review,
and the language was revised as needed.
* Third, TSA officials tested proposed modifications in the airport
operating environment if they found it necessary to:
- assess the security impact of the proposed modification,
- evaluate the impact of the modification on the amount of time taken
for passengers to clear the checkpoint,
- measure the impact of the proposed modification on passengers and
industry partners, or:
- determine training needs created by the proposed modification.
* Fourth, the revised SOP language for proposed modifications was sent
to the heads of several TSA divisions for comment.
* Fifth, considering the comments of the TSA division heads, the head
of the Office of Security Operations or other TSA senior leadership
made the final decision as to whether proposed modifications would be
implemented.
Another process for modifying passenger checkpoint screening SOPs
during our 9-month review period was carried out by TSA's Explosives
Detection Improvement Task Force. The task force was established in
October 2005 by the TSA Assistant Secretary to respond to the threat of
improvised explosive devices (IED) being carried through the
checkpoint. The goal of the task force was to apply a risk-based
approach to screening passengers and their baggage in order to enhance
TSA's ability to detect IEDs.[Footnote 21] The task force developed 13
of the 92 proposed SOP modifications that were considered by TSA
between April 2005 and December 2005.[Footnote 22] The task force
solicited and incorporated feedback from representatives of various TSA
divisions on these proposed modifications and presented them to TSA
senior leadership for review and approval. TSA senior leadership
decided that 8 of the 13 proposed modifications should be operationally
tested--that is, temporarily implemented in the airport environment for
the purposes of data collection and evaluation--to better inform
decisions regarding whether the proposed modifications should be
implemented. Following the testing of these proposed modifications in
the airport environment, TSA senior leadership decided to implement 7
of the 8 operationally tested changes.[Footnote 23] (The task force's
approach to testing these procedures is discussed in more detail
below.) Following our 9-month period of review, the changes that TSA
made to its passenger checkpoint screening SOPs in response to the
alleged August 2006 liquid explosives terrorist plot were decided upon by
DHS and TSA senior leadership, with some input from TSA field staff,
aviation industry representatives, and officials from other federal
agencies.
Risk Factors, Day-to-Day Experiences, and Customer Concerns Were the
Basis for Proposed SOP Modifications:
Based on available documentation,[Footnote 24] risk factors (i.e.,
threats to commercial aviation and vulnerability to those threats), day-
to-day experiences of airport staff, and complaints and concerns raised
by passengers were the basis for TSA staff and officials proposing
modifications to passenger checkpoint screening SOPs.
Fourteen of the 92 procedure modifications recommended by TSA staff and
officials were based on reported or perceived threats to commercial
aviation, and existing vulnerabilities to those threats. For example,
the Explosives Detection Improvement Task Force proposed SOP
modifications based on threat reports developed by TSA's Intelligence
and Analysis division. Specifically, in an August 2005 civil aviation
threat assessment, the division reported that terrorists are likely to
seek novel ways to evade U.S. airport security screening.[Footnote 25]
Subsequently, the task force proposed that the pat-down procedure
performed on passengers selected for additional screening be revised to
include not only the torso area, which is what the previous pat-down
procedure entailed, but additional areas of the body such as the
legs.[Footnote 26] The August 2005 threat assessment also stated that
terrorists may attempt to carry separate components of an IED through
the checkpoint, then assemble the components while onboard the
aircraft. To address this threat, the task force proposed a new
procedure to enhance TSOs' ability to search for components of
improvised explosive devices. According to TSA officials, threat
reports have also indicated that terrorists rely on the routine nature
of security measures in order to plan their attacks. To address this
threat, the task force proposed a procedure that incorporated
unpredictability into the screening process by requiring designated
TSOs to randomly select passengers to receive additional search
procedures. Following our 9-month review period, TSA continued to use
threat information as the basis for proposed modifications to passenger
checkpoint screening SOPs. In August 2006, TSA proposed modifications
to passenger checkpoint screening SOPs after receiving threat
information regarding an alleged terrorist plot to detonate liquid
explosives onboard multiple aircraft en route from the United Kingdom
to the United States. Regarding vulnerabilities to reported threats,
based on the results of TSA's own covert tests (undercover, unannounced
tests), TSA's Office of Inspection recommended[Footnote 27] SOP
modifications to enhance the detection of explosives at the passenger
screening checkpoint.[Footnote 28]
TSA officials also proposed modifications to passenger checkpoint
screening SOPs based on their professional judgment regarding perceived
threats to aviation security. For example, an FSD recommended changes
to the screening of funeral urns based on a perceived threat. In some
cases, proposed SOP modifications appeared to reflect threat
information analyzed by TSA officials. For example, TSOs are provided
with Threat in the Spotlight, a weekly report that identifies new
threats to commercial aviation, examples of innovative ways in which
passengers may conceal prohibited items, and pictures of items that may
not appear to be prohibited items but actually are. TSOs are also
provided relevant threat information during briefings that take place
before and after their shifts. In addition, FSDs are provided
classified intelligence summaries on a daily and weekly basis, as well
as monthly reports of suspicious incidents that occurred at airports
nationwide. TSA's consideration of threat and vulnerability--through
analysis of current documentation and by exercising professional
judgment--is consistent with a risk-based decision-making
approach.[Footnote 29] As we have reported previously, and DHS and TSA
have advocated, a risk-based approach, as applied in the homeland
security context, can help to more effectively and efficiently prepare
defenses against acts of terrorism and other threats.
TSA headquarters and field staff also based proposed SOP modifications-
-specifically, 36 of the 92 proposed modifications--on experience in
the airport environment. For example, TSA headquarters officials
conduct reviews at airports to identify best practices and deficiencies
in the checkpoint screening process. During one of these reviews,
headquarters officials observed that TSOs were not fully complying with
the pat-down procedure. After discussions with TSOs, TSA headquarters
officials determined that the way in which TSOs were conducting the
procedure was more effective. In addition, TSA senior leadership, after
learning that small airports had staffing challenges that precluded
them from ensuring that passengers are patted down by TSOs of the same
gender, proposed that opposite-gender pat-down screening be allowed at
small airports.
Passenger complaints and concerns shared with TSA also served as a
basis for proposed modifications during our 9-month review period.
Specifically, of the 92 proposed SOP modifications considered during
this period, TSA staff and officials recommended 29 modifications based
on complaints and concerns raised by passengers. For example, TSA
headquarters staff recommended allowing passengers to hold their hair
while being screened by the Explosives Trace Portal,[Footnote 30] after
receiving complaints from passengers about eye injuries from hair
blowing in their eyes and hair being caught in the doors of the portal.
TSA Balanced Security, Efficiency, and Customer Service when Deciding
whether to Implement Proposed SOP Modifications:
When deciding whether to implement proposed SOP modifications, TSA
officials also made efforts to balance the impact of proposed
modifications on security, efficiency,[Footnote 31] and customer
service. TSA's consideration of these factors reflects the agency's
mission to protect transportation systems while also ensuring the free
movement of people and commerce. As previously discussed, TSA sought to
improve the security of the commercial aviation system by modifying the
SOP for conducting the pat-down search. (TSA identified the modified
pat-down procedure as the "bulk-item" pat-down.) When deciding whether
to implement the proposed modification, TSA officials considered not
only the impact that the bulk-item pat-down procedure would have on
security, but also the impact that the procedure would have on
screening efficiency and customer service. For example, TSA officials
determined that the bulk-item pat-down procedure would not
significantly affect efficiency because it would only add a few seconds
to the screening process. Following our 9-month review period, TSA
continued to make efforts to balance security, efficiency, and customer
service when deciding whether to implement proposed SOP modifications,
as illustrated by TSA senior leadership's deliberation on proposed SOP
modifications in response to the alleged August 2006 liquid explosives
terrorist plot. TSA modified the passenger checkpoint screening SOP
four times between August 2006 and November 2006 in an effort to defend
against the threat of terrorists' use of liquid explosives onboard
commercial aircraft.[Footnote 32] While the basis for these
modifications was to mitigate risk, as shown in table 2, TSA senior
leadership considered several other factors when deciding whether to
implement the modifications.
Table 2: Factors Considered by TSA When Deciding How to Modify
Passenger Checkpoint Screening SOPs in Response to the Alleged August
2006 Terrorist Plot to Detonate Liquid Explosives on U.S.-Bound
Aircraft:
Procedures: August 10, 2006: Total ban on liquids and gels in
accessible property or onboard aircraft. Exceptions:
* baby formula/ milk if infant is traveling;
* prescription medication with name matching passenger's ticket;
* insulin and other essential nonprescription medications;
* liquids and gels carried by passengers with disabilities, after
screening for explosive materials, with Supervisory TSO/Screening
Manager concurrence;
* supplies brought into retail area by approved vendors for restocking
of retail operations.
Passengers required to remove shoes at checkpoints for X-ray screening;
Impact on security: Benefits;
* Terrorists less likely to successfully carry liquid explosives onto
aircraft using container;
* Terrorists less likely to successfully carry liquid explosives onto
aircraft in shoes (e.g., gel-based insoles).
Drawbacks;
* None identified;
Impact on efficiency of screening process: Benefits;
* Requiring passengers to remove footwear will speed the screening
process by reducing the need to ETD and physically inspect footwear.
Footwear now only needs to be subjected to physical search if something
suspicious appears on the X-ray of the shoes;
Drawbacks;
* Total ban on liquids and gels may be unsustainable for long term
because more passengers would check their baggage rather than carry it
on, which would cause a strain on the checked baggage screening system;
Impact on customer service: Benefits;
* Exceptions allow passengers with legitimate medical and other needs
to bring essential liquids onboard aircraft;
* Passengers less confused about whether to remove shoes for X-ray
screening;
Drawbacks;
* Inconvenient for passengers to not be able to carry toiletries and
similar liquids and gels onto planes;
Other considerations:
* Threat was a specific type of liquid explosive;
* There was no checkpoint screening technology available for deployment
that could detect the specific liquid explosive.
Procedures: August 12, 2006[A]: Aerosols prohibited. Following
additional items allowed past checkpoints:
* baby food in small containers, if baby/small child is traveling;
* essential nonprescription medications (e.g., contact lens saline
solution, eye care products), not to exceed 4 fluid ounces per
container;
* liquids and gels for diabetic passengers, no greater than 8 fluid
ounces per container;
* gels, saline solutions, and other liquids used to augment portions of
body for medical/cosmetic reasons;
* life support/life sustaining liquids (e.g., bone marrow and blood
products);
Impact on security: Benefits;
* Terrorists less likely to successfully carry liquid explosives onto
aircraft using container;
* Terrorists less likely to successfully carry liquid explosives onto
aircraft in shoes (e.g., gel-based insoles).
Drawbacks;
* None identified;
Impact on efficiency of screening process: Benefits;
* Requiring passengers to remove footwear will speed the screening
process by reducing the need to ETD and physically inspect footwear.
Footwear now only needs to be subjected to physical search if something
suspicious appears on the X-ray of the shoes;
Drawbacks;
* Total ban on liquids and gels may be unsustainable for long term
because more passengers would check their baggage rather than carry it
on, which would cause a strain to the checked baggage screening system;
Impact on customer service: Benefits;
* Clarified for TSOs types and amounts of liquids and gels exempt from
ban;
* Created smoother process at checkpoint, minimizing impact upon
travelers;
* Gave diabetic passengers access to essential liquids;
* Lifted prohibition against critical life saving fluids;
Drawbacks;
* None identified;
Other considerations:
* Feedback from TSA field staff and industry representatives regarding
exemptions associated with liquids, gels, and aerosols restrictions and
specific information on the quantities of certain types of liquids,
gels, and aerosols that should be exempted from the restrictions;
* Additional information was obtained about the alleged terrorist plot,
including information from the United Kingdom and U.S. intelligence
communities and discussions with explosives experts.
Procedures: September 26, 2006: Liquids, gels, aerosols (not on
prohibited items list or considered hazardous materials) permitted in
accessible property in 3-fluid-ounce bottles that fit comfortably in one
quart-size, clear plastic, zip-top bag per passenger. Plastic bags
screened by X-ray. Items purchased in sterile area of airports
permitted onboard aircraft.
Items allowed past checkpoints in amounts larger than 3 fluid ounces;
must be declared and cleared by TSO:
* baby formula/milk/food in small containers if baby/small child is
traveling;
* medications (liquid, gel, aerosol);
* liquids and gels for passengers indicating need to address diabetic
or other medical condition;
TSOs conducting declaration process positioned ahead of checkpoint to
assess liquids, gels, and aerosols to determine reasonable quantity for
passenger's itinerary, and to advise passengers on procedures related
to liquids, gels and aerosols that are either prohibited (requiring
disposal or abandonment of items) or permitted but outside of plastic
bag (TSO marks boarding pass or travel document to indicate items).
Items newly permitted past checkpoints in any amount:
* liquid, gel, and aerosol cleaning supplies required by airport
employees servicing sterile area;
* gels and frozen liquids required to cool any other items permitted
past checkpoints, provided no unresolved suspicious items or
activities.
Random ETD sampling of plastic bags, containers within plastic bags,
and other containers holding liquids, gels, and aerosols;
Impact on security: Benefits;
* Plastic bags present deterrent and operational complexities for
terrorists--attempts to combine liquids increase probability of
detection;
* Requirement to remove and submit plastic bags for X-ray screening
serves as deterrent to terrorists, and provides TSOs opportunity to
view and examine all liquids, gels, and aerosols;
* Plastic bags hinder terrorists from carrying large enough amounts of
liquid explosives that could potentially cause catastrophic damage to
an aircraft;
* Declaration process thought to deter terrorists from attempting to
carry liquid explosives onboard aircraft;
* Random ETD sampling enables TSOs to determine whether the small
amounts of liquids and gels being carried through the checkpoint are,
in fact, explosives. This procedure may also deter terrorists from
attempting to carry liquid explosives onboard aircraft;
Drawbacks;
* Possibility that terrorists could combine liquids in small bottles to
generate an amount large enough to potentially cause catastrophic
damage to an aircraft;
* The additional drawbacks related to the impact
on security are sensitive security information. Therefore, we do not
discuss those drawbacks in this report;
Impact on efficiency of screening process: Benefits;
* Enables TSOs to focus resources on detecting explosives, rather than
small amounts of liquids and gels that do not represent serious threat;
* Checked baggage screening expected to return to sustainable levels;
* Requirement to remove and submit plastic bags for X-ray screening
encourages passengers to reduce clutter in bags, making it easier for
TSOs to screen for prohibited and threat items.
Drawbacks;
* Increase in number of items X-rayed per passenger, which may slow
down screening process;
Impact on customer service: Benefits;
* Procedures easily learned by public and TSOs;
* Accommodates many passengers with legitimate needs for small
quantities of liquids during flights.
Drawbacks;
* Possible negative public reaction to passengers having to provide
their own plastic bags;
Other considerations:
* The results of liquid explosives tests conducted by DHS and the FBI.
The results of these tests are sensitive security information and are
not discussed in this report;
* TSA gathered data to test its assumption regarding sustainability of
the total ban on liquids, gels, and aerosols and found that following
the total ban, there was approximately a 27 percent increase in the
number of bags checked per passenger.
Procedures: November 21, 2006: Same as the procedures implemented on
September 26, 2006, with the exception of the following:
Liquids, gels, and aerosols allowed in 3.4-fluid-ounce (100-
milliliter) "travel size" bottles.
Declaration process eliminated; TSA employee ahead of checkpoint offers
public advisements and assessments on procedures; (Additional
modifications were made to the liquids, gels, and aerosols screening
procedures. However, these additional modifications are sensitive
security information. Therefore, we do not discuss these modifications
in this report.);
Impact on security: Benefits;
* No additional security benefits identified;
Drawbacks;
* No additional drawbacks to security identified;
Impact on efficiency of screening process: Benefits;
* Elimination of the declaration process will reduce unnecessary
redundancy in the examination of exempted liquids and gels, which
previously occurred both prior to and following X-ray screening; (TSA
identified additional efficiency benefits of this modification to the
liquids, gels, and aerosols screening procedures. These additional
benefits are sensitive security information. Therefore, we do not
discuss these benefits in this report.);
Drawbacks;
* No additional efficiency drawbacks identified;
Impact on customer service: Benefits;
* Allowing for risk-based discretion on the part of Supervisory TSOs
enhances customer service for passengers who have legitimate reasons
for carrying liquids, gels, or aerosols onboard planes.
Drawbacks;
* No additional customer service drawbacks identified;
Other considerations:
* The European Union allowed passengers to carry liquids, gels, and
aerosols in travel sized containers up to 100 milliliters,
approximately 3.4 fluid ounces;
* The results of liquid explosive testing conducted by FBI and DHS;
* TSA recognized that no procedure could be written to address every
possible scenario involving liquids, gels, and aerosols. Therefore, TSA
enabled Supervisory TSOs to use their discretion, while also
considering security risks.
Source: GAO analysis of TSA documentation:
[A] The August 12, 2006, SOP change incorporates clarifications
implemented on August 16, 2006.
[End of table]
As TSA senior leadership obtained more information about the particular
threat posed by the liquid explosives through tests conducted by DHS's
Science and Technology Directorate and FBI, TSA relaxed the
restrictions to allow passengers to carry liquids, gels, and aerosols
onboard aircraft in 3-fluid-ounce bottles--and as of November 2006, 3.4-
fluid-ounce bottles--that would easily fit in a quart-sized, clear
plastic, zip-top bag. TSA senior leadership identified both benefits
and drawbacks to this SOP modification, but determined that the balance
of security, efficiency, and customer service that would result from
these SOP changes was appropriate. As shown in table 2, TSA officials
recognize that there are security drawbacks--or vulnerabilities--
associated with allowing passengers to carry even small amounts of
liquids and gels onboard aircraft. For example, two or more terrorists
could combine small amounts of liquid explosives after they pass
through the checkpoint to generate an amount large enough to possibly
cause catastrophic damage to an aircraft. However, TSA officials stated
that doing so would be logistically challenging given the physical harm
that the specific explosives could cause to the person handling them,
and that suspicion among travelers, law enforcement officials, and
airport employees would likely be raised if an individual was seen
combining the liquid contents of small containers stored in two or more
quart-sized plastic bags. TSA officials stated that at the time of the
modifications to the liquid, gels, and aerosols screening procedures,
there was consensus among explosives detection experts, both
domestically and abroad, regarding TSA's assumptions about how the
explosives could be used and the damage they could cause to an
aircraft.[Footnote 33] TSA officials also stated that after reviewing
the intelligence information related to the alleged August 2006 London
terror plot--particularly with regard to the capability and intent of
the terrorists--TSA determined that allowing small amounts of liquids,
gels, and aerosols onboard aircraft posed an acceptable level of risk
to the commercial aviation system.[Footnote 34] Moreover, TSA officials
acknowledged that there are vulnerabilities with allowing passengers to
carry liquids that are exempted from the 3.4-fluid-ounce limit--such as
baby formula and medication--onboard aircraft.
TSA officials stated that the enhancements TSA is making to the various
other layers of aviation security will help address the security
vulnerabilities identified above. For example, TSA has increased
explosives detection canine patrols, deployed Federal Air Marshals on
additional international flights, increased random screening of
passengers at boarding gates, and increased random screening of airport
and TSA employees who pass through the checkpoint. TSA also plans to
expand implementation of its Screening Passengers by Observation
Technique (SPOT) to additional airports. SPOT involves specially
trained TSOs observing the behavior of passengers and resolving any
suspicious behavior through casual conversation with passengers and
referring suspicious passengers to selectee screening.[Footnote 35] TSA
intends for SPOT to provide a flexible, adaptable, risk-based layer of
security that can be deployed to detect potentially high-risk
passengers based on certain behavioral cues.
TSA's Analysis of the Impact of Certain Proposed Screening Changes on
Security and TSO Resources Could Be Strengthened:
While professional judgment regarding risk factors, experience in the
operating environment, and customer feedback have guided many of the
decisions TSA leadership made about which screening procedures to
implement, TSA also sought to use empirical data as a basis for
evaluating the impact some screening changes could have on security and
TSO resources. The TSA Assistant Secretary stated in December 2005 that
TSA sought to make decisions about screening changes based on data and
metrics--a practice he said TSA would continue. The use of data and
metrics to inform TSA's decision making regarding implementing proposed
screening procedures is consistent with our prior work that has shown
the importance of data collection and analyses to support agency
decision making. Between October 2005 and January 2006, TSA's
Explosives Detection Improvement Task Force sought to collect data as
part of an effort to test the impact of seven proposed procedures at
selected airports, as noted earlier.[Footnote 36] These seven proposed
procedures were selected because officials believed they would have a
significant impact on how TSOs perform daily screening functions, TSO
training, and customer acceptability. According to TSA's chief of
security operations, the purpose of testing these procedures in the
airport environment was to ensure that TSA was "on the right path" in
implementing them. These particular procedures were considered by
senior TSA officials as especially important for enhancing the
detection of explosives and for deterring terrorists from attempting to
carry out an attack. According to TSA, some of the proposed procedures
could also free up TSOs so that they could spend more time on
procedures for detecting explosives and less time on procedures
associated with low security risks, such as identifying small scissors
in carry-on bags. The seven proposed procedures tested by the task
force reflect both new procedures and modifications to existing
procedures, as shown in table 3.
Table 3: Proposed Procedures Operationally Tested by the Explosives
Detection Improvement Task Force, October 2005-January 2006:
Title of proposed procedure: Screening Passengers by Observation
Technique[A];
New or revised procedure: New;
Previous procedure: N/A;
Proposed procedure: Designated TSOs will observe the behavioral
patterns of passengers, and based on their observations, TSOs will
conduct casual conversations, refer suspicious passengers to secondary
screening, and in some cases refer some individuals to law enforcement
officers.
Title of proposed procedure: Unpredictable Screening Process (USP);
New or revised procedure: Revised;
Previous procedure: Selectee, or additional, screening of passengers
must be conducted continuously. If the number of individuals that alarm
the walk-through metal detector or if the number of bags that alarm is
not enough to ensure continual additional screening, individuals and
bags must be randomly selected to meet this requirement;
Proposed procedure: Random selectee screening is to be replaced by the
USP, which entails random selection of passengers across two screening
lanes to be subjected to a predetermined element of the selectee
screening process. The specific elements are sensitive security
information and are not discussed in this report.
Title of proposed procedure: Bulk-item pat-down search;
New or revised procedure: Revised;
Previous procedure: The pat-down procedure included only the torso area
of the body;
Proposed procedure: The pat-down is to include not only the torso, but
also from the waistline down.
Title of proposed procedure: IED components search;
New or revised procedure: New;
Previous procedure: N/A;
Proposed procedure: TSOs are to implement additional measures if they
find an IED component, such as a battery, when screening.
Title of proposed procedure: Selectee screening changes;
New or revised procedure: Revised;
Previous procedure: There was a rigid set of procedures for resolving
alarms set off by selectees;
Proposed procedure: More flexibility is to be provided for resolving
alarms set off by selectees.
Title of proposed procedure: Threat area search;
New or revised procedure: Revised;
Previous procedure: For bags that
appear to pose a security threat, various searches were conducted,
where some of the searches were not directly focused on the reason for
suspicion;
Proposed procedure: For bags that appear to pose a security threat, the
searches that are conducted are intended to focus more on the reason
for suspicion.
Title of proposed procedure: Prohibited items list changes;
New or revised procedure: Revised;
Previous procedure: Scissors (metal with pointed tips, except ostomy
scissors with pointed tips with an overall length, including blades and
handle, of 4 inches or less, when accompanied by an ostomate supply kit
containing related supplies, such as collection pouches, wafers,
positioning plates, tubing, or adhesives) and tools (including, but not
limited to, wrenches and pliers) were not permitted on aircraft;
Proposed procedure: Allow scissors with pointed tips and blades less
than 4 inches and tools less than 7 inches in length onto aircraft.
Source: TSA:
Note: N/A stands for "not applicable," meaning that no previous
procedure existed prior to the new procedure.
[A] Implementation of SPOT did not involve a revision to the passenger
checkpoint screening SOP; rather, TSA developed a separate set of
standard operating procedures for SPOT. However, we included SPOT in
our review because it modifies the way in which TSOs screen passengers
and their carry-on items at the checkpoint.
[End of table]
Our analysis of TSA's data collection and data analysis for the seven
procedures that were operationally tested identified several problems
that affected TSA's ability to determine whether these procedures, as
designed and implemented by TSA, would have the intended effect--to
enhance the detection of explosives during the passenger screening
process or to free up resources so that explosives detection procedures
could be implemented. Although the deterrence of persons intending to
do harm is also an intended effect of some proposed SOP modifications,
TSA officials said that it is difficult to assess the extent to which
implementation of proposed procedures would deter terrorists. The
Office of Management and Budget has also acknowledged the difficulty in
measuring deterrence, particularly for procedures intended to prevent
acts of terrorism. While we agree that measuring deterrence is
difficult, opportunities exist for TSA to strengthen its analyses to
help provide information on whether the proposed procedures would
enhance detection or free up TSO resources, when intended.
Screening Passengers by Observation Technique. TSA officials stated
that SPOT is intended to both deter terrorists and identify suspicious
persons who intend to cause harm while on an aircraft. While we
recognize that it is difficult to assess the extent to which terrorists
are deterred by the presence of designated TSOs conducting behavioral
observations at the checkpoint, we believe that there is an opportunity
to assess whether SPOT contributes to enhancing TSA's ability to detect
suspicious persons that may intend to cause harm on an aircraft. One
factor that may serve as an indicator that a person intends to do harm
on an aircraft is whether that individual is carrying a prohibited
item. TSA collected and assessed data at 14 airports for various time
periods on the number of prohibited items found on passengers who were
targeted under SPOT and referred to secondary screening or law
enforcement officials.[Footnote 37] However, these data collection
efforts, alone, did not enable TSA to determine whether the detection
of prohibited items would be enhanced if SPOT were implemented because
TSA had no means of comparing whether persons targeted by SPOT were
more likely to carry prohibited items than persons not targeted by
SPOT. To obtain this information, the task force would have had to
collect data on the number of passengers not targeted by SPOT that had
prohibited items on them. This information could be used to determine
whether a greater percentage of passengers targeted under SPOT are
found to have prohibited items than those passengers who are not
targeted by SPOT, which could serve as one indicator of the extent to
which SPOT would contribute to the detection of passengers intending to
cause harm on an aircraft.
Although it has not yet done so, it may be possible for TSA to evaluate
the impact of SPOT on identifying passengers carrying prohibited items.
There is precedent in other federal agencies for evaluating the
security benefits of similar procedures. For instance, U.S. Customs and
Border Protection (CBP) within DHS developed the Compliance Examination
(COMPEX) system to evaluate the effectiveness of its procedures for
selecting international airline passengers for secondary screening.
Specifically, COMPEX compares the percentage of targeted passengers on
which prohibited items are found to the percentage of randomly selected
passengers on which prohibited items are found. The premise is that
targeting is considered to be effective if a greater percentage of
targeted passengers are found to possess prohibited items than the
percentage of randomly selected passengers, and the difference between
the two percentages is statistically significant.[Footnote 38] CBP
officials told us in May 2006 that they continue to use COMPEX to
assess the effectiveness of their targeting of international airline
passengers.[Footnote 39] When asked about using a method such as COMPEX
to assess SPOT, TSA officials stated that CBP and TSA are seeking to
identify different types of threats through their targeting programs.
CBP, through its targeting efforts, is attempting to identify
passengers with contraband and unauthorized aliens, whereas TSA,
through SPOT, is attempting to identify potential high-risk passengers.
Additionally, in commenting on a draft of this report, DHS stated that,
according to TSA, the possession of a prohibited item is not a good
measure of SPOT effectiveness because an individual may not intend to
use a prohibited item to cause harm or hijack an aircraft. While it may
be possible for a terrorist to cause harm or hijack an aircraft without
using a prohibited item, as in the case of the September 11 terrorist
attacks,[Footnote 40] other terrorist incidents and threat information
identify that terrorists who carried out or planned to carry out an
attack on a commercial aircraft intended to do so by using prohibited
items, including explosives and weapons. Therefore, we continue to
believe that comparing the percentage of individuals targeted and not
targeted under SPOT on which prohibited items are found could be one of
several potential indicators of the effectiveness of SPOT. Such a
measure may be most useful with regard to the prohibited items that
could be used to bring down or hijack an aircraft. TSA officials stated
that the agency agrees in principle that measuring SPOT effectiveness,
if possible, may provide valuable insights.
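For illustration, the comparison underlying COMPEX can be framed as a
two-proportion significance test: the rate at which prohibited items are
found on targeted passengers versus the rate for randomly selected
passengers. The following Python sketch shows one common way such a test
could be computed. The passenger counts are hypothetical placeholders,
and the sketch is not a description of the actual statistical procedure
CBP uses.

import math

def two_proportion_z_test(hits_targeted, n_targeted, hits_random, n_random):
    # Rate of prohibited-item finds among targeted passengers.
    p1 = hits_targeted / n_targeted
    # Rate of prohibited-item finds among randomly selected passengers.
    p2 = hits_random / n_random
    # Pooled rate under the null hypothesis that the two rates are equal.
    pooled = (hits_targeted + hits_random) / (n_targeted + n_random)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_targeted + 1 / n_random))
    z = (p1 - p2) / se
    # One-sided p-value: probability of a difference this large if
    # targeting were no better than random selection.
    p_value = 0.5 * math.erfc(z / math.sqrt(2))
    return p1, p2, z, p_value

# Hypothetical counts for illustration only; not TSA or CBP data.
p1, p2, z, p = two_proportion_z_test(hits_targeted=42, n_targeted=1000,
                                     hits_random=210, n_random=10000)
print("targeted rate %.3f, random rate %.3f, z = %.2f, p = %.4f" % (p1, p2, z, p))

In this framing, targeting would be considered effective when the
targeted rate exceeds the random rate and the associated p-value falls
below a chosen significance threshold.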
Unpredictable Screening Process, Bulk-Item Pat-Down Search, and IED
Component Search. We found that the task force also could have
strengthened its efforts to evaluate the security impact of other
proposed procedures--specifically, USP, the bulk-item pat-down search,
and the IED component search. For all three of these procedures, the
task force did not collect any data during the operational testing that
would help determine whether they would enhance detection capability.
TSA officials told us that they did not collect these data because they
had a limited amount of time to test the procedures: TSA had to make
SOP modifications quickly as part of the agency's efforts to focus on
higher threats, such as explosives, and to meet the TSA Assistant
Secretary's goal of implementing the SOP modifications before the 2005
Thanksgiving holiday travel season. Nevertheless, TSA officials
acknowledged the importance of evaluating whether proposed screening
procedures, including USP and the bulk-item pat-down, would enhance
detection capability. TSA officials stated that covert testing has been
used to assess TSOs' ability to detect prohibited items, but covert
testing was not implemented during operational testing of proposed
procedures. Office of Inspection officials questioned whether covert
testing could be used to test, exclusively, the security benefit of
proposed procedures, because TSO proficiency and the capability of
screening technology also factor into whether threat objects are
detected during covert tests. Four of the five aviation security
experts we interviewed acknowledged this limitation but stated that
covert testing is the best way to assess the effectiveness of passenger
checkpoint screening.[Footnote 41] In commenting on a draft of this
report, DHS stated that, according to TSA, USP is intended to disrupt
terrorists' planning of an attack by introducing unpredictability into
the passenger checkpoint screening process, and tools such as covert
testing could not be used to measure the effectiveness of USP to this
end. While we agree that covert testing may not be a useful tool to
assess the impact USP has on disrupting terrorists' plans and deterring
terrorists from attempting to carry out an attack, we continue to
believe that covert testing could have been used to assess whether USP
would have helped to enhance detection capability during the passenger
screening process, which TSA officials stated was another intended
result of USP.
Although TSA did not collect data on the security impact of the USP and
bulk-item pat-down procedures, the task force did collect data on the
impact these procedures had on screening efficiency--the time required
to perform procedures--and on the reaction of TSOs, FSDs, and
passengers to the proposed procedures. These data indicated that the
USP procedure took less time, on average, for TSOs to conduct than the
procedure it replaced (the random continuous selectee screening
process); the revised pat-down procedure took TSOs about 25 seconds to
conduct; and that passengers generally did not complain about the way
in which both procedures were conducted.
With respect to operational testing of the IED component search
procedure, TSA was unable to collect any data during the testing period
because no IEDs were detected by TSOs at the airports where the testing
took place. As with the USP and bulk-item pat-down procedures, TSA
could have conducted covert tests during the operational testing period
to gather simulated data for the IED search procedure, in the absence
of actual data.
Selectee Screening Changes and Threat Area Search. Recognizing that
some of the proposed procedures intended to enhance detection would
require additional TSO resources, TSA implemented several measures
aimed collectively at freeing up TSOs' time so that they could focus on
conducting more procedures associated with higher threats--identifying
explosives and suspicious persons. For example, TSA modified the
selectee screening procedure and the procedure for searching carry-on
items--the threat area search--in order to reduce screening time.
During an informal pilot of these proposed procedures at 3 airports in
November 2005, TSA determined that the proposed selectee screening
procedure would reduce search time of each selectee passenger, on
average, by about 1.17 minutes at these airports. TSA also determined
through this study that the proposed threat area search, on average,
took 1.83 minutes to conduct at the participating airports, as compared
to the existing target object search that took, on average, 1.89
minutes, and the existing whole bag search that took, on average, 2.37
minutes.
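For illustration, the per-search timing differences reported above
translate into aggregate TSO time only in combination with the volume of
searches performed. The following Python sketch shows one way such an
aggregate estimate could be assembled; the daily search volumes are
hypothetical placeholders, since the report does not provide them.

# Per-search times from TSA's informal November 2005 pilot (minutes).
SELECTEE_MINUTES_SAVED_PER_SEARCH = 1.17   # reduction per selectee search
THREAT_AREA_SEARCH_MINUTES = 1.83          # proposed threat area search
TARGET_OBJECT_SEARCH_MINUTES = 1.89        # existing target object search
WHOLE_BAG_SEARCH_MINUTES = 2.37            # existing whole bag search

# Hypothetical daily search volumes at one checkpoint; illustration only.
selectee_searches_per_day = 300
carry_on_bag_searches_per_day = 500

selectee_hours_saved = (SELECTEE_MINUTES_SAVED_PER_SEARCH
                        * selectee_searches_per_day) / 60
# Savings depend on which existing search the threat area search replaces.
hours_saved_vs_target_object = ((TARGET_OBJECT_SEARCH_MINUTES
                                 - THREAT_AREA_SEARCH_MINUTES)
                                * carry_on_bag_searches_per_day) / 60
hours_saved_vs_whole_bag = ((WHOLE_BAG_SEARCH_MINUTES
                             - THREAT_AREA_SEARCH_MINUTES)
                            * carry_on_bag_searches_per_day) / 60

print("Selectee change: %.1f TSO-hours saved per day" % selectee_hours_saved)
print("Threat area vs. target object: %.1f TSO-hours per day"
      % hours_saved_vs_target_object)
print("Threat area vs. whole bag: %.1f TSO-hours per day"
      % hours_saved_vs_whole_bag)

Under assumed volumes such as these, the small per-search differences
(0.06 and 0.54 minutes) yield meaningful aggregate savings only when
search counts are high, which is why volume data matter to this kind of
estimate.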
Prohibited Items List Changes. Another measure TSA implemented to free
up TSO resources to focus on higher threats involved changes to the
list of items prohibited onboard aircraft. According to TSA, TSOs were
spending a disproportionate amount of TSA's limited screening resources
searching for small scissors and small tools, even though, based on
threat information and TSA officials' professional judgment, such items
no longer posed a significant security risk given the multiple layers
of aviation security. TSA officials surmised that by not having to
spend time and resources physically searching passengers' bags for low-
threat items, such as small scissors and tools, TSOs could focus their
efforts on implementing more effective and robust screening procedures
that can be targeted at screening for explosives.
To test its assumption that a disproportionate amount of TSO resources
was being spent searching for small scissors and tools, TSA collected
information from several sources. First, TSA reviewed data maintained
in TSA's Performance Management Information System (PMIS),[Footnote 42]
which showed that during the third and fourth quarters of fiscal year
2005 (a 6-month period), TSOs confiscated a total of about 1.8 million
sharp objects other than knives or box cutters. These sharp objects
constituted 19 percent of all prohibited items confiscated at the
checkpoint. Second, based on information provided by FSDs, TSOs, and
other screening experts, TSA determined that scissors constituted a
large majority of the total number of sharp objects found at passenger
screening checkpoints. Third, TSA headquarters officials searched
through confiscated items bins at 4 airports and found that most of the
scissors that were confiscated had blades less than 4 inches in length.
Based on these collective efforts, TSA concluded that a significant
number of items found at the checkpoint were low-threat, easily
identified items, such as small scissors and tools, and that a
disproportionate amount of time was spent searching for these items--
time that could have been spent searching for high-threat items, such
as explosives. TSA also concluded that because TSOs can generally
easily identify scissors, if small scissors were no longer on the
prohibited items list, TSOs could avoid conducting time-consuming
physical bag searches to locate and remove these items.
While we commend TSA's efforts to supplement professional judgment with
data and metrics in its decision to modify passenger checkpoint
screening procedures, TSA did not conduct the necessary analysis of the
data collected to determine the extent to which the removal of small
scissors and tools from the prohibited items list could free up TSO
resources. Specifically, TSA did not analyze the data on sharp objects
confiscated at the checkpoint along with other relevant factors, such
as the amount of time taken to search for scissors and the number of
TSOs at the checkpoint conducting these searches, to determine the
extent to which TSO resources could actually be freed up. Based on our
analysis of TSA's data for the 6-month period, where we considered
these other relevant factors, we determined that TSOs spent, on
average, less than 1 percent of their time--about 1 minute per day over
the 6-month period--searching for the approximately 1.8 million sharp
objects, other than knives and box cutters, that were found at
passenger screening checkpoints between April 2005 and September
2005.[Footnote 43] If the average amount of time TSOs spent searching
for sharp objects per day over a 6-month period was less than 1 minute
per TSO, and sharp objects constituted just 19 percent of all
prohibited items confiscated at checkpoints over this period, then it
may not be accurate to assume that no longer requiring TSOs to search
for small scissors and tools would significantly contribute to TSA's
efforts to free up TSO resources that could be used to implement other
security measures.
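For illustration, an estimate of this kind combines the count of
confiscated items with assumptions about the time needed to find and
remove each item and the size of the screening workforce. The following
simplified Python sketch shows the general arithmetic; the per-search
time and workforce figures are hypothetical placeholders and are not the
values GAO or TSA used.

# Reported figure: about 1.8 million sharp objects (other than knives and
# box cutters) confiscated at checkpoints over the 6-month period.
sharp_objects_confiscated = 1800000
days_in_period = 183

# Hypothetical inputs for illustration only; not GAO's or TSA's figures.
minutes_per_sharp_object_search = 2.0   # assumed time to locate and remove one item
tso_workforce = 40000                   # assumed number of TSOs conducting screening

total_search_minutes = sharp_objects_confiscated * minutes_per_sharp_object_search
minutes_per_tso_per_day = total_search_minutes / (tso_workforce * days_in_period)

print("Total search time: %.0f TSO-hours over 6 months" % (total_search_minutes / 60))
print("Average per TSO: %.2f minutes per day" % minutes_per_tso_per_day)

With these assumed inputs, the result is on the order of half a minute
per TSO per day, illustrating how a large count of confiscated items does
not necessarily translate into a large share of TSO screening time.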
To further support its assertion that significant TSO resources would
be freed up as a result of removing small scissors and tools from the
list of prohibited items, TSA officials cited the results of an
informal study conducted in October 2005--which was intended to provide
a general idea of the types of prohibited items TSOs were finding as a
result of their searches and how long various types of searches were
taking TSOs to conduct. Specifically, according to the study conducted
at 9 airports over a 14-day period, TSA determined that 24 percent of
items found during carry-on bag searches were scissors. However, based
on data regarding the number of bags searched, removing scissors may
not significantly contribute to TSA's efforts to free up TSO
resources.[Footnote 44]
TSA conducted additional informal studies 30, 60, and 90 days after the
prohibited items list change went into effect to determine whether the
change had resulted in reductions in the percentage of carry-on bags
that were searched and overall screening time. However, we identified
limitations in TSA's methodology for conducting these studies.[Footnote
45] In February 2007, a TSA official stated that some FSDs interviewed
several TSOs after the prohibited items list change went into effect,
and these TSOs reported that the change did save screening time.
However, TSA could not identify how many TSOs were interviewed, at
which airports the TSOs were located, and how the TSOs were selected
for the interview; nor did TSA document the results of these
interviews. TSA also did not use random selection or representative
sampling when determining which TSOs should be interviewed. Therefore,
the interview results cannot be generalized.
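For illustration, a documented, representative selection of TSOs for such
interviews could be as simple as drawing a stratified random sample from
airport rosters. The following Python sketch shows one such approach; the
airport names and roster sizes are hypothetical placeholders.

import random

# Hypothetical rosters; airport names and roster sizes are placeholders only.
rosters = {
    "Airport A": ["A-TSO-%d" % i for i in range(1, 301)],
    "Airport B": ["B-TSO-%d" % i for i in range(1, 121)],
    "Airport C": ["C-TSO-%d" % i for i in range(1, 81)],
}

def stratified_sample(rosters, fraction=0.05, seed=1):
    # Draw the same sampling fraction from each airport so every TSO has a
    # known, documented chance of being selected for an interview.
    rng = random.Random(seed)
    sample = {}
    for airport, tsos in rosters.items():
        k = max(1, round(len(tsos) * fraction))
        sample[airport] = rng.sample(tsos, k)
    return sample

for airport, selected in stratified_sample(rosters).items():
    print(airport, selected)

Recording the seed, the rosters used, and the resulting sample would also
provide the kind of documentation needed to support generalizing
interview results.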
TSA officials acknowledged that they could have made some improvements
in the various analyses they conducted on the prohibited items list
change. However, they stated that they had to make SOP modifications
quickly as part of the agency's efforts to focus on higher threats,
such as explosives, and to meet the TSA Assistant Secretary's goal of
implementing the SOP modifications before the 2005 Thanksgiving holiday
travel season. Additionally, officials stated that they continue to
view their decision to remove small scissors and tools from the
prohibited items list as sound, particularly because they believe small
scissors and tools do not pose a significant threat to aviation
security. TSA officials also stated that they believe the prohibited
items list change would free up resources based on various sources of
information, including the professional judgment of TSA airport staff,
and their analysis of PMIS data on prohibited items confiscated at
checkpoints. The TSA Assistant Secretary told us that even if TSA
determined that the proposed SOP modifications would not free up
existing TSO resources to conduct explosives detection procedures, he
would have implemented the modifications anyway considering the added
security benefit of the explosives detection procedures. Additionally,
a TSA headquarters official responsible for airport security operations
stated that to help strengthen the agency's analysis of future proposed
SOP changes, the agency plans to provide the Explosives Detection
Improvement Task Force with the necessary resources to help improve its
data collection and analysis.
An additional measure intended to free up TSO resources[Footnote 46]
involved changes to CAPPS rules.[Footnote 47] TSA's assumption is that
these changes could allow TSOs who were normally assigned to selectee
screening duties to be reassigned to new procedures, such as USP, which
may require new screening positions. (Both USP and SPOT require TSO
positions: USP requires one screening position for every two screening
lanes, while SPOT typically uses more than one screening position per
ticket checker at the checkpoint.[Footnote 48])
According to FSDs we interviewed, the changes made to the prohibited
items list and the CAPPS rules had not freed up existing TSO resources,
as intended. Specifically, as of August 2006, 13 of 19 FSDs we
interviewed at airports that tested USP or SPOT said that TSO resources
were not freed up as a result of these changes. In addition, 9 of the
19 FSDs said that in order to operationally test USP or SPOT, TSOs had
to work overtime, switch from other functions (such as checked baggage
screening), or a screening lane had to be closed. TSA's Explosives
Detection Improvement Task Force reported that nearly all of the FSDs
at airports participating in operational testing of USP believed that
the procedure had security value,[Footnote 49] though the task force
also reported that 1 FSD dropped out of the operational testing program
for USP due to insufficient staffing resources and another could only
implement the procedure during off-peak travel periods. Additionally,
most of the FSDs we interviewed stated that the changes to the
prohibited items list and CAPPS rules did not free up TSOs, as
intended, to better enable TSOs to take required explosives detection
training. Specifically, as of August 2006, of the 19 FSDs we
interviewed at airports that implemented USP and SPOT, 13 said that
they did not experience more time to conduct explosives training as a
result of changes to the prohibited items list and CAPPS
rules.[Footnote 50] Three of the 13 FSDs said that they used overtime
to enable TSOs to take the explosives training. As previously stated,
the TSA Assistant Secretary stated that even if existing TSO resources
are not freed up to conduct explosives detection procedures, longer
lines and wait times at airport checkpoints are an acceptable
consequence, considering their added security benefit. With regard to
explosives training, he stated that it is acceptable for FSDs to use
overtime or other methods to ensure that all TSOs participated in the
required explosives detection training. He further noted that even if
one screening change does not free up TSO resources, all of the changes
intended to accomplish this--when taken together--should ultimately
help to redirect TSO resources to where they are most needed.
TSA's efforts to add data and metrics to its tool kit for evaluating
the impact of screener changes are a good way to supplement the use of
professional judgment and input from other experts and sources in
making decisions about modifying screening procedures. However, TSA's
methods for data collection and analysis could be improved. We
recognize the challenges TSA faces in evaluating the effectiveness of
proposed procedures, particularly when faced with time pressures to
implement procedures. However, by attempting to evaluate the potential
impact of screening changes on security and resource availability, TSA
could help support its decision making on how best to allocate limited
TSO resources and ensure that the ability to detect explosives and
other high-threat objects during the passenger screening process is
enhanced.
Documentation of the Reasoning behind Proposed SOP Modifications Was
Incomplete:
While we were able to assess TSA's reasoning behind certain proposed
SOP modifications considered during our review period, our analysis was
limited because TSA did not maintain complete documentation of proposed
SOP modifications. Documentation of the reasoning behind decisions to
implement or reject proposed modifications was maintained in various
formats, including spreadsheets developed by TSA officials, internal
electronic mail discussions among TSA officials, internal memorandums,
briefing slides, and reports generated based on the results of
operational testing. TSA did improve its documentation of the proposed
SOP modifications that were considered during the latter part of our 9-
month review period. Specifically, the documentation for the SOP
modifications proposed under the Explosives Detection Improvement Task
Force provided more details regarding the basis of the proposed
modifications and the reasoning behind decisions to implement or reject
the proposed modifications.
Of the 92 documented SOP modifications that were proposed during our
9-month review period, TSA provided the basis for 72.
specifically, TSA documented the basis--that is, the information,
experience, or event that encouraged TSA officials to propose an SOP
modification--for 35 of the 48 that were implemented and for 37 of the
44 that were not implemented. However, TSA only documented the
reasoning behind TSA senior leadership's decisions to implement or not
implement proposed SOP modifications for 43 of 92 proposed
modifications. According to TSA officials, documentation that explains
the basis for recommending proposed modifications can also be used to
explain TSA's reasoning behind its decisions to implement proposed
modifications. However, the basis on which an SOP modification was
proposed cannot always be used to explain TSA senior leadership's
decisions not to implement a proposed modification. In these cases,
additional documentation would be needed to understand TSA's decision
making. However, TSA only documented the reasoning behind its decisions
for about half (26 of 44) of the proposed modifications that were not
implemented. TSA officials told us that they did not intend to document
all SOP modifications that were proposed during our review period.
Officials stated that, in some cases, the reasoning behind TSA's
decision to implement or not implement a proposed SOP modification is
obvious and documentation is not needed. TSA officials acknowledged
that it is beneficial to maintain documentation on the reasoning behind
decisions to implement or reject proposed SOP modifications deemed
significant, particularly given the organizational restructuring and
staff turnover within TSA.[Footnote 51] However, TSA officials could
not identify which of the 92 proposed SOP modifications they consider
to be significant because they do not categorize proposed modifications
in this way.
Our standards for governmental internal controls and associated
guidance suggest that agencies should document key decisions in a way
that is complete and accurate, and that allows decisions to be traced
from initiation, through processing, to after completion.[Footnote 52]
These standards further state that documentation of key decisions
should be readily available for review. Without documenting this type
of information, TSA cannot always justify significant modifications to
passenger checkpoint screening procedures to internal or external
stakeholders, including Congress and the traveling public. In addition,
considering the ongoing personnel changes, without sufficient
documentation, future decision makers in TSA may not know on what basis
the agency historically made decisions to develop new or revise
existing screening procedures.
Following our 9-month review period, TSA continued to make efforts to
improve documentation of agency decision making, as evidenced by
decisions regarding the August 2006 and September 2006 SOP
modifications related to the screening of liquids and gels. For
example, TSA senior leadership evaluated the actions taken by the
agency between August 7 and August 13, 2006, in response to the alleged
liquid explosives terrorist plot, in order to identify lessons learned
and improve the agency's reaction to future security incidents. As a
result of this evaluation, as shown in table 4, TSA made several
observations and recommendations for improving documentation of agency
decision making when considering modifications to screening procedures.
Table 4: TSA Evaluation of Documentation of Agency Decisions Made
between August 7 and August 13, 2006, Regarding the Alleged Liquid
Explosives Terrorist Plot:
Observations: There was no tracking of the overall timing and progress
of deliberations of the various decision options;
Recommendations for improvement: Track and record key issues raised and
the timing of deliberations.
Observations: There was no formal tracking of the decision options that
were discussed or the rationale that was used when selecting among the
various decision options;
Recommendations for improvement: Formally document options discussed,
decisions made, and the rationale behind the decisions.
Observations: There were no formal requirements for the type of
information that needed to be documented or the format used to document
the information on agency decisions;
Recommendations for improvement: For each decision that is made,
standardize the type of information that should be documented and
develop an appropriate mechanism to store the information.
Observations: The documentation that was used to support agency
decisions did not contain basic audit trail information, such as the
origin of the document and how the document was used. This may prevent
decision makers from understanding the relevancy of the documentation
to agency decisions;
Recommendations for improvement: For each document used to support
agency decisions, identify the origin of the document and how the
document was used by decision makers.
Source: TSA.
[End of table]
Documentation of TSA's decisions regarding the September 26, 2006,
modifications to the liquid screening procedures showed that TSA had
begun implementing the recommendations in table 4. TSA's documentation
identified the various proposed liquid screening procedures that were
considered by TSA, the benefits and drawbacks of each proposal, and the
rationale behind TSA's final decision regarding which proposal to
implement. The documentation also tracked the timing of TSA's
deliberations of each of the proposed liquid screening procedures.
However, the documentation of TSA's decisions was not always presented
in a standard format, nor was the origin and use of supporting
documentation always identified. TSA officials acknowledged that
documentation of the September 2006 SOP modifications could have been
improved and stated that efforts to improve documentation, through
implementation of the recommendations in table 4, will continue to be a
high priority.
TSA Has Several Methods in Place to Monitor TSO Compliance with
Passenger Checkpoint Screening SOPs:
A New Performance Accountability System Helps TSA Monitor TSO
Compliance with SOPs:
TSA implemented a performance accountability system in part to
strengthen its monitoring of TSO compliance with passenger checkpoint
screening SOPs. Specifically, in April 2006, TSA implemented the
Performance Accountability and Standards System (PASS) to assess the
performance of all TSA employees, including TSOs.[Footnote 53]
According to TSA officials, PASS was developed in response to our 2003
report that recommended that TSA establish a performance management
system that makes meaningful distinctions in employee
performance,[Footnote 54] and in response to input from TSA airport
staff on how to improve passenger and checked baggage screening
measures. With regard to TSOs, PASS is not intended solely to measure
TSO compliance with SOPs. Rather, PASS will be used by TSA to assess
agency personnel at all levels on various competencies, including
training and development, readiness for duty, management skills, and
technical proficiency.
There are three elements of the TSO technical proficiency component of
PASS that are intended to measure TSO compliance with passenger
checkpoint screening procedures: (1) quarterly observations conducted
by FSD management staff of TSOs' ability to perform particular
screening functions in the operational environment, such as pat-down
searches and use of the hand-held metal detector, to ensure they are
complying with checkpoint screening SOPs; (2) quarterly quizzes given
to TSOs to assess their knowledge of the SOPs; and (3) an annual,
multipart knowledge and skills assessment. While the first two elements
are newly developed, the third element--the knowledge and skills
assessment--is part of the annual TSO recertification program that is
required by the Aviation and Transportation Security Act (ATSA) and has
been in place since October 2003.[Footnote 55] Collectively, these
three elements of PASS are intended to provide a systematic method for
monitoring whether TSOs are screening passengers and their carry-on
items according to SOPs. TSA's implementation of PASS is consistent
with our internal control standards, which state that agencies should
ensure that policies and procedures are applied properly.[Footnote 56]
The first component of PASS (quarterly observations) is conducted by
screening supervisors or screening managers, using a standard checklist
developed by TSA headquarters, with input from TSA airport staff. There
is one checklist used for each screening function, and TSOs are
evaluated on one screening function per quarter. For example, the hand-
held metal detector skills observation checklist includes 37 tasks to
be observed, such as whether the TSO conducted a pat-down search to
resolve any suspect areas. The second component of PASS (quarterly
quizzes) consists of multiple-choice questions on the standard
operating procedures. For example, one of the questions on the PASS
quiz is "What is the correct place to start an HHMD outline [a hand-
held metal detector search] on an individual: (a) top of the head, (b)
top of the feet, or (c) top of the shoulder?"
The third component of PASS is the annual knowledge and skills
assessment, a component of the annual recertification program that
evaluates the technical proficiency of TSOs. This assessment is
composed of three modules: (1) knowledge of standard operating
procedures, (2) recognition of threat objects on an X-ray image, and
(3) demonstration of screening functions. According to TSA officials,
while recertification testing is not a direct measure of operational
compliance with passenger checkpoint screening SOPs, recertification
testing, particularly module 1 and module 3, is an indicator of whether
TSOs are capable of complying with SOPs. TSA officials stated that if a
TSO does not have knowledge of SOPs and if the TSO cannot demonstrate
basic screening functions as outlined in the SOPs, then the TSO will
likely not be able to comply with SOPs when performing in the operating
environment. Table 5 provides a summary of each of these modules.
Table 5: Modules Included in Recertification Knowledge and Skills
Assessment:
Testing module: Knowledge of standard operating procedures;
Description: Computerized 50-question multiple-choice test. It is
either passenger- or baggage-specific.
Testing module: Image recognition;
Description: Computerized test that consists of 100 images and is used
to evaluate a TSO's skill and ability in detecting threat or prohibited
objects within X-ray images.
Testing module: Practical demonstration of skills;
Description: Hands-on simulated work sample to evaluate a TSO's
knowledge, skills, and ability when performing specific screening tasks
along with ability to provide customer service.
Source: TSA.
[End of table]
FSDs we interviewed reported that they have faced resource challenges
in implementing PASS. Specifically, as of July 2006, 9 of 24 FSDs we
interviewed said they experienced difficulties in implementing PASS due
to lack of available staff to conduct the compliance-related
evaluations. TSA officials stated that they have automated many of the
data-entry functions of PASS to relieve the field of the burden of
manually entering this information into the PASS online system. For
example, all scores related to the quarterly quiz and skill observation
components are automatically uploaded, and PASS is linked to TSA's
online learning center database to eliminate the need to manually enter
TSOs' learning history. In addition, the TSA Assistant Secretary said
that FSDs were given the option of delaying implementation of PASS if
they were experiencing resource challenges.
TSA Uses Local and National Covert Testing, in Part, to Assess TSO
Compliance with SOPs:
TSA also conducts local and national covert tests, which are used to
evaluate, in part, the extent to which noncompliance with the SOPs
affects TSOs' ability to detect simulated threat items hidden in
accessible property or concealed on a person. TSA first issued guidance
on its local covert testing program--known as Screener Training
Exercises and Assessments (STEA)--in February 2004. STEA testing is
conducted by FSD staff at airports, who determine the frequency at
which STEA tests are conducted as well as which type of STEA tests are
conducted. According to the STEA results reported by TSA between March
2004 and February 2006,[Footnote 57] TSOs' noncompliance with the SOP
accounted for some of the STEA test failures.[Footnote 58] TSOs' lack
of proficiency in skills or procedures, which may affect TSOs' ability
to comply with procedures, was also cited as the reason for some of the
STEA test failures. TSOs who fail STEA tests are required to take
remedial training to help them address the reasons for their failure.
FSDs we interviewed reported that they have faced resource challenges
in conducting STEA tests. Specifically, even though all 24 FSDs we
interviewed as of July 2006 said that they have conducted STEA tests,
10 of these FSDs said that the lack of available staff made it
difficult to conduct these tests. When asked how they planned to
address FSDs' concerns regarding a lack of available staff to complete
STEA tests, TSA headquarters officials told us that they are
considering resource alternatives for implementing the STEA program,
but could not provide us with the specific details of these
plans.[Footnote 59] Until the resource limitations that have restricted
TSA's use of its compliance monitoring tools have been fully addressed,
TSA may not have assurance that TSOs are screening passengers according
to the SOP.
As previously discussed, TSA's Office of Inspection initiated its
national covert testing program in September 2002. National covert
tests are conducted by TSA headquarters-based inspectors who carry
simulated threat objects hidden in accessible property or concealed on
their person through airport checkpoints, and in cases where TSOs fail
to detect threat objects, the inspectors identify the reasons for
failure. During September 2005, TSA implemented a revised covert
testing program to focus more on catastrophic threats--threats that can
bring down or destroy an aircraft. According to Office of Inspection
officials, TSOs may fail to detect threat objects during covert testing
for various reasons, including limitations in screening technology,
lack of training, limitations in the procedures TSOs must follow to
conduct passenger and bag searches, and TSOs' noncompliance with
screening checkpoint SOPs. Office of Inspection officials also said
that one test could be failed due to multiple factors, and that it is
difficult to determine the extent to which any one factor contributed
to the failure. TSOs who fail national covert tests, like those who
fail STEA tests, are also required to take remedial training to help
them address the reasons for failure.[Footnote 60]
Conclusions:
The alleged August 2006 terrorist plot to detonate liquid explosives
onboard multiple U.S.-bound aircraft highlighted the need for TSA to
continuously reassess and revise, when deemed appropriate, existing
passenger checkpoint screening procedures to address threats against
the commercial aviation system. In doing so, TSA faces the challenge of
securing the aviation system while facilitating the free movement of
people. Passenger screening procedures are only one element that
affects the effectiveness and efficiency of the passenger checkpoint
screening system. Securing the passenger checkpoint screening system
also involves the TSOs who are responsible for conducting the screening
of airline passengers and their carry-on items, and the technology used
to screen passengers and their carry-on items.
We believe that TSA has implemented a reasonable approach to modifying
passenger checkpoint screening procedures through its consideration of
risk factors (threat and vulnerability information), day-to-day
experience of TSA airport staff, and complaints and concerns raised by
passengers and by making efforts to balance security, efficiency, and
customer service. We are also encouraged by TSA's efforts to conduct
operational testing and use data and metrics to support its decisions
to modify screening procedures. We acknowledge the difficulties in
assessing the impact of proposed screening procedures, particularly
with regard to the extent to which proposed procedures would deter
terrorists from attempting to carry out an attack onboard a commercial
aircraft. However, there are existing methods, such as covert testing
and CBP's COMPEX--a method that evaluates the effectiveness of CBP's
procedures for selecting international airline passengers for secondary
screening--that could be used by TSA to assess whether proposed
screening procedures enhance detection capability. It is also important
for TSA to fully assess available data to determine the extent to which
TSO resources would be freed up to perform higher-priority procedures,
when this is the intended effect. Without collecting the necessary data
or conducting the necessary analysis that would enable the agency to
assess whether proposed SOP modifications would have the intended
effect, it may be difficult for TSA to determine how best to improve
TSOs' ability to detect explosives and other high-threat items and to
allocate limited TSO resources. With such data and analysis, TSA would
be in a better position to justify its SOP modifications and to have a
better understanding of how the changes affect TSO resources.
Additionally, because TSA did not always document the basis on which
SOP modifications were proposed or the reasoning behind decisions to
implement or not implement proposed modifications, TSA may not be able
to justify SOP modifications to Congress and the traveling public.
While we are encouraged that TSA's documentation of its decisions
regarding the SOP modifications made in response to the alleged August
2006 liquid explosives terrorist plot was improved compared to earlier
documentation, it is important for TSA to continue to work to
strengthen its documentation efforts. Such improvements would enable
TSA officials responsible for making SOP decisions in the future to
understand how significant SOP decisions were made historically--a
particular concern considering the restructuring and staff turnover
experienced by TSA.
As shown by TSA's covert testing results, the effectiveness of
passenger checkpoint screening relies, in part, on TSOs' compliance
with screening procedures. We are, therefore, encouraged by TSA's
efforts to strengthen its monitoring of TSO compliance with passenger
screening procedures. We believe that TSA has implemented a reasonable
process for monitoring TSO compliance and that this effort should
assist TSA in providing reasonable assurance that TSOs are screening
passengers and their carry-on items according to screening procedures.
Given the resource challenges FSDs identified in implementing the
various methods for monitoring TSO compliance, it will be important for
TSA to take steps, such as automating PASS data entry functions, to
address such challenges.
Recommendations for Executive Action:
To help strengthen TSA's evaluation of proposed modifications to
passenger checkpoint screening SOPs and TSA's ability to justify its
decisions to implement or not implement proposed SOP modifications, in
the March 2007 report that contained sensitive security information, we
recommended that the Secretary of Homeland Security direct the
Assistant Secretary of Homeland Security for TSA to take the following
two actions:
* when operationally testing proposed SOP modifications, develop sound
evaluation methods, when possible, that can be used to assist TSA in
determining whether proposed procedures would achieve their intended
result, such as enhancing TSA's ability to detect prohibited items and
suspicious persons and freeing up existing TSO resources that could be
used to implement proposed procedures, and:
* for future proposed SOP modifications that TSA senior leadership
determines are significant, generate and maintain documentation to
include, at minimum, the source, intended purpose, and reasoning behind
decisions to implement or not implement proposed modifications.
Agency Comments and Our Evaluation:
On March 6, 2007, we received written comments on the draft report,
which are reproduced in full in appendix III. DHS generally concurred
with our recommendations and outlined actions TSA plans to take to
implement the recommendations.
DHS stated that it appreciates GAO's conclusion that TSA has
implemented a reasonable approach to modifying passenger checkpoint
screening procedures through its assessment of risk factors, the
expertise of TSA employees, and input from the traveling public and
other stakeholders, as well as TSA's efforts to balance security,
operational efficiency, and customer service while evaluating proposed
changes.
With regard to our recommendation to develop sound evaluation methods,
when possible, to help determine whether proposed SOP modifications
would achieve their intended result, DHS stated that TSA plans to make
better use of generally accepted research design principles and
techniques when operationally testing proposed SOP modifications. For
example, TSA will consider using random selection, representative
sampling, and control groups in order to isolate the impact of proposed
SOP modifications from the impact of other variables. DHS also stated
that TSA's Office of Security Operations is working with subject matter
experts to ensure that operational tests are well designed and
executed, and produce results that are scientifically valid and
reliable. As discussed in this report, employing sound evaluation
methods for operationally testing proposed SOP modifications will
enable TSA to have better assurance that new passenger checkpoint
screening procedures will achieve their intended purpose, which may
include improved allocation of limited TSO resources and enhanced
detection of explosives and other high-threat objects during the
passenger screening process. However, DHS stated, and we agree, that
the need to make immediate SOP modifications in response to imminent
terrorist threats may preclude operational testing of some proposed
modifications.
Concerning our recommendation regarding improved documentation of
proposed SOP modifications, DHS stated that TSA intends to document the
source, intent, and reasoning behind decisions to implement or reject
proposed SOP modifications that TSA senior leadership deems
significant. Documenting this type of information will enable TSA to
justify significant modifications to passenger checkpoint screening
procedures to internal and external stakeholders, including Congress
and the traveling public. In addition, considering the ongoing
personnel changes TSA has experienced, such documentation should enable
future decision makers in TSA to understand on what basis the agency
historically made decisions to develop new or revise existing screening
procedures.
In addition to commenting on our recommendations, DHS provided comments
on some of our findings, which we considered and incorporated in the
report where appropriate. One of DHS's comments pertained to TSA's
evaluation of the prohibited items list change. Specifically, while TSA
agrees that the agency could have conducted a more methodologically
sound evaluation of the impact of the prohibited items list change, TSA
disagrees with our assessment that the prohibited items list change may
not have significantly contributed to TSA's efforts to free up TSO
resources to focus on detection of high-threat items, such as
explosives. As we identified in this report, based on interviews with
FSDs, airport visits to determine the types of items confiscated at
checkpoints, and a study to determine the amount of time taken to
conduct bag searches and the number of sharp objects collected as a
result of these searches, TSA concluded that the prohibited items list
change would free up TSO resources. DHS also stated that interviews
with TSOs following the prohibited items list change confirmed that the
change had freed up TSO resources. However, based on our analysis of
the data TSA collected both prior to and following the prohibited items
list change, we continue to believe that TSA did not conduct the
necessary analysis to determine the extent to which the removal of
small scissors and tools from the prohibited items list would free up
TSA resources.
As agreed with your office, unless you publicly announce its contents
earlier, we plan no further distribution of this report until 21 days
from the date of this report. At that time, we will send copies of the
report to the Secretary of the Department of Homeland Security, the TSA
Assistant Secretary, and interested congressional committees as
appropriate. We will also make copies available to others on request.
If you or your staff have any questions about this report, please
contact me at (202) 512-3404 or berrickc@gao.gov. Contact points for
our Offices of Congressional Relations and Public Affairs may be found
on the last page of this report. GAO staff who made major
contributions to this report are listed in appendix IV.
Signed by:
Cathleen A. Berrick:
Director, Homeland Security and Justice Issues:
[End of section]
Appendix I: Objectives, Scope, and Methodology:
To assess the Transportation Security Administration's (TSA) process
for modifying passenger checkpoint screening procedures and how TSA
monitors compliance with these procedures, we addressed the following
questions: (1) How and on what basis did TSA modify passenger screening
procedures and what factors guided the decisions to do so? (2) How does
TSA determine whether TSOs are complying with the standard procedures
for screening passengers and their carry-on items?
To address how TSA modified passenger screening procedures and what
factors guided the decisions to do so, we obtained and analyzed
documentation of proposed standard operating procedures (SOP) changes
considered between April 2005 and September 2005, as well as threat
assessments and operational studies that supported SOP
modifications.[Footnote 61] The documentation included a list of
proposed changes considered, as well as the source, the intended
purpose, and in some cases the basis for recommending the SOP
modification--that is, the information, experience, or event that
encouraged TSA officials to propose the modifications--and the
reasoning behind decisions to implement or reject proposed SOP
modifications. We also obtained documentation of the proposed SOP
changes considered by TSA's Explosives Detection Improvement Task
Force, which was the deliberating body for proposed changes that were
considered between October 2005 and December 2005. We also reviewed and
analyzed similar documentation for proposed SOP modifications
considered between August 2006 and November 2006 in response to the
alleged terrorist plot to detonate liquid explosives onboard multiple
aircraft en route from the United Kingdom to the United States. We
included modifications to passenger checkpoint screening procedures
related to this particular event because they provided the most recent
information available on TSA's approach to modifying screening
procedures in response to an immediate perceived threat to civil
aviation. The documentation included notes from internal meetings,
slides for internal and external briefings on proposed SOP
modifications, data on customer complaints and screening efficiency,
and the results of liquid explosives testing conducted by the
Department of Homeland Security (DHS) Science and Technology
Directorate and the Federal Bureau of Investigation (FBI). We also
obtained each revision of the passenger checkpoint screening SOP that
was generated between April 2005 and December 2005 and between August 2006 and
November 2006,[Footnote 62] as well as accompanying documentation that
highlighted all of the changes made in each revision. In addition, we
met with TSA headquarters officials who were involved in the process
for determining whether proposed passenger checkpoint screening
procedures should be implemented. We also met with officials in the DHS
Science and Technology Directorate as well as the FBI to discuss the
methodology and results of their liquid explosives tests, which were
used to support TSA's decisions to modify the SOP in September 2006. We
also met with TSA Office of Inspection and DHS Office of Inspector
General staff to discuss their covert testing at passenger checkpoints
and the recommended changes to the passenger checkpoint screening SOP
that were generated based on testing results. We also obtained and
analyzed data and information collected by TSA on the proposed
procedures that were evaluated in the operational environment. In
addition, we met or conducted phone interviews with Federal Security
Directors (FSD) and their management staff, including Assistant FSDs
and Screening Managers, and Transportation Security Officers (TSO) with
passenger screening responsibilities, at 25 commercial airports to gain
their perspectives on TSA's approach to revising the passenger
checkpoint screening SOP. We also met with officials from four aviation
associations--the American Association of Airport Executives, Airports
Council International, the Air Transport Association, and the Regional
Airline Association--to gain their perspectives on this objective.
Finally, we met with five aviation security experts to obtain their
views on methods for assessing the impact of proposed passenger
checkpoint screening procedures. We selected these experts based on
their depth of experience in the field of aviation security, employment
history, and their recognition in the aviation security community.
However, the views of these experts do not necessarily represent the
views of other experts in the field of aviation security. We
compared TSA's approach to revising its passenger checkpoint screening
SOP with the Comptroller General's standards for internal control in
the federal government[Footnote 63] and risk management guidance.
To address how TSA determines whether TSOs are complying with the
standard procedures for screening passengers and their carry-on items,
we obtained documentation of compliance-related initiatives, including
guidance, checklists, and SOP quizzes used to assess TSO compliance
under the Performance Accountability and Standards System (PASS), and
guidance provided to FSDs for developing local compliance audit
programs. We also obtained the fiscal year 2005 recertification and
Screener Training Exercises and Assessments (STEA) test results, which
were used, in part, to assess TSO compliance with and knowledge of the
passenger checkpoint screening SOP. In addition, we reviewed the
results of covert testing conducted by TSA's Office of Inspection,
which were also used, in part, to assess TSO compliance with the
passenger checkpoint screening SOP. We assessed the reliability of the
compliance-related data we received from TSA, and found the data to be
sufficiently reliable for our purposes. In addition, we interviewed TSA
headquarters officials who were responsible for overseeing efforts to
monitor TSO compliance with standard operating procedures. This
included officials in the Office of Security Operations, Office of
Human Capital, and the Office of Operational Process and Technology.
Our audit work also included visits to or phone conferences with 25
airports, where we interviewed FSDs, members of their management teams,
and Transportation Security Officers with passenger screening
responsibilities.[Footnote 64] However, the perspectives of these FSDs
and their staff cannot be generalized across all airports. In July
2006, we submitted two sets of follow-up questions to FSD staff,
related to their experiences with implementing PASS and STEA tests. We
also obtained documentation of local compliance audit programs from the
FSD staff at several of these airports. We compared TSA's approach for
monitoring TSO compliance with the Comptroller General's standards for
internal control in the federal government.[Footnote 65]
As previously mentioned, we conducted site visits and/or phone
interviews at 25 airports[Footnote 66] (8 category X airports, 7
category I airports, 4 category II airports, 4 category III airports,
and 2 category IV airports) to discuss issues related to TSA's approach
to revising the passenger checkpoint screening SOP, and the agency's
approach to monitoring TSO compliance with the SOP.[Footnote 67] We
visited 7 of these airports during the design phase of our study. These
airports were selected based on variations in size and geographic
location, and whether they were operationally testing any proposed
passenger checkpoint screening procedures or passenger screening
technology. We also selected 2 airports that participated in the
Screening Partnership Program.[Footnote 68]
After visiting the 7 airports during the design phase of our review, we
selected an additional 15 airports to visit based on variations in
size, geographic distribution, and performance on compliance-related
assessments. Specifically, we obtained and analyzed fiscal year 2005
Screener Training Exercises and Assessments results and fiscal year 2005
recertification testing results to identify airports across a range of
STEA and recertification scores. Additionally, we visited 3 additional
airports that operationally tested the proposed Unpredictable Screening
Process (USP) and the Screening Passengers by Observation Technique
(SPOT) procedure.
In July 2006, we received from 19 FSDs answers to follow-up questions
on their experiences with implementing pilot testing of SPOT or USP.
This included 14 FSDs who were not part of our initial rounds of
interviews. Nine of these 14 FSDs were from airports that participated
in SPOT pilots; the remaining 5 were from airports that participated in
USP pilots.
We conducted our work from March 2005 through March 2007 in accordance
with generally accepted government auditing standards.
[End of section]
Appendix II: Sources of SOP Changes:
Of the 92 proposed screening changes considered by TSA between April
2005 and December 2005, 63 were submitted by TSA field staff, including
Federal Security Directors and Transportation Security
Officers.[Footnote 69] Thirty proposed screening changes were submitted
by TSA headquarters officials. Last, TSA senior leadership, such as the
TSA Assistant Secretary, recommended 5 of the 92 proposed screening
changes considered during this time period. One SOP modification was
also proposed through a congressional inquiry. TSA's solicitation of
input from both field and headquarters officials regarding changes to
the passenger checkpoint screening SOP was consistent with internal
control standards,[Footnote 70] which suggest that there be mechanisms
in place for employees to recommend improvements in operations.
The FSDs with whom we met most frequently identified periodic
conference calls with the Assistant Secretary, the SOP Question and
Answer mailbox, or electronic mail to Security Operations officials as
the mechanisms by which they recommended changes to the SOP. The TSOs
with whom we met identified their chain of command and the SOP Question
and Answer mailbox as the primary mechanisms by which they submitted
suggestions for new or revised procedures. According to TSA officials,
the SOP mailbox entails FSDs and their staff, including TSOs,
submitting suggestions, questions, or comments to TSA's Security
Operations division via electronic mail, either directly or through
their supervisors. Submissions are then compiled and reviewed by a
single Security Operations official, who generates responses to the
questions that have clear answers. However, for submissions for which
the appropriate response is not obvious or for submissions that include
a suggestion to revise the SOP, this official forwards the submissions
to other Security Operations officials for further deliberation. SOP
mailbox responses are provided to all TSA airport officials. If TSA
headquarters revises a screening procedure based on a mailbox
submission, the revision is noted in the mailbox response.
Thirty of the screening changes considered by TSA between April 2005
and December 2005 were proposed by TSA headquarters officials,
including Security Operations officials, who are responsible for
overseeing implementation of checkpoint screening. According to
Security Operations officials, they recommended changes to checkpoint
screening procedures based on communications with TSA field officials
and airport optimization reviews. Security Operations officials conduct
optimization reviews to identify best practices and deficiencies in the
checkpoint screening and checked baggage screening processes. As part
of these reviews, Security Operations officials may also assess
screening efficiency and whether TSOs are implementing screening
procedures correctly.
Other TSA headquarters divisions also suggested changes to passenger
checkpoint screening procedures. For example, the Office of Law
Enforcement recommended that there be an alternative screening
procedure for law enforcement officials who are escorting prisoners or
protectees. Previously, all armed law enforcement officers were
required to sign a logbook at the screening checkpoint, prior to
entering the sterile area of the airport. Officials in the Office
of Passengers with Disabilities also recommended changes to checkpoint
screening procedures. For example, in the interest of disabled
passengers, they suggested that TSOs be required to refasten all
wheelchair straps and buckles undone during the screening process.
Last, TSA senior leadership suggested 5 of the 92 procedural changes
considered by TSA between April 2005 and December 2005. TSA senior
leadership also proposed a procedure that would allow TSOs to conduct
the pat-down procedure on passengers of the opposite gender at airports
with a disproportionate ratio of male and female TSOs.
[End of section]
Appendix III: Comments from the Department of Homeland Security:
U.S. Department of Homeland Security:
Washington, DC 20528:
March 6, 2007:
Ms. Cathleen A. Berrick:
Director, Homeland Security and Justice Issues:
U.S. Government Accountability Office:
441 G Street NW:
Washington, DC 20548:
Dear Ms. Berrick:
Thank you for the opportunity to comment on the draft report: "Risk,
Experience, and Customer Concerns Drive Changes to Airline Passenger
Screening Procedures, but Evaluation and Documentation of Proposed
Changes Could be Improved (GAO-07-57SU)." The Department of Homeland
Security (DHS) and Transportation Security Administration (TSA)
appreciate GAO's work in planning, conducting, and issuing this report.
The Department and TSA appreciate GAO's conclusion that we have
implemented a reasonable approach to modifying passenger checkpoint
screening procedures through our assessment of risk factors, the
expertise of TSA employees, and input from the traveling public and
other stakeholders, as well as our efforts to balance security,
operational efficiency, and customer service while evaluating proposed
changes. In its report, GAO recommends that TSA improve its methods to
evaluate and document changes to passenger screening procedures. TSA
agrees that there is opportunity for improvement and will continue to
take steps to use sound evaluation methods, when possible, to assess
the impact of proposed changes. However, the urgency and nature of some
procedural changes may not always lend themselves to a resource-
consuming (time and personnel) evaluation.
Above all, TSA must remain flexible and able to respond quickly to
disrupt new terrorist threats. TSA demonstrated such flexibility in
August 2006, when the challenge presented by the United Kingdom
Terrorist Bomb Plot caused us to quickly and effectively implement
important changes to airport security. In just a few hours, literally
overnight, TSA rolled out a new airport security checkpoint process at
every airport in the Nation. We trained tens of thousands of
Transportation Security Officers (TSOs) and rewrote dozens of
regulations affecting aviation security around the globe. In addition,
TSA effected changes world-wide for every flight bound for the United
States and silently (as always) deployed hundreds of Federal Air
Marshals to saturate affected flights flown by U.S. carriers. TSA
accomplished all of this while maintaining effective security - that
demonstrates flexibility. TSA will continue to be challenged by other
terrorist plots and will be prepared for the unknown new threat as well
as address all the known threats; this puts a priority on flexibility.
In December 2005, TSA also demonstrated the flexibility necessary to
protect the Nation's transportation system by making small but
important changes to the Prohibited Items List, which allowed TSOs to
focus more effort on detecting high-risk threats which have the ability
to cause catastrophic damage to an airplane in flight (e.g., Improvised
Explosive Devices [IEDs]). GAO, in its report, states that TSA's data
collection and analysis efforts, in this case, do not allow TSA to
conclude with certainty that these changes did indeed free up TSO
resources. TSA disagrees as our decision to remove small scissors and
small tools from the Prohibited Items List was not only based on an
analysis of data but was also firmly rooted in our assessment of risk,
professional judgment, and experience. For example, when considering
the changes, experienced Federal Security Directors (FSDs) who were
interviewed unanimously indicated the changes would free up TSO
resources.
To further support this theory, TSA Headquarters officials visited
several airports to observe firsthand the types of items surrendered
by passengers. TSA also considered standard values for how long it
takes to conduct a bag search along with data indicating how many sharp
items are collected. While this particular data collection and analysis
effort may not have been methodologically rigorous, it did serve to
provide us with insights regarding the type and quantity of items
collected at the passenger checkpoint. Perhaps most importantly, post-
change interviews with TSOs (TSA personnel with the most hands-on
experience concerning prohibited items) confirmed this change did
indeed save time. In sum, data collection and analysis, along with
several other factors, played a role in TSA's decision-making process.
GAO further states in its report that TSA could use a methodology
employed by Customs and Border Protection (CBP) to assess whether proposed
screening procedures enhance detection. GAO suggests that, like CBP,
TSA could compare items collected from passengers identified by TSA's
program for Screening Passengers by Observation Technique (SPOT) to
those collected from a random sample of passengers. TSA agrees in
principle, that measuring SPOT effectiveness, if possible, may provide
valuable insights. However, CBP's behavior detection program seems to
be fundamentally different from TSA's SPOT program; consequently, such
a comparison may not be valid.
CBP and TSA are seeking to identify two completely different types of
threats: (1) CBP hopes to identify passengers who are attempting to
smuggle contraband or attempting to enter the country illegally; while
(2) TSA, on the other hand, seeks to identify passengers who intend to
cause catastrophic harm onboard an aircraft. In CBP's case, possession
of illegal contraband or fraudulent documentation is the target
variable. Possession of a prohibited item is not a good measure of SPOT
effectiveness, in TSA's case, since it does not include those
individuals who would plot to cause harm or commandeer an aircraft
without the benefit of a prohibited item. In addition, possession of a
prohibited item is often an oversight and not an intentional act. TSA's
SPOT program makes a significant contribution to passenger security as
a deterrent. As GAO indicates in its report, deterrence is not readily
measured.
GAO also indicates that TSA could have used covert testing to measure
the effectiveness of the Unpredictable Screening Process (USP). The
basic premise of USP is based on risk and current understanding of
terrorist tradecraft. For example, it is thought that terrorists wait
for opportunities where the likelihood of success is greater. Based on
historical intelligence, we know that terrorists conduct pre-
operational surveillance prior to an attack in an attempt to understand
all aspects of the security processes and use that knowledge against
their enemy. USP is a mechanism that effectively adds a layer of
unpredictability to the screening process and ensures terrorists
cannot use standardized and predictable security protocols against us.
By its very design, TSOs understand exactly what process they
should use at a given time, but terrorists will not be able to
predict exactly the type of screening they will undergo. Because the
security process is now unpredictable, terrorists are prevented from
ever developing the necessary confidence they require for their efforts
to succeed. Deterrence is not measured through covert testing.
Finally, TSA appreciates GAO's conclusion that it is important for TSA
to continue taking steps, such as automating Performance Accountability
and Standards System (PASS) data-entry functions, to address resource
challenges. TSA has organized a multi-disciplined team of subject
matter experts (SMEs) under one umbrella, the Optimization Program, to
help FSDs improve screening efficiency, effectiveness, and address
other resource challenges. SMEs scrutinize aspects of the operation,
such as screening processes, manpower allocation, equipment mix, and
safety programs. The optimization team develops recommendations and
then works with the FSD to ensure implementation achieves the desired
efficiency or effectiveness goal.
In summary, TSA is committed to protecting the security of the
traveling public. To this end, TSA must be flexible and able to adapt
quickly to changes in terrorist tactics. When possible, TSA will
continue to supplement our assessment of risk, professional judgment,
and experience with empirical data and analysis, while optimizing
resources and screening operations.
The following represents our responses to the recommendations.
Recommendation 1: When operationally testing proposed SOP
modifications, develop sound evaluation methods, when possible, that
can be used to assist TSA in determining whether proposed SOP
procedures would achieve their intended result, such as enhancing TSA's
ability to detect prohibited items and suspicious persons and freeing
up existing TSO resources that could be used to implement proposed
procedures:
Concur: TSA will continue to perform operational testing of proposed
SOP modifications, when practicable, as a method for determining their
impact on passenger checkpoint security effectiveness and efficiency.
Moving forward, TSA intends to make better use of generally accepted
research design principles and techniques. Some examples include:
* clearly defined testing protocols and success criteria (independent
and dependent variables);
* test objectives which are directly linked to program objectives;
* pre-test/post-test and longitudinal studies;
* control groups to isolate the effect of chance and other confounding
variables;
* use of:
- representative samples,
- randomization, and
- statistical analysis.
TSA Office of Security Operations has already begun partnering with
other agency SMEs to ensure operational tests are well designed,
executed, and produce results that are scientifically valid and
reliable. Of course, the need to dispatch SOP changes to combat an
immediate terrorist threat may, at times, preclude operational testing
of new procedures.
Recommendation 2: For future proposed SOP modifications that TSA senior
leadership determines are significant, generate and maintain
documentation to include, at minimum, the source, intended purpose, and
reasoning behind decisions to implement or not implement proposed
modifications.
Concur: As noted in this report, TSA maintained the recommended
documentation for the majority of the SOP issues studied by GAO. TSA
intends to raise this standard by establishing additional processes and
controls to ensure that all SOP change proposals that could have a
significant impact on resources or the integrity of the process of
ensuring security are documented as recommended. These controls are
being implemented in the current SOP change cycle. For each level of
review (consolidation, interagency staff, and senior decision making),
documentation includes the source, intent, and reasoning for
implementing or rejecting those SOP change proposals determined to be
significant by senior leadership.
TSA has already begun making progress implementing GAO's
recommendations. This progress demonstrates our commitment to continual
improvement to ensure the security of the traveling public.
Thank you for the opportunity to provide comments to the draft report.
Sincerely,
Signed by:
Steven J. Pecinovsky:
Director:
Departmental GAO/OIG Liaison Office:
[End of section]
Appendix IV: GAO Contact and Staff Acknowledgments:
GAO Contact:
Cathleen A. Berrick, (202) 512-3404 or berrickc@gao.gov:
Acknowledgments:
In addition to the person named above, Maria Strudwick, Assistant
Director; David Alexander; Christopher W. Backley; Amy Bernstein;
Kristy Brown; Yvette Gutierrez-Thomas; Katherine N. Haeberle; Robert D.
Herring; Richard Hung; Christopher Jones; Stanley Kostyla; and Laina
Poon made key contributions to this report.
[End of section]
Related GAO Products:
Aviation Security: TSA's Staffing Allocation Model Is Useful for
Allocating Staff among Airports, but Its Assumptions Should Be
Systematically Reassessed. GAO-07-299. Washington, D.C.: February 28,
2007.
Aviation Security: Progress Made in Systematic Planning to Guide Key
Investment Decisions, but More Work Remains. GAO-07-448T. Washington,
D.C.: February 13, 2007.
Homeland Security: Progress Has Been Made to Address the
Vulnerabilities Exposed by 9/11, but Continued Federal Action Is Needed
to Further Mitigate Security Risks. GAO-07-375. Washington, D.C.:
January 24, 2007.
Aviation Security: TSA Oversight of Checked Baggage Screening
Procedures Could Be Strengthened. GAO-06-869. Washington, D.C.: July 28,
2006.
Aviation Security: TSA Has Strengthened Efforts to Plan for the Optimal
Deployment of Checked Baggage Screening Systems, but Funding
Uncertainties Remain. GAO-06-875T. Washington, D.C.: June 29, 2006.
Aviation Security: Management Challenges Remain for the Transportation
Security Administration's Secure Flight Program. GAO-06-864T.
Washington, D.C.: June 14, 2006.
Aviation Security: Further Study of Safety and Effectiveness and Better
Management Controls Needed if Air Carriers Resume Interest in Deploying
Less-than-Lethal Weapons. GAO-06-475. Washington, D.C.: May 26, 2006.
Aviation Security: Enhancements Made in Passenger and Checked Baggage
Screening, but Challenges Remain. GAO-06-371T. Washington, D.C.: April
4, 2006.
Aviation Security: Transportation Security Administration Has Made
Progress in Managing a Federal Security Workforce and Ensuring Security
at U.S. Airports, but Challenges Remain. GAO-06-597T. Washington, D.C.:
April 4, 2006.
Aviation Security: Progress Made to Set Up Program Using Private-Sector
Airport Screeners, but More Work Remains. GAO-06-166. Washington, D.C.:
March 31, 2006.
Aviation Security: Significant Management Challenges May Adversely
Affect Implementation of the Transportation Security Administration's
Secure Flight Program. GAO-06-374T. Washington, D.C.: February 9, 2006.
Aviation Security: Federal Air Marshal Service Could Benefit from
Improved Planning and Controls. GAO-06-203. Washington, D.C.: November
28, 2005.
Aviation Security: Federal Action Needed to Strengthen Domestic Air
Cargo Security. GAO-06-76. Washington, D.C.: October 17, 2005.
Transportation Security Administration: More Clarity on the Authority
of Federal Security Directors Is Needed. GAO-05-935. Washington, D.C.:
September 23, 2005.
Aviation Security: Flight and Cabin Crew Member Security Training
Strengthened, but Better Planning and Internal Controls Needed. GAO-05-
781. Washington, D.C.: September 6, 2005.
Aviation Security: Transportation Security Administration Did Not Fully
Disclose Uses of Personal Information during Secure Flight Program
Testing in Initial Privacy Notes, but Has Recently Taken Steps to More
Fully Inform the Public. GAO-05-864R. Washington, D.C.: July 22, 2005.
Aviation Security: Better Planning Needed to Optimize Deployment of
Checked Baggage Screening Systems. GAO-05-896T. Washington, D.C.: July
13, 2005.
Aviation Security: Screener Training and Performance Measurement
Strengthened, but More Work Remains. GAO-05-457. Washington, D.C.: May
2, 2005.
Aviation Security: Secure Flight Development and Testing Under Way, but
Risks Should Be Managed as System Is Further Developed. GAO-05-356.
Washington, D.C.: March 28, 2005.
Aviation Security: Systematic Planning Needed to Optimize the
Deployment of Checked Baggage Screening Systems. GAO-05-365.
Washington, D.C.: March 15, 2005.
Aviation Security: Measures for Testing the Effect of Using Commercial
Data for the Secure Flight Program. GAO-05-324. Washington, D.C.:
February 23, 2005.
Transportation Security: Systematic Planning Needed to Optimize
Resources. GAO-05-357T. Washington, D.C.: February 15, 2005.
Aviation Security: Preliminary Observations on TSA's Progress to Allow
Airports to Use Private Passenger and Baggage Screening Services. GAO-
05-126. Washington, D.C.: November 19, 2004.
General Aviation Security: Increased Federal Oversight Is Needed, but
Continued Partnership with the Private Sector Is Critical to Long-Term
Success. GAO-05-144. Washington, D.C.: November 10, 2004.
Aviation Security: Further Steps Needed to Strengthen the Security of
Commercial Airport Perimeters and Access Controls. GAO-04-728.
Washington, D.C.: June 4, 2004.
Transportation Security Administration: High-Level Attention Needed to
Strengthen Acquisition Function. GAO-04-544. Washington, D.C.: May 28,
2004.
Aviation Security: Challenges in Using Biometric Technologies. GAO-04-
785T. Washington, D.C.: May 19, 2004.
Nonproliferation: Further Improvements Needed in U.S. Efforts to
Counter Threats from Man-Portable Air Defense Systems. GAO-04-519.
Washington, D.C.: May 13, 2004.
Aviation Security: Private Screening Contractors Have Little
Flexibility to Implement Innovative Approaches. GAO-04-505T.
Washington, D.C.: April 22, 2004.
Aviation Security: Improvement Still Needed in Federal Aviation
Security Efforts. GAO-04-592T. Washington, D.C.: March 30, 2004.
Aviation Security: Challenges Delay Implementation of Computer-
Assisted Passenger Prescreening System. GAO-04-504T. Washington, D.C.:
March 17, 2004.
Aviation Security: Factors Could Limit the Effectiveness of the
Transportation Security Administration's Efforts to Secure Aerial
Advertising Operations. GAO-04-499R. Washington, D.C.: March 5, 2004.
Aviation Security: Computer-Assisted Passenger Prescreening System
Faces Significant Implementation Challenges. GAO-04-385. Washington,
D.C.: February 13, 2004.
Aviation Security: Challenges Exist in Stabilizing and Enhancing
Passenger and Baggage Screening Operations. GAO-04-440T. Washington,
D.C.: February 12, 2004.
The Department of Homeland Security Needs to Fully Adopt a Knowledge-
based Approach to Its Counter-MANPADS Development Program. GAO-04-341R.
Washington, D.C.: January 30, 2004.
Aviation Security: Efforts to Measure Effectiveness and Strengthen
Security Programs. GAO-04-285T. Washington, D.C.: November 20, 2003.
Aviation Security: Federal Air Marshal Service Is Addressing Challenges
of Its Expanded Mission and Workforce, but Additional Actions Needed.
GAO-04-242. Washington, D.C.: November 19, 2003.
Aviation Security: Efforts to Measure Effectiveness and Address
Challenges. GAO-04-232T. Washington, D.C.: November 5, 2003.
Airport Passenger Screening: Preliminary Observations on Progress Made
and Challenges Remaining. GAO-03-1173. Washington, D.C.: September 24,
2003.
Aviation Security: Progress since September 11, 2001, and the
Challenges Ahead. GAO-03-1150T. Washington, D.C.: September 9, 2003.
Transportation Security: Federal Action Needed to Enhance Security
Efforts. GAO-03-1154T. Washington, D.C.: September 9, 2003.
Transportation Security: Federal Action Needed to Help Address Security
Challenges. GAO-03-843. Washington, D.C.: June 30, 2003.
Federal Aviation Administration: Reauthorization Provides Opportunities
to Address Key Agency Challenges. GAO-03-653T. Washington, D.C.: April
10, 2003.
Transportation Security: Post-September 11th Initiatives and Long-Term
Challenges. GAO-03-616T. Washington, D.C.: April 1, 2003.
Airport Finance: Past Funding Levels May Not Be Sufficient to Cover
Airports' Planned Capital Development. GAO-03-497T. Washington, D.C.:
February 25, 2003.
Transportation Security Administration: Actions and Plans to Build a
Results-Oriented Culture. GAO-03-190. Washington, D.C.: January 17,
2003.
Aviation Safety: Undeclared Air Shipments of Dangerous Goods and DOT's
Enforcement Approach. GAO-03-22. Washington, D.C.: January 10, 2003.
Aviation Security: Vulnerabilities and Potential Improvements for the
Air Cargo System. GAO-03-344. Washington, D.C.: December 20, 2002.
Aviation Security: Registered Traveler Program Policy and
Implementation Issues. GAO-03-253. Washington, D.C.: November 22, 2002.
Airport Finance: Using Airport Grant Funds for Security Projects Has
Affected Some Development Projects. GAO-03-27. Washington, D.C.:
October 15, 2002.
Commercial Aviation: Financial Condition and Industry Responses Affect
Competition. GAO-03-171T. Washington, D.C.: October 2, 2002.
Aviation Security: Transportation Security Administration Faces
Immediate and Long-Term Challenges. GAO-02-971T. Washington, D.C.: July
25, 2002.
Aviation Security: Information Concerning the Arming of Commercial
Pilots. GAO-02-822R. Washington, D.C.: June 28, 2002.
Aviation Security: Vulnerabilities in, and Alternatives for, Preboard
Screening Security Operations. GAO-01-1171T. Washington, D.C.:
September 25, 2001.
Aviation Security: Weaknesses in Airport Security and Options for
Assigning Screening Responsibilities. GAO-01-1165T. Washington, D.C.:
September 21, 2001.
Homeland Security: A Framework for Addressing the Nation's Efforts. GAO-
01-1158T. Washington, D.C.: September 21, 2001.
Aviation Security: Terrorist Acts Demonstrate Urgent Need to Improve
Security at the Nation's Airports. GAO-01-1162T. Washington, D.C.:
September 20, 2001.
Aviation Security: Terrorist Acts Illustrate Severe Weaknesses in
Aviation Security. GAO-01-1166T. Washington, D.C.: September 20, 2001.
FOOTNOTES
[1] In addition to passenger checkpoint screening, TSA's layers of
aviation security include, among other things, the screening of all
checked baggage for explosives and the deployment of Federal Air
Marshals on designated high-risk flights.
[2] Specifically, TSA modified the list of items prohibited and
permitted on aircraft by allowing metal scissors with pointed tips and
a cutting edge of 4 inches or less, as measured from the fulcrum, and
small tools of 7 inches or less, including screwdrivers, wrenches, and
pliers, to pass through the passenger screening checkpoint. See 70 Fed.
Reg. 72,930 (Dec. 8, 2005).
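For illustration only, the following minimal Python sketch restates the measurement thresholds described in footnote 2; the function names and example values are introduced here and are not drawn from TSA's SOP.

# Illustrative restatement of the December 2005 prohibited items list
# change described in footnote 2; not TSA code. Thresholds are from 70
# Fed. Reg. 72,930: metal scissors with a cutting edge of 4 inches or
# less (measured from the fulcrum) and small tools of 7 inches or less.

def scissors_permitted(cutting_edge_inches_from_fulcrum: float) -> bool:
    """Return True if metal scissors meet the 4-inch cutting-edge limit."""
    return cutting_edge_inches_from_fulcrum <= 4.0

def tool_permitted(length_inches: float) -> bool:
    """Return True if a small tool (screwdriver, wrench, pliers) meets the 7-inch limit."""
    return length_inches <= 7.0

# Hypothetical examples.
print(scissors_permitted(3.5))  # True: within the 4-inch limit
print(tool_permitted(8.0))      # False: exceeds the 7-inch limit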
[3] We plan to issue a report on the impact of the prohibited items
list changes on public safety and screening operations later this year.
[4] We began our review period in April 2005 to coincide with TSA's
consideration of proposed SOP modifications related to the second major
revision of the passenger checkpoint screening SOP since TSA's
inception.
[5] TSA security activities at airports are overseen by FSDs. Each FSD
is responsible for overseeing security activities, including passenger
screening, at one or more commercial airports. We visited or conducted
phone interviews with officials at 25 airports. However, we met with
only 24 FSDs, as 1 FSD was responsible for 2 of the airports we
visited.
[6] The Aviation and Transportation Security Act (ATSA), Pub. L. No.
107-71, 115 Stat. 597 (2001), established TSA and assigned it the
responsibility of building a federal workforce to conduct screening of
airline passengers and their checked baggage. See 49 U.S.C. §§ 114(a),
44901(a). ATSA also required that TSA allow commercial airports to
apply to TSA to transition from a federal to a private screener
workforce. See 49 U.S.C. § 44920. To support this effort, TSA created
the Screening Partnership Program to allow all commercial airports an
opportunity to apply to TSA for permission to use qualified private
screening contractors and private screeners. There are currently 6
airports participating in the Screening Partnership Program, including
Jackson Hole, Kansas City International, Greater Rochester
International, San Francisco International, Sioux Falls Regional, and
Tupelo Regional.
[7] We used the following criteria to identify aviation security
experts: present and past employment in aviation security, depth of
experience in aviation security, and recognition in the aviation
industry.
[8] GAO, Internal Control: Standards for Internal Control in the
Federal Government, GAO/AIMD-00-21.3.1 (Washington, D.C.: November 1999).
[9] Covert testing involves TSA headquarters officials (national
testing) or TSA field staff and other federal employees (local testing)
attempting to carry simulated threat objects through the checkpoint
without the objects being detected by TSOs. The results of the national
covert tests are classified and therefore are not included in this
report.
[10] GAO, Transportation Security Administration: Actions and Plans to
Build a Results-Oriented Culture, GAO-03-190 (Washington, D.C.: January
2003).
[11] The results of local covert testing are sensitive security
information and, therefore, are not included in this report.
[12] Sterile areas are located within the terminal where passengers are
provided access to boarding aircraft. Access to these areas is
controlled by Transportation Security Officers (or by nonfederal
screeners at airports participating in the Screener Partnership
Program) at checkpoints where they conduct physical screening of
individuals and their carry-on baggage for weapons, explosives, and
other prohibited items.
[13] Transportation Security Officers must deny passage beyond the
screening location to any individual or property that has not been
screened or inspected in accordance with passenger screening standard
operating procedures. If an individual refuses to permit inspection of
any item, that item must not be allowed into the sterile area or
onboard an aircraft.
[14] CAPPS is a computer-assisted system that, based on information
obtained from airline reservation systems, identifies passengers that
may pose a high risk to aviation security. These high-risk passengers
and their carry-on baggage are subject to additional and more thorough
screening.
[15] GAO, Aviation Security: Screener Training and Performance
Measurement Strengthened, but More Work Remains, GAO-05-457
(Washington, D.C.: May 2, 2005).
[16] Private screeners conduct passenger and checked baggage screening
at six airports as part of TSA's Screening Partnership Program. TSA
requires that private screeners screen passengers using the same
standard operating procedures as TSOs.
[17] Between April 2005 and December 2005, TSA considered a total of
189 proposed modifications to passenger checkpoint screening SOPs.
However, 97 of the proposed modifications were not intended to alter
the way in which passengers and their carry-on items are screened;
rather, these modifications were generally intended to correct, edit,
or clarify SOP language. For example, TSA modified SOP language to
ensure that TSA field staff were aware that tribal law enforcement
officers should be granted the same screening exemptions as other law
enforcement officers. TSA also amended the SOP to help ensure the
occupational safety of TSOs. For example, TSA headquarters officials
proposed that procedures for reporting potential radiation hazards
regarding X-ray equipment be incorporated into the SOP. The remaining
92 proposed SOP modifications were intended to alter the way in which
passengers and their carry-on items were screened, and 48 of those
proposed modifications were subsequently implemented.
[18] TSA issued six revised versions of the passenger checkpoint
screening SOP during the 9-month period under review: April 7, 2005;
July 7, 2005; August 26, 2005; September 12, 2005; October 25, 2005;
and December 7, 2005. However, we did not include the April 2005
revised SOP in our review since the changes incorporated in that
revision were deliberated by TSA officials outside of our 9-month
period of review.
[19] Of the 48 proposed modifications that were implemented, TSA made
the decision to implement 16 of these modifications following our 9-
month review period. However, because much of TSA's deliberation of
these 16 procedures occurred during our review period, we included
these procedures among those that were implemented.
[20] The Office of Security Operations is the TSA division responsible
for overseeing the implementation of passenger and property screening
at airport checkpoints.
[21] In order to achieve its goal of improving IED detection, in
addition to modifying passenger checkpoint screening SOPs, the task
force established several initiatives, including enhanced bomb
detection training for TSOs and increased use of explosives detection
canine teams.
[22] Three of the 92 proposed SOP modifications were considered by TSA
under both processes.
[23] The number of airports at which any one proposed change was pilot
tested ranged from 3 to 14, and the duration of the pilot testing
ranged from 5 days to several weeks.
[24] The number of proposed SOP modifications that fall under the
various "basis" categories (e.g., threat and vulnerability information)
does not total 92 because documentation was not available for all
proposed modifications and some of the proposed modifications had more
than one basis.
[25] We did not assess the quality of the intelligence information used
by TSA's Office of Intelligence and Analysis to generate its civil
aviation threat assessments.
[26] The pat-down procedure is performed for three purposes: (1) as a
substitute for walk-through metal detector screening, (2) to resolve
walk-through metal detector alarms, and (3) as a standard procedure for
screening passengers selected for additional screening. The details of
the pat-down procedures are sensitive security information and are not
discussed in this report.
[27] The recommendations made by the Office of Inspection are sensitive
security information or classified information. Therefore, they are not
discussed in this report.
[28] The DHS Office of Inspector General conducts similar covert tests,
and historically has recommended changes to the passenger checkpoint
screening SOP as a result of these tests. However, the Office of
Inspector General did not make any recommendations that resulted in
procedural changes between April 2005 and December 2005.
[29] A risk-based approach generally involves consideration of the
following when making decisions: threat--capability and intent of
terrorists to carry out an attack, vulnerability--weakness that may be
exploited by identified threats, and criticality or consequence--the
impact of an attack if it were to be carried out.
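For illustration only, the following minimal Python sketch shows one common way to combine the three factors in footnote 29 into a single relative score; this multiplicative formulation is an assumption introduced here, not the approach described in this report, and the ratings are hypothetical.

# One common way (assumed here, not drawn from this report) to combine
# the three risk factors into a relative score: each factor is rated on
# a 0-1 scale and the ratings are multiplied.
def relative_risk(threat: float, vulnerability: float, consequence: float) -> float:
    """Multiply 0-1 ratings of threat, vulnerability, and consequence."""
    for value in (threat, vulnerability, consequence):
        if not 0.0 <= value <= 1.0:
            raise ValueError("ratings must be between 0 and 1")
    return threat * vulnerability * consequence

# Hypothetical example: high threat, moderate vulnerability, severe consequence.
print(relative_risk(0.8, 0.5, 0.9))  # 0.36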
[30] Explosives Trace Portal screening entails a passenger stepping
into the portal, after which puffs of air are emitted onto the
passenger. The portal then draws in any residue that was loosened as a
result of the puffs of air, and analyzes the residue to determine if
there are explosive traces.
[31] TSA defines SOP modifications related to efficiency as changes
that will improve screening flow, clarify TSO duties, update equipment
procedures, or enhance the working environment of screening locations.
[32] The SOP modifications made by TSA on August 10, 2006, August 12,
2006, September 26, 2006, and November 21, 2006, were designed to
address only one particular hydrogen peroxide-based liquid explosives
mixture, which, according to TSA officials, was the same mixture that
the alleged terrorists had planned to detonate on U.S.-bound flights
originating in the United Kingdom. DHS and FBI have identified
additional liquid explosives mixtures that could pose a threat to
commercial aviation. DHS has ongoing evaluations of the additional
mixtures to determine their explosive potential and the extent of
damage that detonation of these mixtures could cause to an aircraft.
DHS is also evaluating explosives detection technology to determine the
extent to which it can be used at the checkpoint to defend against the
liquid explosives threat. We are currently evaluating DHS's and TSA's
progress in planning for, managing, and deploying research and
development programs in support of airport checkpoint screening
operations. We expect to report on our results in August 2007.
[33] In February 2007, DHS Science and Technology directorate conducted
aircraft vulnerability tests to determine the extent of damage the
liquid explosives that were to be used in the alleged August 2006
London terror plot would cause to an aircraft. The results of these
tests, however, are sensitive security information and are not included
in this report.
[34] The intelligence information regarding the August 2006 London
terror plot is classified and, therefore, is not included in this
report.
[35] In the event that TSOs cannot determine the reason for a
passenger's suspicious behavior, the TSO refers the passenger to law
enforcement officials. TSA officials responsible for SPOT told us that
in designing the implementation of SPOT, they worked closely with FBI
staff, Secret Service staff, Israeli security experts, and state police
with experience in recognizing suspicious behaviors.
[36] Another SOP change was operationally tested and subsequently
rejected. TSA did not provide documentation or other information on the
reason it was rejected.
[37] SPOT was operationally tested at 1 airport beginning in December
2003, at 2 additional airports beginning in October 2004, and at 2
other airports beginning in October 2005. The remaining 9 airports
began participating in the operational testing of SPOT in December
2005.
[38] Statistically significant means that it is highly unlikely to
obtain a difference of a given size or more by chance, assuming that
there is actually no difference in the probability of finding
prohibited items between targeted and randomly selected passengers.
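For illustration only, the following minimal Python sketch shows one standard way such a comparison could be tested, a two-sample test for a difference in proportions; the passenger and item counts are hypothetical and are not TSA or CBP data.

# Illustrative two-proportion z-test of the kind of comparison footnote
# 38 describes: whether prohibited items are found more often among
# targeted passengers than among randomly selected passengers. The
# counts below are hypothetical.
import math

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """Two-sample z-test for a difference in proportions (pooled variance)."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    p_pool = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value
    return z, p_value

# Hypothetical example: 60 finds in 1,000 targeted searches versus
# 35 finds in 1,000 randomly selected searches.
z, p = two_proportion_z(60, 1000, 35, 1000)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")  # a small p suggests a real difference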
[39] CBP officials could not comment on whether a similar methodology
could be used by TSA, since they were not familiar with the SPOT
procedure.
[40] Following the September 11 terrorist attacks, the items terrorists
reportedly used to carry out the attacks--box cutters--were
subsequently prohibited onboard aircraft.
[41] The fifth expert we interviewed said that he was uncertain how to
assess the effectiveness of passenger checkpoint screening procedures.
[42] TSA's Performance Management Information System is designed to
collect, analyze, and report passenger and baggage screening
performance data, such as wait times at selected airports, workload
data, and the performance and utilization of passenger and baggage
screening equipment. TSA headquarters uses PMIS data to support
external reporting on performance and internal decision-making
processes.
[43] To conduct our analysis we used TSA data that showed (1) it takes,
on average, about 1.89 minutes to conduct a bag search that was
initiated because a TSO identified a prohibited item (such as a pair of
scissors) in the X-ray image of a carry-on bag--this average search
time was derived from an informal TSA property search time study
conducted at 9 airports--and (2) there were 28,785 actual full-time-
equivalent (FTE) passenger screening TSOs during fiscal year 2005. One
FTE is equal to 1 work year or 2,080 nonovertime hours. To determine
the number of minutes per day, on average, each TSO spent searching for
sharp objects found during the 6-month period, we took the following
steps. First, we calculated the total amount of time (in minutes) taken
to conduct the searches by multiplying the number of sharp objects
found (1,762,571) by the average time to conduct targeted searches
(1.89 minutes), assuming that one item was found per search. Converted
from minutes to hours, this totaled about 55,521 hours. Next, we
calculated the amount of time, on
average, each TSO spent searching for the sharp objects found by
dividing 55,521 hours by 28,785 TSO FTEs. The result was 1.93 hours per
TSO. Finally, we converted average hours per TSO to minutes and divided
by 130 days--the number of days worked by a TSO for 26 weeks over a 6-
month period (assuming 5 work days per week at 8 hours per day). The
result was an average of 0.89 minutes per day per TSO over the 6-month
period.
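For illustration only, the following minimal Python sketch reproduces the arithmetic described in footnote 43 using the figures cited there; only the variable names are introduced here.

# Reproduces the footnote 43 calculation with the figures GAO cites.
sharp_objects_found = 1_762_571      # items found over the 6-month period
avg_search_minutes = 1.89            # average targeted bag-search time
tso_ftes = 28_785                    # fiscal year 2005 passenger screening FTEs
workdays_in_six_months = 130         # 26 weeks x 5 workdays per week

total_search_hours = sharp_objects_found * avg_search_minutes / 60
hours_per_tso = total_search_hours / tso_ftes
minutes_per_tso_per_day = hours_per_tso * 60 / workdays_in_six_months

print(round(total_search_hours))          # about 55,521 hours in total
print(round(hours_per_tso, 2))            # about 1.93 hours per TSO
print(round(minutes_per_tso_per_day, 2))  # about 0.89 minutes per TSO per day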
[44] The number of bags searched is sensitive security information.
[45] The results of the informal follow-on studies, which were
conducted at 6 to 9 airports, show that the percentage of carry-on bags
searched increased slightly at the time of the 30-day study, then
decreased slightly at the time of the 60-day and 90-day studies,
respectively. However, the results of these informal studies may not be
reliable due to the limitations in the methodology TSA used to conduct
the studies. Specifically, TSA did not use a methodology that would
control for factors other than the prohibited items list change that
may influence the percentage of carry-on bags searched by TSOs. To do
this, TSA would have had to develop a formal, systematic methodology
for randomly selecting various times of day, location of checkpoints,
number of checkpoints, and so on for data collection. By not
controlling for such factors, TSA may not know the extent to which a
reduction in the percentage of carry-on bags searched is due to the
prohibited items list changes.
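For illustration only, the following minimal Python sketch shows one way observation slots could be drawn at random across checkpoints and times of day, the kind of systematic selection footnote 45 describes as missing; the airport, checkpoint, and time-block names are hypothetical, and this is not TSA's method.

# Randomly draw (airport, checkpoint, time block) observation slots so
# that before/after comparisons of bag-search rates are not driven by
# when and where data happened to be collected. Names are hypothetical.
import random

checkpoints = {"Airport A": [1, 2, 3], "Airport B": [1, 2], "Airport C": [1, 2, 3, 4]}
time_blocks = ["0600-0900", "0900-1200", "1200-1500", "1500-1800", "1800-2100"]

def draw_observation_slots(n_per_airport, seed=2005):
    """Randomly sample observation slots within each airport."""
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    slots = []
    for airport, checkpoint_ids in checkpoints.items():
        frame = [(airport, cp, tb) for cp in checkpoint_ids for tb in time_blocks]
        slots.extend(rng.sample(frame, n_per_airport))
    return slots

for slot in draw_observation_slots(n_per_airport=3):
    print(slot)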
[46] TSA officials told us that TSA's Office of Intelligence assessed
the potential impact each of these CAPPS changes would have on security
and, based on its analysis, determined that none of the CAPPS changes
would compromise security.
[47] Passengers can be selected for secondary screening through CAPPS
or other TSA-approved processes, such as the Selectee List. CAPPS rules
are sensitive security information and, therefore, are not discussed in
this report.
[48] Ticket checkers are aircraft operator or TSA employees who are
positioned before the screening checkpoint to perform identification
check and sterile area access responsibilities as required by TSA. For
passengers, ticket checkers verify travel documents and make sure the
identifying information on the travel document is consistent with the
information on the individual's personal identification documents
(e.g., licenses or passports). Ticket checkers are also responsible
for directing passengers designated as selectees to the appropriate
screening lane. For nonpassengers, ticket checkers verify required
identification before allowing access to the sterile area.
[49] The task force reported that 1 FSD was unsure of the security
benefits provided by USP, though this FSD did support the concept of
introducing unpredictability into the screening process.
[50] Of the remaining 6 FSDs, 5 said that TSO resources were freed up
as a result of the prohibited items list and CAPPS rules changes, and 1
was uncertain whether TSO resources were actually freed up.
[51] Since its inception in November 2001, TSA has had multiple
Assistant Secretaries (originally titled Under Secretaries of
Transportation for Security). In addition, between January 2005 and
August 2006, TSA issued seven press releases regarding senior-level
personnel changes within the agency.
[52] GAO, Internal Control: Internal Control Management and Evaluation
Tool, GAO-01-1008G (Washington, D.C.: August 2001).
[53] In July 2005, prior to the implementation of PASS, TSA required
all FSDs to implement an audit program of screening checkpoint
operations, primarily focused on assessing TSO compliance with
checkpoint screening SOPs. Specifically, each airport is to have an
audit program that evaluates TSOs' ability to detect threat objects
taken through the checkpoint, as well as TSOs' compliance with SOPs for
screening passengers and their accessible property. The audit program
is also intended to evaluate screening supervisors' and lead TSOs'
compliance with the SOP.
[54] GAO-03-190.
[55] ATSA requires that each TSO receive an annual proficiency review
to ensure he or she continues to meet all qualifications and standards
required to perform screening functions. See 49 U.S.C. § 44935(f)(5).
[56] GAO-01-1008G.
[57] As of February 2006, STEA test results had been recorded for a
total of 417 airports.
[58] The results of STEA testing are sensitive security information
and, therefore, are not included in this report.
[59] As of December 2006, TSA was in the process of modifying STEA into
a performance measurement program. TSA plans to implement the new STEA
program during the second quarter of fiscal year 2007.
[60] The covert testing results, including the reasons for failure, and
the recommendations made by the Office of Inspection are classified and
cannot be discussed in this report.
[61] We began our review period in April 2005 to coincide with TSA's
consideration of proposed SOP modifications related to the second major
revision of the passenger checkpoint screening SOP since TSA's
inception.
[62] We did not assess all of the proposed SOP modifications associated
with the SOP revisions issued between August 2006 and November 2006;
rather, we only reviewed the proposed modifications associated with
screening for liquids, gels, and aerosols.
[63] GAO/AIMD-00-21.3.1.
[64] We visited 25 airports. However, we met with only 24 FSDs, as 1
FSD was responsible for 2 of the airports we visited.
[65] GAO/AIMD-00-21.3.1.
[66] The list of airports we visited is sensitive security information.
Therefore, we do not identify those airports in this report.
[67] TSA classifies the more than 400 commercial airports in the United
States into one of five categories--X, I, II, III, and IV. Generally,
category X airports have the largest number of passenger boardings and
category IV airports have the smallest number.
[68] The Aviation and Transportation Security Act (ATSA) required TSA
to begin allowing commercial airports to apply to transition from a
federal to a private screener workforce. See 49 U.S.C. § 44920.
To support this effort, TSA created the Screening Partnership Program
to allow all commercial airports an opportunity to apply to TSA for
permission to use qualified private screening contractors and private
screeners. Six airports currently participate in the Screening
Partnership Program: Jackson Hole, Kansas City
International, Greater Rochester International, San Francisco
International, Sioux Falls Regional, and Tupelo Regional.
[69] There were 10 SOP modifications that were proposed by multiple
sources. We attributed 9 of these proposed modifications to each of the
relevant sources. TSA did not identify the sources for the 1 remaining
modification.
[70] GAO/AIMD-00-21.3.1.
GAO's Mission:
The Government Accountability Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the performance
and accountability of the federal government for the American people.
GAO examines the use of public funds; evaluates federal programs and
policies; and provides analyses, recommendations, and other assistance
to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its core
values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO's Web site (www.gao.gov). Each weekday, GAO posts
newly released reports, testimony, and correspondence on its Web site.
To have GAO e-mail you a list of newly posted products every afternoon,
go to www.gao.gov and select "Subscribe to Updates."
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. Government Accountability Office, 441 G Street NW, Room LM, Washington, D.C. 20548
To order by phone: Voice: (202) 512-6000; TDD: (202) 512-2537; Fax: (202) 512-6061
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470
Congressional Relations:
Gloria Jarmon, Managing Director, JarmonG@gao.gov, (202) 512-4400
U.S. Government Accountability Office, 441 G Street NW, Room 7125, Washington, D.C. 20548
Public Affairs:
Paul Anderson, Managing Director, AndersonP1@gao.gov, (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149, Washington, D.C. 20548