Defense Logistics
Improving Customer Feedback Program Could Enhance DLA's Delivery of Services
GAO ID: GAO-02-776 September 9, 2002
The Defense Logistics Agency (DLA) performs a critical role in supporting America's military forces worldwide by supplying every consumable item--from food to jet fuel--that the military services need to operate. Although customers at the eight locations GAO visited were satisfied with some aspects of routine service, such as delivery time for routine parts and certain contractor service arrangements, customers also raised a number of points of dissatisfaction, particularly with regard to the detrimental impact of DLA's service on their operations. The agency's approach for obtaining customer service feedback has been of limited usefulness because it lacks a systematic integrated approach for obtaining adequate information on customer service problems. Although DLA has initiatives under way to improve its customer service, there are opportunities to enhance these initiatives to provide for an improved customer feedback program.
GAO-02-776, Defense Logistics: Improving Customer Feedback Program Could Enhance DLA's Delivery of Services
This is the accessible text file for GAO report number GAO-02-776
entitled 'Defense Logistics: Improving Customer Feedback Program Could
Enhance DLA's Delivery of Services' which was released on September 23,
2002.
This text file was formatted by the U.S. General Accounting Office
(GAO) to be accessible to users with visual impairments, as part of a
longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
United States General Accounting Office:
GAO:
Report to Congressional Committees:
September 2002:
Defense Logistics:
Improving Customer Feedback Program Could Enhance DLA's Delivery of
Services:
GAO-02-776:
GAO Highlights:
Highlights of GAO-02-776, a report to the Committee on Armed Services,
U.S. Senate, and the Committee on Armed Services, House of
Representatives.
Why GAO Did This Study:
The Defense Logistics Agency supports America's military forces
worldwide by supplying almost all consumable items--from food to jet
fuel--that the military services need. The Floyd D. Spence National
Defense Authorization Act for Fiscal Year 2001 mandated that GAO conduct
reviews of the agency, including its relationship with its military
service customers. For this report, GAO determined (1) how customers
perceive the quality of the agency's service, (2) how useful its
approaches are for obtaining customer feedback, and (3) whether
opportunities exist to enhance its initiatives for improving customer
service.
What GAO found:
Military service customers at eight judgmentally selected locations GAO
visited had mixed views of the Defense Logistics Agency's services--
satisfied with aspects of routine service, such as the delivery time for
routine parts, but dissatisfied with other areas, such as the
detrimental impact that the agency's service has had on their
operations. Customers cited difficulties, for example, in getting
critical weapons systems parts by the needed time.
The agency's approach for obtaining systematic customer service
feedback is limited. It:
* lacks an integrated method to obtain adequate data on problems;
* does not effectively use surveys or local representatives to obtain
feedback to identify the importance or depth of customers' issues;
* has not adequately defined or identified its customers; and
* does not provide a "single face" to its customers, thus fragmenting
accountability for customer satisfaction.
Agency management acknowledged that the agency has not been customer
focused and has been slow to respond to customer support concerns. The
agency is acting to improve its customer relationships and provide a
single face to its customers. But these initiatives do not fully
address the limitations in its current approaches to obtain feedback and
do not incorporate other soliciting and analytical approaches, such as
those used in the private sector. Research of best practices for
customer satisfaction suggests that multiple approaches and the
integration of feedback data are needed to effectively listen to and
understand customers' perceptions and needs and to take appropriate
actions to meet those needs.
Figure: Defense Logistics Agency's Process for Providing Customers with
Needed Materiel:
[See PDF for image]
This figure is an illustration of the various Defense Logistics
Agency's processes for providing customers with needed materiel, with
the following data depicted:
Customers:
* Requisitions to Agency supply centers;
* Orders from Agency supply centers to Agency distribution depots;
* Material is delivered.
Customers:
* Requisitions to Agency supply centers;
* Orders from Agency supply centers to Manufacturers/Vendors;
* Material is delivered.
Customers:
* Orders to vendors;
* Material is delivered.
[End of figure]
What GAO Recommends:
GAO recommends that the Secretary of Defense direct the Defense
Logistics Agency, along with the military services, as appropriate, to:
* develop a comprehensive customer-feedback plan to better determine
customer needs and solutions to those needs;
* determine who its customers are and their needs; and
* clarify guidance for customer representatives to help create a
"single face" for customers.
DOD generally concurred with GAO's recommendations and agreed that DLA
needs to increase its focus on customer satisfaction.
This is a test for developing Highlights for a GAO report. The full
report, including GAO's objectives, scope, methodology, and analysis is
available at [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-02-776].
For additional information about the report, contact Charles I. Patton,
Jr. (202-512-4412). To provide comments on this test Highlights,
contact Keith Fultz (202-512-3200) or e-mail HighlightsTest@gao.gov.
[End of section]
Contents:
Results in Brief:
Background:
Customer Satisfaction with DLA Services Is Mixed:
Usefulness of Customer Feedback Approaches Has Been Limited:
Initiatives for Achieving a Better Customer Focus Could Be Enhanced
Through Improved Customer Feedback Approaches:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Scope and Methodology:
Appendix II: Comments from the Department of Defense:
Appendix III: GAO Contact and Staff Acknowledgements:
Table:
Table 1: DLA Customer Segments and Illustrative Military Commands:
Figures:
Figure 1: DLA's Supply-Chain Management Process:
Figure 2: Example of Relationship between DODAACs and Army Customer
Activities:
Figure 3: AT&T Customer Feedback and Listening Strategies:
Figure 4: DLA Customer Locations Visited by GAO:
Abbreviations:
DLA: Defense Logistics Agency:
DOD: Department of Defense:
DODAACs: DOD Activity Address Codes:
GAO: General Accounting Office:
[End of section]
United States General Accounting Office:
Washington, DC 20548:
September 9, 2002:
The Honorable Carl Levin:
Chairman:
The Honorable John W. Warner:
Ranking Minority Member:
Committee on Armed Services:
United States Senate:
The Honorable Bob Stump:
Chairman:
The Honorable Ike Skelton:
Ranking Minority Member:
Committee on Armed Services:
House of Representatives:
The Defense Logistics Agency (DLA) performs a critical role in
supporting America's military forces worldwide by supplying almost every
consumable item--from food to jet fuel--that the military services need to
operate. To fulfill this role, the agency oversees a staff of more than
28,000 civilian and military employees who work in all 50 states and 27
foreign countries. It manages approximately 4 million supply items and
processes over 23 million requisitions annually. DLA reported that, in
fiscal year 2001, these operations resulted in sales to the military
services of about $15 billion, of which $12 billion was for supplies.
This report is one in a series mandated under the Floyd D. Spence
National Defense Authorization Act for Fiscal Year 2001. [Footnote 1] The
act directed that we review DLA's efficiency and effectiveness in meeting
customer requirements, the application of best business practices, and
opportunities for improving the agency's operations. As agreed with
your offices, this report focuses on the relationship between DLA and
its military service customers. More specifically, we determined (1)
how customers perceive the quality of service they receive, (2) how
useful the agency's approaches are for obtaining customer service
feedback, and (3) whether there are opportunities to enhance the
agency's initiatives for improving customer service. To address these
objectives, we used a case study approach to obtain customers' views.
Our scope was limited to a judgmentally selected number of materiel
management customers. We visited eight military service customer
locations within the continental United States. The results of our work
at these locations are not projectable to the agency as a whole.
However, studies conducted by the Joint Chiefs of Staff, DLA surveys,
and comments from agency headquarters officials suggest that many of
the issues we raise in this report are systemic in nature. The details
on our objectives, scope, and methodology are in appendix I.
Results in Brief:
Customers at the eight locations we visited expressed both satisfaction
and dissatisfaction with the services the agency provides. While they
were satisfied with some aspects of routine service, such as the
delivery time for routine parts and certain contractor service
arrangements, customers also raised a number of points of
dissatisfaction, particularly with regard to the detrimental impact of
DLA's service on their operations. For example, many customers cited
difficulties in getting critical weapons systems parts in time to meet
their needs, resulting in equipment readiness deficiencies as well as
the cannibalization of other equipment to obtain needed parts. Not
getting accurate and timely information on the status and/or
availability of critical items frustrated other customers. Some of the
difficulties that customers encountered in trying to get parts from DLA
included inaccurate dates from automated systems on the status of
deliveries, difficulty in obtaining additional information on the
availability of parts, and a lack of support from DLA in identifying
alternate vendors or other means to obtain critical items that were
unavailable through DLA.
The agency's approach for obtaining customer service feedback has been
of limited usefulness because it lacks a systematic integrated approach
for obtaining adequate information on customer service problems. For
example, DLA has not adequately defined or identified all of its
customers, leaving it without a sufficient means to initiate and
maintain contact with its many thousands of customers to solicit
meaningful feedback. In addition, although DLA reaches out to selected
customers through satisfaction surveys and the use of local customer
support representatives at various locations, these mechanisms do not
provide the customer feedback that DLA needs to identify the
significance or depth of issues that particularly trouble its
customers. Furthermore, the satisfaction survey response rates are too
low to provide meaningful statistical analyses of customer
satisfaction. Lastly, DLA's current customer support system does not
provide a "single face" to its customers, leaving accountability for
ensuring high customer satisfaction fragmented throughout the agency.
While DLA has initiatives under way to improve its customer service,
there are opportunities to enhance these initiatives to provide for an
improved customer feedback program. DLA management at the highest
levels has acknowledged that the agency has not been as customer
focused as it should be and has been slow to respond to customer-support
concerns, and the agency is taking actions to improve its customer
relationships. However, the agency's initiatives do not completely
address the limitations we identified in its current approaches for
obtaining customer service feedback. For example, while DLA's new
strategy lays
out a means to provide a single face to its customers, it does not
incorporate other approaches, such as those used in the private sector,
to solicit and analyze feedback from those customers. Research on best
practices in the area of customer satisfaction suggests that multiple
approaches are needed to effectively listen to customers about their
perceptions of quality service and needs. Such approaches include
customer service surveys, telephone interviews, and customer complaint
programs. Best practices research also highlights the need to integrate
all data obtained through various customer feedback approaches so that
service providers can completely understand customer perceptions and
take appropriate actions to meet customer needs.
This report includes recommendations for executive action to help DLA
better identify customers' needs and solutions for meeting them through
an integrated customer feedback framework. The Department of Defense
(DOD) generally concurred with our recommendations and agreed that DLA
needs to increase its focus on customer satisfaction. The department's
comments on our report are reprinted in their entirety in appendix II.
Background:
DLA is a DOD Combat Support Agency under the supervision, direction,
authority, and control of the Under Secretary of Defense for
Acquisition, Technology, and Logistics. DLA's mission is to provide its
customers--the military services and federal civilian agencies--with
effective and efficient worldwide logistics support as required.
[Footnote 2] DLA buys and manages a vast number and variety of items
for its customers, including commodities such as energy, food,
clothing, and medical supplies. DLA also buys and distributes hardware
and electronics items used in the maintenance and repair of equipment
and weapons systems.
Customers determine their requirements for materiel and supplies and
submit requisitions to any of four DLA supply centers. [Footnote 3] The
centers then consolidate the requirements and procure the supplies for
their customers. DLA provides its customers with requested supplies in
two ways: some items are delivered directly from a commercial vendor
while other items are stored and distributed through a complex of
worldwide distribution depots that are owned and managed by both DLA
and the military services. DLA refers to this ordering and delivery
process as materiel management or supply-chain management. [Footnote 4]
Figure 1 provides a snapshot of this process.
Figure 1: DLA's Supply-Chain Management Process:
[See PDF for image]
This figure is an illustration of the various Defense Logistics
Agency's processes for providing customers with needed materiel, with
the following data depicted:
Customers:
* Requisitions to Agency supply centers;
* Orders from Agency supply centers to Agency distribution depots;
* Material is delivered.
Customers:
* Requisitions to Agency supply centers;
* Orders from Agency supply centers to Manufacturers/Vendors;
* Material is delivered.
Customers:
* Orders to vendors;
* Material is delivered.
Source: GAO's analysis of DLA's process.
[End of figure]
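The three fulfillment paths shown in figure 1 amount to a simple routing
decision. The following minimal sketch, written in Python with
hypothetical field and function names (it does not represent any actual
DLA system), illustrates how a single requisition would follow one of
the three paths:

from dataclasses import dataclass

@dataclass
class Requisition:
    item: str
    in_depot_stock: bool      # item is stocked in a DLA distribution depot
    prime_vendor_item: bool   # item is covered by a prime vendor arrangement

def fulfillment_path(req: Requisition) -> str:
    """Return which of the three figure 1 paths would fill the requisition."""
    if req.prime_vendor_item:
        # Customer orders directly from the vendor; materiel is delivered.
        return "customer -> vendor -> delivery"
    if req.in_depot_stock:
        # Supply center directs a distribution depot to ship from stock.
        return "customer -> supply center -> distribution depot -> delivery"
    # Otherwise the supply center orders the item from a manufacturer/vendor.
    return "customer -> supply center -> manufacturer/vendor -> delivery"

print(fulfillment_path(Requisition("hydraulic pump", in_depot_stock=True,
                                   prime_vendor_item=False)))

[End of illustrative example]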
Because DLA is the sole supplier for many critical items that can
affect the readiness of the military services, the agency strives to
provide its customers with the most efficient and effective logistics
support. Thus, DLA has adopted a policy to provide customers with "the
right item, at the right time, right place, and for the right price,
every time." In an effort to institutionalize this customer support
concept, DLA has adopted the Balanced Scorecard approach [Footnote 5]
to measure the performance of its logistics operations. The scorecard,
a best business practice used by many private and public organizations,
is intended to measure DLA's performance by integrating financial
measures with other key performance indicators around customers'
perspectives; internal business processes; and organization growth,
learning, and innovation.
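As an illustration only, the four Balanced Scorecard perspectives
described above can be thought of as a simple structure of perspectives
and measures. The Python sketch below uses hypothetical measures; they
are not DLA's actual scorecard metrics:

# Hypothetical scorecard: four perspectives, each with example measures.
balanced_scorecard = {
    "financial": ["cost recovery rate", "logistics cost per requisition"],
    "customer": ["overall satisfaction score", "on-time delivery rate"],
    "internal business processes": ["requisition fill rate",
                                    "logistics response time (days)"],
    "growth, learning, and innovation": ["employee training hours",
                                         "improvement initiatives fielded"],
}

for perspective, measures in balanced_scorecard.items():
    print(f"{perspective}: {', '.join(measures)}")

[End of illustrative example]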
Customer Satisfaction with DLA Services Is Mixed:
Our work showed that customers at the eight locations we visited
expressed satisfaction and dissatisfaction with the services the agency
provides. On the one hand, customers are generally satisfied with DLA's
ability to quickly respond to and deliver requests for routine, high-
demand, in-stock items; provide customers with an easy-to-use ordering
system; and manage an efficient prime vendor program. On the other hand,
customers at some locations were dissatisfied that, among other things,
DLA is unable to obtain less frequently needed, but critical, items and
parts and provide accurate and timely delivery status information. Some
customers did not express an opinion on the overall quality of customer
service.
Customers Generally Satisfied with Routine Services:
One aspect of DLA customer support is to provide customers with
supplies when they need them. Common supplies include vehicle parts
such as pumps, hoses, filters, and tubing. Timeliness, which sometimes
requires deliveries to be made in a day or less, can vary with
customers, depending on the particular item. However, customers at all
locations we visited commented that they were generally satisfied with
DLA's ability to provide most supply items in a time frame that meets
their needs. Customers stated that the majority of the routine,
frequently demanded supplies they order through DLA are delivered
quickly--a view that is also supported by a February 2002 DLA
performance review. The review concluded that the majority of
requisitions (over 85 percent) were filled from existing inventories
within DLA's inventory supply system. Similarly, a 2001 Joint Staff
Combat Support Agency Review Team assessment of DLA's support to the
unified commands indicated that, overall, DLA received outstanding
comments regarding its ability to provide its customers with timely
supplies and services. [Footnote 6]
Customers were also satisfied with the ease of ordering supplies such as
the pumps, hoses, and filters mentioned above. Customers stated that
even though they conduct large amounts of business through DLA, they
had few problems with the ordering process. According to some customers,
this is because ordering is facilitated by effective online systems that
work well and have readily available information.
Another method that DLA uses to ensure customer satisfaction is its
prime vendor program, which DLA instituted to simplify the procurement
and delivery of such items as subsistence and medical or pharmaceutical
supplies that commonly have a short shelf life. The program enables
customers to directly interact with vendors, thereby reducing the
delivery time for these supplies. Two customers of these DLA-managed
prime vendor programs told us the programs effectively reduced delivery
time. For example, at one location, prime vendors reduced the delivery
time of food items from 7 days--the time it took to deliver the items
when purchased from DLA--to 2 days for items purchased directly from
prime vendors. [Footnote 7] The customers we spoke with at a medical
supply unit told us they were so pleased with the prime vendor‘s quick
delivery time that they intend to obtain even more medical supplies
from the prime vendor. They also told us that the prime vendor provides
an additional service in the form of monthly visits to assess customer
satisfaction with its services. The unit pointed out that DLA's
customer support representatives [Footnote 8] are less likely to make
such frequent visits.
Customers Also Expressed Dissatisfaction with Some DLA Services:
Although customers seemed pleased with the way DLA handles routinely
available items, some raised concerns over the agency's ability to
provide critical items such as weapon system parts, timely and accurate
information on the status of ordered items, and proactive management for
high-priority requisitions. A Combat Support Agency Review Team
assessment in 1998 also surfaced similar issues. Additionally, customers
we talked to criticized how DLA manages customer-owned assets in DLA
warehouses.
Difficulties in Obtaining Critical Parts:
As previously noted, DLA strives to provide the timely delivery of all
supplies and parts, including common consumable supply items like food;
clothing and hardware; and critical parts for weapons systems such as
tanks, helicopters, and missiles. Customers at four locations we visited
told us that DLA was not able to deliver some critical items, such as
weapons systems parts, in a timely manner, which significantly affected
their equipment readiness. A number of customers told us that the items
they have difficulty obtaining from DLA are those that are more costly or
infrequently required. At two locations, customers used parts from
existing equipment (known as "parts cannibalization") because they were
unable to obtain the parts they needed. At two other locations,
customers said they grounded aircraft and/or deployed units without
sufficient supplies. Customers at one location experienced an over-6-
month delay in obtaining helicopter parts. As a result, customers at
this location told us that some of the unit's helicopters were unable
to fly their missions. We reported in November 2001 that equipment
cannibalizations adversely affect the military services, resulting in
increased maintenance costs and lowered morale and retention rates
because of the increased workload placed on mechanics. [Footnote 9]
One customer also told us that DLA does not provide adequate
information about items requiring long procurement lead times. The
customer stated that having this information more readily available
would aid customers in making decisions about the types and quantities
of items they should retain to minimize the impacts of long DLA lead
times.
The 1998 Combat Support Agency Review Team's assessment conducted
at military service field activities found that even though DLA met its
overall supply availability goal of 85 percent, the remaining 15
percent of items that were not available "almost certainly includes a
number of items that are critical to the operation of essential weapon
systems." The assessment attributed this shortfall to flaws in DLA's
requirements determination models, which are used to estimate
customers' demands so that DLA can maintain sufficient inventory
quantities.
The study further stated that customers are not satisfied with the
delivery time for items that are not in stock. In fact, in April 2002,
the overall logistics response time was almost 100 days for nonstocked
items--a problem that appears to have persisted for the last several
years, in spite of efforts to reduce this time. Customers at four
locations provided us with examples of back-ordered items having lead
times in excess of 1 year, such as navigational instruments and
airframe parts. When we discussed this issue further with DLA
headquarters officials, they acknowledged that this is a problem and said
they are working on a number of initiatives to address customers' concerns.
Inaccurate and Untimely Status Information:
Customers need accurate and timely information on the status of their
orders so they can plan equipment maintenance schedules to optimize the
readiness of existing equipment. However, customers at six locations
were frustrated with obtaining accurate and timely information from DLA
item managers and the automated systems that are intended to provide
status information on requisitions. Customers at three locations said
that when they tried to directly contact item managers by telephone,
the managers often could not be reached and voice-mail messages were
seldom returned.
Furthermore, military service customers told us that DLA's automated
requisition systems often do not contain accurate status data. Of
particular concern to customers are the expected shipping or delivery
dates posted on the automated systems. These dates show when parts will
be available and allow units to coordinate maintenance schedules. If
the dates are incorrect, units cannot effectively plan to have
equipment available to be repaired. We discussed this concern with DLA
headquarters officials, who told us they are investigating the problem.
Lack of Proactive Management for High-Priority Requisitions:
Another significant concern raised by customers at three locations was
that DLA is not proactive in seeking alternate ways to obtain critical
items that are not immediately available within DLA's supply system. DLA
typically places such items on back order, which leaves customers with
the burden of finding their own means to obtain the necessary items right
away in order to meet mission needs. A number of customers at these three
locations said they felt that DLA, in an effort to be more customer
focused, should do more to seek out alternate sources of supply to
alleviate these high-priority back orders. Some customers also remarked
that the effort required to call vendors and solicit bids is a problem
for their unit because of limited staffing levels and a lack of
contracting capabilities.
In one instance, an aviation supply unit requisitioned a critical part
from DLA that was needed to repair a helicopter unable to fly its
mission. This requisition was placed on back order by DLA, and delivery
was not expected to occur until 8 months later. Because of the critical
nature of the needed part, the unit had to search for other means to
obtain the part sooner. In fact, the unit directly contacted the same
vendor that DLA was working with to fill the back orders and learned
that the vendor had stock on hand and would be able to ship the item
immediately. The unit subsequently purchased the part from that vendor
instead of waiting for it to be available from DLA.
In another instance, a DLA item manager informed an aircraft
maintenance depot customer that $2 million worth of critical parts for a
helicopter engine overhaul program would be placed on back order
because the parts were not available from the DLA vendor. In researching
listings for property to be disposed of, [Footnote 10] the customer
found the required parts--still new and unopened in the manufacturers'
container--available for redistribution or sale within DLA's disposal
system. As a result, the customer initiated a shipping request to
procure the $2 million in helicopter parts for only the cost to ship
the items.
Ineffective Management of DLA Warehouses:
DLA manages all warehousing functions at locations where a DLA
distribution depot [Footnote 11] is collocated with a military
activity. Management functions include, among other things, logging in
and storing equipment. During the course of our interviews, customers
raised concerns over DLA‘s handling of these functions. At three of the
sites we visited, the customers perceived that their assets were not
being serviced and maintained as required. Their concerns centered on
DLA‘s process for recording the ownership of equipment and the
commingling of different customers‘ inventories.
To assign asset ownership, DLA "codes" items in its automated inventory
system. That is, DLA assigns unique codes to differentiate between Army,
Navy, Marine Corps, Air Force, and DLA-owned assets. However, customers
at three locations we visited stated that in numerous instances, DLA
assigned inventory items to the wrong management account, thus creating
the possibility that an item ordered and paid for by one unit or
service could be issued to another. One location we visited had
documented over $1 million worth of items coded into the wrong
management account. Another location identified $621,000 worth of
incorrectly coded items. Before the errors were corrected, neither
activity could access the materials they needed. As a result, both
locations invested unnecessary amounts of time and money in correcting
DLA's error. During our review, we brought this issue to the attention
of DLA officials, who indicated that they would investigate the
problem.
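The ownership-coding problem described above can be illustrated with a
minimal sketch. The Python example below uses hypothetical ownership
codes and stock numbers and does not represent DLA's actual account
structure: each stored item carries an ownership code, and an issue from
the warehouse is checked against that code before materiel is released.

# Hypothetical ownership codes used to differentiate service accounts.
OWNERSHIP_CODES = {"A": "Army", "N": "Navy", "M": "Marine Corps",
                   "F": "Air Force", "D": "DLA"}

inventory = {
    "NSN-1560-00-123-4567": "A",   # item paid for by an Army unit
    "NSN-2840-00-765-4321": "N",   # item paid for by a Navy unit
}

def can_issue(nsn: str, requesting_service: str) -> bool:
    """Allow issue only to the service whose account owns the item."""
    owner = inventory.get(nsn)
    return owner is not None and owner == requesting_service

# An Army-owned item mistakenly coded to the Navy account would fail this
# check for the Army requester -- the kind of miscoding customers described.
for nsn, owner in inventory.items():
    print(f"{nsn} is owned by the {OWNERSHIP_CODES[owner]} account")
print(can_issue("NSN-1560-00-123-4567", "A"))   # True
print(can_issue("NSN-1560-00-123-4567", "N"))   # False

[End of illustrative example]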
Customers also expressed concerns about the commingling of service-owned
assets with DLA-owned assets in DLA-managed warehouses. Like inaccurate
coding, commingling creates a significant risk that items will be
issued by the warehouse to someone other than the purchasing unit. As a
result, the items would not be available to the true owner when needed.
Also, for equipment items that need periodic inspection and repair,
there is a risk the owner will expend resources to perform maintenance
or repairs but not be able to retrieve the item because DLA mistakenly
issued that item to a different requisitioning entity or military
service. As a result, the "true owner" could have needlessly spent
resources on items given to somebody else and also be left with items
still needing repair. When we discussed this with DLA headquarters
officials, they acknowledged the problem and told us that DLA is taking
steps to address it with a National Inventory Management Strategy, which
is part of DLA's goal to better manage its supply chain effectiveness.
Usefulness of Customer Feedback Approaches Has Been Limited:
DLA's approach for obtaining customer service feedback has been of
limited usefulness because it lacks a systematic integrated approach for
obtaining adequate information on customer service problems. As a
result, the agency does not have the information necessary to identify
its customers' concerns, and more importantly, to initiate actions for
improving customer service, thereby placing at risk DLA's ability to
meet its overall goal of providing quality service to the war fighter.
In particular, DLA has not (1) adequately identified all of its
customers, (2) effectively solicited customer feedback, and (3) clearly
identified those accountable for ensuring customer satisfaction.
DLA Has Not Adequately Identified All of Its Customers:
Obtaining good, meaningful feedback from customers means knowing who
those customers are. DLA broadly defines a "customer" as someone who
purchases items or directly causes products to be bought, but DLA has
not identified who those individuals are among the multitude of
organizations it deals with. DLA's current portfolio of customers is
identified by approximately 49,000 address codes, known as DOD Activity
Address Codes (DODAACs). [Footnote 12] The military services assign
DODAACs to various organizations and activities for ordering supplies.
However, these address codes, a legacy of a system built in the 1960s,
contain little information about the customer's organization beyond a
physical address. The codes are not associated with a meaningful customer
contact point or, in many cases, with a specific organization that DLA
can use as a basis for interacting with the customers using its services.
As a result, DLA has no effective process to initiate and maintain
contact with its customers for soliciting feedback. Without such a
customer interface process, DLA has no routine means to understand
customers‘ needs and to take appropriate corrective actions to address
those needs.
Our efforts to identify and interview DLA customers were hindered
because a single DODAAC does not necessarily equate to a single
customer. In many cases we found that one organization interacts with
DLA using a number of DODAACs. For example, DLA's customer database
shows over 580 DODAACs for Fort Bragg. However, according to DLA and
Army officials, the number of Fort Bragg customer organizations
interacting with DLA for these same DODAACs is smaller. This is because,
in part, central order points at Fort Bragg are responsible for
submitting and tracking orders for a number of smaller organizations,
thereby covering multiple DODAACs. In addition, each of
these organizations also uses multiple DODAACs to differentiate between
various types of supply items, such as repair parts and construction
materials. For example, one DODAAC is used for ordering numerous
repair parts while another is used for ordering construction materials.
One of these customer organizations at Fort Bragg is the Division
Support Command of the 82nd Airborne Division, which interacts with DLA
for supplies ordered using 159 different DODAACs. Thus, many DODAACs
could represent only one customer. Figure 2 illustrates the relationship
between the DODAACs used by DLA to define customers and the Division
Support Command.
Figure 2: Example of Relationship between DODAACs and Army Customer
Activities:
[See PDF for image]
This figure is an illustration of the relationship between DODAACs and
Army Customer activities. The following data is depicted:
DODAACs:
1. W36B2U:
Name: PR LT MNT CO MSB A;
2. W36BY3:
Name: XR E CO 782D DX;
3. W36BYM:
Name: PR 505 INF 1 BN HHC;
...
159. W36NOT:
Name: SR 82 AVN D CO ASL.
This is equivalent to:
Army Customer:
Division Support Command, 82nd Airborne Division.
Source: GAO's analysis of DLA- and Army-provided data.
[End of figure]
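To make the many-to-one relationship in figure 2 concrete, the following
minimal Python sketch groups address codes by the customer organization
they represent. The mapping itself is hypothetical (the first three codes
are those shown in figure 2; the fourth is invented for illustration),
since DLA's address codes today carry little more than a physical
address:

from collections import defaultdict

# Hypothetical lookup a service might maintain to tie codes to organizations.
dodaac_to_organization = {
    "W36B2U": "Division Support Command, 82nd Airborne Division",
    "W36BY3": "Division Support Command, 82nd Airborne Division",
    "W36BYM": "Division Support Command, 82nd Airborne Division",
    "W90XYZ": "Another Fort Bragg customer organization (invented)",
}

# Collapse many DODAACs into the single organization each one represents.
organizations = defaultdict(list)
for dodaac, org in dodaac_to_organization.items():
    organizations[org].append(dodaac)

for org, codes in organizations.items():
    print(f"{org}: {len(codes)} DODAACs ({', '.join(sorted(codes))})")

[End of illustrative example]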
DLA Does Not Adequately Solicit Customer Feedback:
A principal aspect of DLA's strategic plan is for managers to focus on
customers' needs and improve customer satisfaction by listening to
customers about the quality of service they receive--both good and bad--
and making changes necessary to enhance that service. DLA uses
customer surveys, customer support representatives, and focus groups to
obtain feedback from its customers on their level of satisfaction with
the services DLA provides. For example, DLA conducts quarterly mail-out
surveys to measure overall customer satisfaction levels. It also places
customer support representatives at selected customer organizations to
assist customers in planning, implementing new supply initiatives, and
solving problems. However, we noted several weaknesses in these
methods. Specifically, (1) the satisfaction survey response rates are
too low to provide meaningful statistical analyses of customer
satisfaction, (2) the survey instrument does not provide a sufficient
means to understand why customers may be less than satisfied, and (3)
customer support representatives are more reactive than proactive in
soliciting customer feedback.
Quarterly Mail-out Surveys Have Low Response Rates:
The quarterly mail-out surveys that DLA uses to measure customer
satisfaction elicit a relatively low number of responses from DLA
customers, significantly limiting their usefulness in soliciting customer
feedback. The survey response rates were too low to provide meaningful
statistical analyses of customer satisfaction. The response rate for the
33,000 surveys that DLA mailed out in fiscal year 2001 averaged around
23 percent, and only about 20 percent for the August 2001 cycle (the
latest cycle for which results have been made available). As such, fewer
than one quarter of DLA's customers are providing input on how they
perceive DLA support and what problems they are experiencing that may
need to be addressed.
Large survey organizations like Gallup attempt to get response rates of
between 60 and 70 percent for their mail surveys. Experts on customer
satisfaction measurement have stated that although survey response rates
are never 100 percent, an organization should strive to get its rate as
close as possible to that number. [Footnote 13] They suggest that
ideally, organizations can obtain response rates of over 70 percent.
The experts also noted that organizations conducting surveys commonly
make the mistake of assuming that if a final sample size is large, the
response rate is unimportant. This leads organizations to accept
response rates well under 25 percent. However, such low rates can lead
to serious biases in the data.
DLA's inadequate understanding of who its customers are likely
contributes to its problem with low response rates. The surveys are
mailed to addresses associated with the DODAACs, and each survey includes
a message asking that it be provided to the person most familiar with
requisitioning and ordering supplies. However, during the fiscal year
2001 survey period, over 2,200 of the 33,000 surveys mailed (about 7
percent) were returned to DLA as "undeliverable" or were delivered to
people who were no longer customers. Furthermore, another 128 respondents
noted in their survey returns that they do not consider themselves to be
customers. DLA officials stated that the undeliverable rate increases
when many units move to other locations or when service officials do not
update DODAACs for changed addresses.
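A simple calculation using the figures reported above (about 33,000
surveys mailed, a roughly 23 percent average response rate, and about
2,200 undeliverable surveys) shows that removing undeliverable surveys
from the base does little to close the gap with the 60 to 70 percent
response rates that large survey organizations target. The Python sketch
below is illustrative only:

# Figures from the report, rounded.
surveys_mailed = 33_000
response_rate = 0.23
undeliverable = 2_200

responses = surveys_mailed * response_rate            # about 7,590 responses
undeliverable_share = undeliverable / surveys_mailed  # about 7 percent

# Even excluding every undeliverable survey from the base, the effective
# response rate barely moves and stays far below the 60-70 percent target.
adjusted_rate = responses / (surveys_mailed - undeliverable)

print(f"responses: {responses:,.0f}")
print(f"undeliverable share: {undeliverable_share:.1%}")
print(f"adjusted response rate: {adjusted_rate:.1%}")

[End of illustrative example]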
Surveys Are Insufficient for Identifying Causes of Customer
Dissatisfaction:
The quarterly mail-out survey asks customers to rate their overall
satisfaction with DLA products and services, along with specific
aspects of support, such as providing products in time to meet needs
and effectively keeping customers informed. While these surveys provide
general aggregate information on the levels of customer satisfaction,
they do not provide the means to understand why customers may be less
than satisfied. For example, a number of customers we interviewed voiced
concern over the fact that status dates for back-ordered items were
either sometimes wrong or varied between different inventory systems.
The survey might indicate only an overall low level of satisfaction in
the area of keeping customers informed but would not provide a reason.
If this problem were systemic throughout DLA, there would be less of an
opportunity to take immediate corrective action. Most recently, in June
1999, DLA supplemented a quarterly survey with two focus groups
targeted at soliciting specific customer feedback on DLA's communication
efforts. While DLA determined the focus groups to be an excellent
feedback mechanism, the sample size was too small for DLA to run a
statistical analysis of the data obtained, and the topics for
discussion were limited to customer communication.
DLA officials stated that they use a number of methods to obtain
customer feedback. These include analyses of survey results, focus
groups, and structured interviews. However, they acknowledged that the
usefulness of these methods is somewhat limited owing either to low
response rates; limited discussion topics; small sample sizes; or, in
the case of structured interviews, the fact that the most recent ones
were conducted in 1997.
DLA's own survey results also indicate the flaws in its survey
techniques. For example, DLA's fiscal year 2000 survey results show that
customers rated as "low satisfaction" their ability to reach the right
DLA person to meet their needs. However, the survey noted that "due to
its high importance to customers and the myriad of interpretations of
'less than satisfied' responses to this attribute, more information
will need to be gathered" to determine what issues are preventing
customers from reaching the right person. This indicates that DLA's
survey was not adequate to get at the underlying causes of customer
dissatisfaction. In fact, with respect to low satisfaction ratings, the
survey reports for fiscal years 2000 and 2001 recommended that DLA
conduct one-on-one interviews to identify why customers were not
satisfied with DLA services.
Another difficulty that DLA encounters in using mail-out satisfaction
surveys to identify customer problems is that the surveys are designed
to protect the confidentiality of the respondents, which limits DLA's
ability to follow up with customers for adequate feedback. As a result,
there is no means to follow up with customers expressing low
satisfaction levels to identify specific problems or to determine what,
if any, corrective actions are needed. During our meetings with DLA
customers, we were able to identify specific problems only by engaging
in a dialogue with them about their experiences. In conducting these in-
depth discussions on aspects of the supply process such as placing
orders, obtaining the status of outstanding requisitions, receiving
supply items, and obtaining customer service, we were able to ask
follow-up questions to determine exactly what problems they were
experiencing in some of these areas.
Customer Support Representatives Not Proactive in Soliciting Feedback:
Another method DLA uses to facilitate customer service is the placement
of customer support representatives at key customer locations. The use
of these on-site representatives has the potential to provide DLA with
a good link to its customers. In fact, some customers at three
locations we visited specifically noted their satisfaction with the
assistance the representatives provided. However, according to DLA
headquarters officials, customer support representatives have been more
reactive than proactive in that they help customers resolve only specific
problems or assist in implementing new initiatives as requested. DLA
headquarters
officials told us that the representatives neither proactively solicit
feedback on a regular basis from the multitude of customers in their
geographical area nor reach out to identify the types of problems
customers are experiencing.
Furthermore, not all representatives are in contact with all DLA
customers at their assigned locations. For example, at one location we
visited, the representative was working closely with a specific
customer organization. According to officials at this location, the
representative has been very helpful to them in resolving supply
problems and implementing new initiatives. However, a number of other
customers at this location said they do not use the customer support
representative at all because they use other options, such as call
centers. Some customers noted that they were not even aware that there
was such a representative in the area. The Combat Support Agency Review
Team's assessment in 1998 also found that some customers were unaware
that customer support representatives even existed. The study
identified a need for DLA to improve its interaction with customers and
suggested that DLA "get out more and visit the customers" to identify
and correct problems. Headquarters officials told us they assign
customer support representatives to DLA's larger customers, which
account for about 5 percent of the overall customer population and 80
percent of the agency‘s business. Officials also stated they recognize
that the customer support representative program is not as effective as
it should be. As a result, the agency currently has initiatives under
way to (1) provide more customer support representatives and training,
(2) standardize the representatives‘ roles, and (3) make the
representatives more proactive in serving customers.
Current Customer Feedback Framework Is Too Fragmented and Lacks
Accountability:
An important part of providing effective customer service is simplifying
customers‘ access to the organization, such as through centralized
contact points. In addition, best practices research emphasizes the
need for a single, centralized management framework for receiving
customer feedback so that all information about the customers can be
linked together to facilitate a more complete knowledge of the customer.
However, DLA does not provide a "single face" to its customers for
addressing their issues. To obtain assistance, customers sometimes need
to navigate through a number of different channels, none of which are
interconnected. This process causes confusion among customers and
fragments accountability for customer satisfaction throughout DLA.
When customers order multiple types of supply items, they must use many
channels, depending on the type of item, to obtain assistance from DLA.
However, as DLA has noted, there is no single DLA contact point
responsible for resolving customers‘ problems for all the items they
requisition. For example, the supply centers are responsible for
managing specific weapons system parts or types of commodities. As
such, problem resolution is performed through each supply center,
depending on the type of item the customer is ordering. To obtain
assistance with requisitions, customers must contact the appropriate
supply center, generally through its customer "call center," which is
an activity dedicated to providing customer assistance for the particular
items. In addition, Emergency Supply Operation Centers are available at
each supply center for high priority items. Also, customers can contact
individual item managers at the supply centers to resolve problems with
their orders. At three locations, some customers told us they are
sometimes confused over whom to call and reported difficulties with
getting in touch with the right person to resolve their problems.
Customers at four locations were also frustrated with the quality of
assistance provided by DLA, noting that while some of the DLA
representatives were helpful, others were not able to give them the
assistance they needed.
To illustrate further, one aviation supply unit we visited had high-
priority, back-ordered requisitions from each of the three DLA supply
centers in Richmond, Virginia; Columbus, Ohio; and Philadelphia,
Pennsylvania. As a result of these back orders, some of the unit's
aircraft were unable to operate because of maintenance needs. In order
to get assistance with these requisitions, either to request help in
expediting the order or to obtain better status information, unit
supply personnel needed to contact the call centers or the Emergency
Supply Operation Centers at each of the supply centers, depending on
the item. If there were a single DLA point of contact, the unit could
go to that contact for assistance with all the items on its list of
priority requisitions.
Another problem with DLA's having many separate lines of communication
with its customers is that meaningful information about those customers
is not collected centrally for analysis. For example, each of the
supply centers accumulates vital information about customer
satisfaction through its contacts with customers. For instance,
customers express specific problems they are having when getting help
through the call centers. They might also convey information on
problems they are having to various supply center teams conducting on-
site visits for purposes of training or other liaison activities.
However, this information is neither shared between the supply centers
nor provided to the DLA corporate level for a global review. As a
result, this information cannot be analyzed to identify systemic
problems, and there is no single point of accountability for ensuring
that a given customer's concerns are being addressed.
Initiatives for Achieving a Better Customer Focus Could Be Enhanced
Through Improved Customer Feedback Approaches:
While DLA has initiatives under way to improve its customer service,
there are opportunities to enhance these initiatives to provide for an
improved customer feedback program. DLA has recognized that it is not
as customer focused as it should be and is developing a new strategy to
improve its relationship with its customers. This new strategy,
referred to as the Customer Relationship Management initiative, lays
out an improved approach to customer service that creates a single DLA
face to customers and focuses on customer segments to develop a better
understanding of the customer. However, DLA's initiatives do not
completely address the limitations we identified in its current
approaches for obtaining customer service feedback, such as by
improving the way that it solicits feedback from individual customers.
Research on best practices for customer service shows that successful
organizations utilize multiple approaches to listen to their customers.
These approaches include transaction surveys, customer interviews, and
complaint programs that provide qualitative and quantitative data. The
research also points to a need for centrally integrating all customer
feedback so that managers can achieve a better understanding of
customers' perceptions and needs.
DLA Is Developing a Strategy to Improve the Relationship with Its
Customers:
In February 2002, DLA's Deputy Director stated that DLA "has been
internally focused rather than customer focused" and that its culture
has been to talk to customers only "when problems arose." To address
this problem, DLA has begun a multimillion-dollar initiative aimed at
focusing its business operations to better deliver important customer
outcomes and actively managing relationships with its customers. This
effort, known as Customer Relationship Management, is being developed
in conjunction with DLA's broader strategic planning initiatives such
as Business Systems Modernization and implementation of the Balanced
Scorecard approach to performance measurement. To implement Customer
Relationship Management, DLA expects to spend about $73 million during
fiscal years 2002-2008. According to DLA officials, when this effort is
complete, DLA expects its customer service program to be on the same
level as those in place at leading organizations in the private sector.
The concept of the Customer Relationship Management initiative is a step
in the right direction toward significantly improving DLA's relationship
with its customers. For example, part of the management initiative is a
plan to radically change the focus of its business practices and
improve its interactions with customers. To do this, DLA is grouping
customers by business segment, collaborating with these segments to
achieve a better understanding of their needs, and tailoring logistics
programs to the unique needs of the segments. Examples of business
segments include deployable combat forces, industrial facilities, and
training activities. Table 1 illustrates the proposed customer
segments, which will include major military service commands.
Table 1: DLA Customer Segments and Illustrative Military Commands:
Segment: Deployed;
Army: Commanders-in-chief by geographic area of responsibility (e.g.,
U.S. European Command, U.S. Pacific Command, U.S. Central Command);
Navy: [Empty];
Marine Corps: [Empty];
Air Force: [Empty].
Segment: Deployable;
Army: Forces Command;
Navy: Commander-in-Chief, U.S. Atlantic Fleet;
Marine Corps: II Marine Expeditionary Force;
Air Force: Air Combat Command.
Segment: Training;
Army: Training and Doctrine Command;
Navy: Chief of Naval Education and Training;
Marine Corps: Marine Corps Combat Development Command;
Air Force: Air Education and Training Command.
Segment: Industrial;
Army: Army Materiel Command;
Navy: Naval Air Systems Command;
Marine Corps: Marine Corps Materiel Command;
Air Force: Air Force Materiel Command.
Segment: Other;
Army: [A];
Navy: Naval Supply Systems Command;
Marine Corps: [A];
Air Force: [A].
[A] No Army, Marine Corps, or Air Force commands designated by DLA for
this segment.
Source: DLA.
[End of table]
In an effort to streamline the numerous customer-reporting channels
currently in place, DLA plans to establish a multilevel-focused account
manager structure and increase accountability. DLA hopes that this
effort will reduce the number of channels a customer must navigate to
obtain assistance and focus accountability for customer satisfaction on
account managers rather than on item managers. DLA plans to establish
account managers at three levels:
* National Account Managers are to collaborate with military services
at the departmental level for demand planning and problem resolution.
* Customer Account Managers are to be the "single DLA face" to each
customer segment. These managers are to collaborate with executives at
the segment level to develop service-level agreements that outline
customer segment needs and to resolve issues at the segment level.
* Customer Support Representatives are working-level DLA personnel who,
on a day-to-day basis, work with specific customers within a segment,
providing on-site assistance as appropriate.
In addition, DLA plans to place its existing customer contact points,
such as call centers and Emergency Supply Operation Centers, under the
control of account managers instead of the supply centers.
DLA Improvement Efforts Do Not Include New Approaches to Obtain
Customer Feedback:
Although the Customer Relationship Management initiative is conceptually
sound, the program's implementation actions do not completely address
the limitations we identified in DLA's current practices. For example,
the new strategy does not lay out milestones for implementing the
program or specific improvements in how DLA solicits detailed feedback
from its individual customers on their perceptions of service and the
specific problems they are experiencing. The strategy also does not
include a process for developing actions in response to issues that
customers have identified and involving customers in that process.
Furthermore, even though the plans include making account managers
responsible for collecting customer feedback and exploring the idea of
using Web-based tools to obtain customer feedback, they do not lay out
specific tools or processes to accomplish this.
To further illustrate, under the new Customer Relationship Management
plan, an account manager would be created with responsibility for all
customers within the U.S. Army Forces Command, which represents the
Army's deployable forces segment. (See table 1.) This manager would
work with the Army's customer representatives to identify customers'
needs at the Forces Command level and reach formal agreements on
service. However, there is no revised set of tools in the plan for
collecting detailed feedback on an ongoing basis from the individual
customer organizations representing the more than 6,600 DODAACs
(address codes that represent mailboxes, locations, or people) in the
Forces Command.
Furthermore, the improvement initiatives do not provide for actions to
link military service customer DODAACs to specific accountable
organizations. Under the Customer Relationship Management program,
DLA has developed a customer profile database that links DODAACs to
major military commands, such as the U.S. Army Forces Command. It also
plans to link each DODAAC to a business segment through this database
sometime in the future. However, as noted previously, the major command
and business segment levels comprise numerous DODAACs. Interaction
with customers to get detailed feedback on their level of satisfaction
requires better identification of customer organizations beyond the data
currently associated with a DODAAC.
Best Practice Organizations Use Multiple Approaches:
Studies examining best practices in the area of customer service have
found that leading organizations use multiple approaches to listen to
their customers' concerns. [Footnote 14] In particular, a 2001 Mid-
American Journal of Business study pointed out that best practice
companies [Footnote 15] use multiple tools to gather these data rather
than relying on a single method such as a customer survey, which might
be too narrow in scope and limited in its application to fully capture
customers' concerns. [Footnote 16] The 2001 Mid-American Journal study
and others concluded that the best approach for obtaining customer
feedback is to use a broad measurement system with more than one
listening tool to capture customers' input from many different
perspectives.
Using different tools alone is not enough to effectively obtain customer
feedback. Centrally linking the feedback obtained is also important.
Best practices research shows that information obtained through various
methods needs to be integrated in order to gain a more complete
understanding of customers. Thus, by linking all the various feedback
tools in a standard and consistent manner, the organization would have
better diagnostic information to guide improvement efforts.
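As a minimal illustration of what such integration might look like, the
Python sketch below links hypothetical survey scores and complaint
records by a shared service attribute so that both can be read together;
the data and attribute names are invented for illustration and do not
come from DLA or the companies cited:

# Hypothetical survey scores (quantitative) and complaints (qualitative),
# tagged with the same service attributes so they can be integrated.
survey_scores = {
    "delivery timeliness": 2.4,
    "order status accuracy": 2.1,
    "ease of ordering": 4.3,
}
complaints = [
    {"attribute": "order status accuracy", "text": "shipping date wrong"},
    {"attribute": "delivery timeliness", "text": "back order took 8 months"},
    {"attribute": "order status accuracy", "text": "item manager unreachable"},
]

# Integrate: attach a complaint count to the survey attribute it describes.
integrated = {
    attr: {"score": score,
           "complaints": sum(1 for c in complaints if c["attribute"] == attr)}
    for attr, score in survey_scores.items()
}

# Reading the lowest-scored attributes alongside their complaints points
# managers toward the specific problems behind the low scores.
for attr, data in sorted(integrated.items(), key=lambda kv: kv[1]["score"]):
    print(f"{attr}: score {data['score']}, complaints {data['complaints']}")

[End of illustrative example]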
On the basis of our discussions with private sector experts and our
reviews of literature on customer service best practices, we found that
leading organizations such as AT&T WorldNet Services, U.S. West, and
Eastman Chemical combine quantitative and qualitative listening tools to
obtain customer feedback and then centrally integrate the data in one
location.
Quantitative tools include such methods as customer satisfaction surveys
and customer complaints, which can provide measurable data for use in
performance scorecards. Qualitative tools include focus groups,
personal interviews, and observation and are used by organizations to
provide a more in-depth understanding of their customers. According to
the research, not all tools are appropriate for all organizations, so
careful selection is important.
Examples of "listening" tools used by the best practice organizations
we identified through our reviews of best practice studies follow:
* Customer satisfaction surveys. Research shows that most major
organizations use listening tools such as relational and critical
incident surveys [Footnote 17] to periodically capture customers‘
overall perceptions about their organization and to measure
satisfaction with specific transactions soon after they occur. These
surveys can be administered through the mail, such as with DLA‘s
quarterly satisfaction survey; by telephone; in person; or
electronically via the Internet. However, feedback from mail and
electronic-based surveys can be more limited than that obtained through
other methods because there is no opportunity to probe the respondent
for better, more-detailed information. AT&T WorldNet Services, U.S.
West, Eastman Chemical, and Hewlett-Packard are among the leading
organizations that are turning to critical incident surveys in
conjunction with other tools to learn more about customers‘ perceptions.
Critical incident surveys are becoming more popular in the private
sector because they provide information related to specific processes,
which can be used to make specific improvements.
* Customer complaints. Gathering complaint data is a standard practice
for most companies. All aspects of the customer complaint process are
measured and tracked through this mechanism. Information collected and
analyzed from this approach includes the nature of the complaint, speed
of resolution, and customer satisfaction with the resolution. Eastman
Chemical, for example, uses customer complaint data in conjunction with
a survey tool to obtain customer feedback. It organizes the complaint
data along the same attributes as the survey data.
* Benchmark surveys. Benchmark surveys gather customer perceptions of
performance across the entire market, usually focusing on the top
competitors in an industry. This allows a company to examine its
customer-perceived
strengths and weaknesses in the overall marketplace. Best practices
companies, such as Sun Microsystems, use this information primarily in
the strategic planning process to identify their competitive advantage
in the marketplace and to identify opportunities and shortfalls in the
industry. While continuous improvement may be a result of this listening
tool, the real value, according to the research in this area, comes from
breakthrough thinking to gain a sustainable advantage.
* Won-lost and why surveys. "Lost" customers--those who stop placing
orders with a company--can be an excellent source of valuable
information. Some companies, such as Eastman Chemical, employ "won-lost
and why" surveys to measure actual customer behavior and the rationale
behind it. These surveys are administered on an ongoing basis, soon
after customers are "won" or "lost" (i.e., decide to drop a company),
and the company probes the customer as to why its business was won or
lost. For companies with a large number of customers, this tool may be
implemented as a survey.
* Focus groups. Organizations use focus groups to get better information
from customers than survey results provide. In these groups, customers
are probed about why they answered survey questions the way they did.
DLA has used focus groups to get detailed feedback on a single topic,
but as noted previously, the number of individuals making up the focus
groups was too small to draw agency-wide conclusions. AT&T Universal
Card Services (now part of Citigroup) conducts multiple focus groups
per year to discuss a wide range of topics. In these forums, both
satisfied and dissatisfied customers discuss the company‘s service,
products, and processes.
* Customer interviews. Conducting interviews with customers can provide
a way to get very detailed information about their specific needs and
problems. Like focus groups, this tool is used by leading customer
service organizations to probe survey respondents as to why they
answered survey questions a certain way. U.S. West identifies
dissatisfied customers from its surveys and follows up with them to
determine what problems they are having and how they can be fixed.
* Customer observation. In performing observations, organizations send
teams to visit customers where they observe how those customers interact
on a daily basis with the organization. This tool complements verbal
information obtained through customer interviews and focus groups in
that it confirms and deepens the understanding of that information.
* Management listening. Using this tool, managers listen in on actual
customer calls to the organization to learn first-hand about what
customers are experiencing. In an example of this technique, one best
practice company encourages all of its managers, including the chief
executive officer, to listen to customer calls.
* Customer service representatives. Best practice organizations collect
valuable information from employees who are in continuous, direct
contact with customers. Often, these representatives are among the
first to recognize customer problems. As
mentioned previously, DLA uses customer support representatives to
obtain feedback. However, according to DLA officials, it does not
currently have enough representatives assigned to its customers, and the
representatives generally are not proactive in obtaining customer
feedback. Furthermore, while DLA‘s representatives provide headquarters
with monthly written reports on customer support, best practice
organizations have taken this a step further by using electronic
feedback mechanisms. Research shows that best practice organizations
have their customer service representatives gather ideas, perceptions,
and opinions from customers and report them electronically through a
corporate intranet system. These data are then coded and distributed
throughout the organization, thereby centrally integrating the feedback
information.
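The following notional sketch (in Python; the keyword-to-category
coding scheme and the example entry are hypothetical and are not drawn
from any actual DLA or best practice company system) illustrates the
basic idea of coding a representative's electronically submitted
observation into standard categories so it can be integrated with
feedback gathered by other tools:

# Hypothetical keyword-to-category coding scheme for representatives' free-text observations.
CATEGORIES = {
    "backorder": "supply availability",
    "requisition": "order processing",
    "delivery": "delivery time",
    "price": "pricing",
}

def code_report(free_text: str) -> list:
    """Assign a free-text observation to standard categories by simple keyword matching."""
    text = free_text.lower()
    labels = sorted({label for keyword, label in CATEGORIES.items() if keyword in text})
    return labels or ["uncategorized"]

# Example entry a representative might submit through an intranet reporting form.
entry = "Customer reports long backorder waits and unclear requisition status."
print(code_report(entry))    # ['order processing', 'supply availability']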
Figure 3 shows an example of how multiple approaches can be linked, as
illustrated by AT&T Universal Card Services' use of a "Customer
Listening Post" team.
Figure 3: AT&T Customer Feedback and Listening Strategies:
[See PDF for image]
This figure is an illustration of AT&T Customer Feedback and Listening
Strategies as follows:
Scenario One:
Customer Feedback and Listening Strategies:
Level one: Customer Impacting Business Processes:
Level two: Solicit Feedback - Strategies;
Level three:
* Customer Expectation Research;
* Performance Research;
Level four: Organizational Improvement Activities: Modification of
Existing Policies/Procedures and Standards;
Level five: Customer Listening Post Team;
Level six: Short term/Long term.
Level four feeds information back to level one to allow the process to
cycle.
Scenario Two:
Customer Feedback and Listening Strategies:
Level one: Customer Impacting Business Processes:
Level two: Operations Feedback - Strategies;
Level three:
* Direct Customer Feedback;
* Process Management;
Level four: Organizational Improvement Activities: Modification of
Existing Policies/Procedures and Standards;
Level five: Customer Listening Post Team;
Level six: Short term/Long term.
Level four feeds information back to level one to allow the process to
cycle.
Scenario one and two occur simultaneously.
Note: AT&T Universal Card Services integrates methods such as monthly
satisfaction surveys, telephone surveys/interviews, data mining (from
customer commendations, letters, and phone calls), annual focus groups,
and monthly sessions in which managers listen in on customer calls.
Source: Best Practices, LLC.
[End of figure]
Conclusions:
While high-quality service to its customers is an overall goal, DLA
lacks the information necessary to systematically assess the quality of
the service it provides. Indications are that customers, while
satisfied in some areas, are dissatisfied in others.
The failure to address areas of dissatisfaction means opportunities to
improve supply readiness are being missed. DLA is in the process of
developing a program to improve its customer service relationships, but
it currently does not have in place an effective mechanism that
systematically gathers and integrates information on customers' views
of its service so that solutions can be identified to better meet their
needs. The agency's current practices do not always surface these
concerns or, more important, provide information on why they exist or
how they can be corrected. To its credit, DLA is undertaking a number
of initiatives
to improve the effectiveness of its customer relationship improvement
efforts. However, these initiatives do not completely address the
limitations of its current approaches for obtaining customer feedback
because DLA (1) has not yet fully determined who its customers are or
how best to serve their needs; (2) has not established the means to
determine the underlying causes for customer dissatisfaction in order
to fully achieve its strategic goal of providing customers with the
most efficient and effective worldwide logistics support; and (3) lacks
a centralized, customer-driven integrated framework in which to solicit
feedback from its customers. Also, customer mail-out surveys are
insufficient for identifying the causes of customer dissatisfaction.
Finally, DLA is not yet making full use of best practice techniques, as
discussed in this report, to identify and address customers‘ concerns.
Recommendations for Executive Action:
To improve DLA‘s ability to determine its customers‘ needs, identify
solutions for better meeting those needs, improve the supply readiness
of military units, and improve the efficiency and effectiveness of depot
maintenance repair activities, we recommend that the Secretary of
Defense direct the Under Secretary of Defense for Acquisition,
Technology, and Logistics to require the Director of DLA, as part of the
agency‘s customer relationship improvement efforts, to take the
following actions:
* Develop a comprehensive plan for obtaining customer feedback that
includes but is not limited to the following actions:
- Work with the military services to arrive at a mutually agreed
determination of the military organizations that function as DLA
"customers." In doing so, both DLA and the services should identify
officials accountable for providing and receiving customer feedback.
- Develop a customer feedback program that uses a variety of approaches
such as those depicted in the best practices research discussed in this
report. In developing this program, pilot tests could be used to
determine which approaches meet agency and customer needs.
- Establish milestones for implementing the customer feedback program
and for identifying the office accountable for its implementation.
- Integrate all customer feedback into an overall assessment to provide
managers with a better understanding of customers‘ perceptions and
concerns.
- Establish a process for developing actions in response to issues that
are identified from the customer feedback program and involve customers
in that process.
- Establish processes for providing customers with information on
actions that are being taken to address customer feedback issues.
* Improve the usefulness of its customer survey instruments by
identifying ways to improve customer response rates, such as the use of
effective follow-up procedures.
* Clarify guidance for customer support representatives to ensure that
they are responsible for routinely contacting customers to obtain
customer feedback.
We also recommend that the Secretary of Defense direct the Secretaries
of the Army, Navy, and Air Force to identify specific organizations
that will be responsible for working with DLA in establishing a
mutually agreed determination of those activities, organizations, and
individuals that function as DLA "customers" and for working with DLA
as it implements its customer feedback program.
Agency Comments and Our Evaluation:
The Department of Defense provided written comments on a draft of this
report, which are reprinted in their entirety in appendix II. DOD
generally concurred with our recommendations and agreed that DLA needs
to increase its focus on customer satisfaction. The department also
noted that DLA is taking or is planning to take a number of actions to
respond to our recommendations. For example, under DLA‘s Customer
Relationship Management program, DLA National Account Managers are to
identify customer organizations in concert with their military service
negotiating partners. In addition, DOD intends to use its Defense
Logistics Executive Board as a forum to obtain input from each of the
services on the specific organizations that will be responsible for
working with DLA on customer feedback issues. Furthermore, DLA intends
to better integrate customer feedback into an overall assessment and to
improve its processes for providing customers with information on
actions that are being taken to address customers‘ issues.
DOD did not agree with our recommended action that DLA develop a
customer feedback program that uses a variety of approaches, such as
those depicted in the best practices research discussed in this report.
DOD stated that DLA‘s use of feedback mechanisms should not be dictated
by the best practices research we discussed. It further stated that DLA
should continue to have the latitude to use its customer satisfaction
measurement resources in the most efficient manner. Our discussion of
best practice approaches was only intended to illustrate various
techniques that some best practices organizations use to improve the
ways they collect and analyze customer feedback. It was not our intent
to prescribe specific approaches that DLA should use. Rather, we
included examples of some of the approaches to best illustrate the
concept of using multiple and integrated customer feedback approaches
to better listen to customers‘ opinions and concerns. We continue to
believe that DLA‘s customer feedback program could benefit from
studying best practice organizations, such as those discussed in this
report as well as others, to identify additional feedback approaches
that could be pilot-tested and implemented to help strengthen its
current customer feedback efforts.
We are sending copies of this report to the Secretary of Defense; the
Secretary of the Army; the Secretary of the Navy; the Secretary of the
Air Force; the Commandant of the Marine Corps; the Director, Defense
Logistics Agency; the Director, Office of Management and Budget; and
other interested congressional committees and parties. We will also make
copies available to others upon request. In addition, the report will be
available at no charge on the GAO Web site at [hyperlink,
http://www.gao.gov].
Please contact me at (202) 512-4412 if you or your staff have any
questions concerning this report. Major contributors to this report are
included in appendix III.
Signed by:
Charles I. Patton, Jr.
Director, Defense Capabilities and Management:
[End of section]
Appendix I: Scope and Methodology:
To determine how customers perceived the quality of service they
received, we examined customer satisfaction studies and surveys such as
the Defense Logistics Agency‘s (DLA) fiscal year 2000 and fiscal year
2001 quarterly satisfaction surveys and the Joint Staff Combat Support
Agency Review Team‘s 1998 and 2001 assessments. In addition, we
performed a case study analysis using a judgmentally selected sample of
DLA customers that included the use of structured interviews to identify
customers‘ perceptions and levels of satisfaction with DLA service. The
details of our customer selection process, interview techniques, and
sampling methodology follow:
* We initially selected customers using DLA-provided databases of its
"top" military customers, which DLA primarily based on sales volume. DLA
identified customers by Department of Defense Activity Address Codes
(DODAACs) or military installation. We compiled the DLA information
into a single database that included over 800 customer records
accounting for about $5.6 billion of DLA‘s total $7.8 billion nonfuel
supply sales (about 72 percent) to the military services for fiscal
year 1999, the most recent available data at the time of our review.
* We judgmentally selected customers from the database to maximize our
coverage of the following significant variables: dollar sales,
geographic location, DLA-defined customer type (i.e., deployed and
deployable forces, industrial organizations, training activities, and
the "other" segment), commodity type, and military service branch. We
did not validate the accuracy of the DLA sales data, since the data‘s
purpose was to provide us with general customer sales activity.
* Because the DLA-provided customer DODAAC and installation data did
not provide us with sufficient information about specific customer
organizations and related points of contact, we held discussions with
DLA and military service officials to further define customers and
subsequently visited those customer organizations and activities.
* We conducted over 50 structured interviews with customers at more than
20 selected activities. We designed the interview questions on the
basis of aspects of DLA‘s supply process: submitting requisitions,
following up on the status of open requisitions, contacting DLA for
customer service, and receiving supplies. We also discussed other
factors related to DLA support, such as the availability, price, and
quality of DLA-provided supply items. Some customers did not express an
opinion on the overall quality of customer service.
* Our initial sample of DLA customers included customers from more than
20 locations throughout the continental United States and overseas,
covering multiple customer types within each military service. However,
because of the September 11, 2001, terrorist attacks on the World Trade
Center in New York and the Pentagon in Washington, D.C., we did not
complete our planned visits. As a result, we limited our visits to eight
military service customer locations within the continental United
States, as shown in figure 4. Our selection of customers included all
four military services and each of the DLA customer types except for
deployed forces.
* Because we did not draw a statistical sample and we limited our
selection of customers, the results of our work cannot be projected to
DLA as a whole. However, DLA surveys, Combat Support Agency Review Team
assessments, and comments from DLA officials suggest that many of the
issues we raise are systemic problems.
Figure 4: DLA Customer Locations Visited by GAO:
[See PDF for image]
This figure is a map of the continental United States, depicting the
following sites visited by GAO:
Marine Corps Base, Quantico, VA:
Langley Air Force Base, VA:
U.S. Atlantic Fleet/Fleet and Industrial Supply Center, Norfolk, VA:
Fort Bragg, NC:
Camp Lejeune/Marine Corps Air Station, New River, NC:
Marine Corps Logistics Base, Albany, GA:
Corpus Christi Army Depot, TX:
Tinker Air Force Base/Oklahoma City Air Logistics Center, OK.
[End of figure]
To determine how useful the agency‘s approaches are for obtaining
customer service feedback, we met with DLA headquarters officials to
discuss current processes and planned initiatives for measuring customer
service and obtaining feedback. We also discussed feedback mechanisms,
such as the use of DLA customer support representatives and quarterly
surveys, with DLA customers. We reviewed relevant reports,
briefing documents, and other key information related to the agency‘s
processes and mechanisms for soliciting customer feedback. Additionally,
we examined the agency‘s customer feedback survey techniques and
methods, such as the use of quarterly mail-out surveys and focus
groups.
Furthermore, we conducted an extensive literature search of best
practice organizations to determine popular techniques for collecting
customer feedback, and their advantages and disadvantages.
To determine whether there are opportunities to enhance DLA‘s
initiatives to improve customer service, we performed a comparative
analysis between DLA‘s current practices and planned initiatives, and
best practices that we identified through extensive literature
searches. We reviewed related DLA planning documents and met with
agency officials to discuss the agency‘s plans. Through our literature
search, we identified relevant research performed in the area of best
practices in customer satisfaction. We reviewed a number of pertinent
studies and held discussions with customer satisfaction experts from
industry and academia to identify methods and techniques used in
leading organizations to obtain meaningful feedback from their
customers.
We performed our work from March 2001 to June 2002 in accordance with
generally accepted government auditing standards.
[End of section]
Appendix II: Comments from the Department of Defense:
Deputy Under Secretary Of Defense For Logistics And Materiel Readiness:
3500 Defense Pentagon:
Washington, DC 20301-3500:
August 12, 2002:
Mr. Charles I. Patton, Jr.
Director, Defense Capabilities and Management:
U.S. General Accounting Office:
Washington, D.C. 20548:
Dear Mr. Patton:
This is the Department of Defense (DoD) response to the General
Accounting Office (GAO) draft report, GAO-02-776, "Defense Logistics:
Improving Customer Feedback Program Could Enhance DLA's Delivery of
Services," dated July 8, 2002 (GAO Code 350159).
The DoD agrees that DLA needs to increase its focus on customer
satisfaction, and generally concurs with the draft report's specific
recommendations. Detailed comments on the GAO recommendations are
provided in the attachment. The DoD appreciates the opportunity to
comment on the draft report.
Sincerely,
Signed by:
Allen W. Beckett:
Principal Assistant:
Attachment: As stated:
GAO Draft Report Dated July 8, 2002:
(GAO CODE 350159):
"Defense Logistics: Improving Customer Feedback Program Could Enhance
DLA's Delivery of Services"
Department Of Defense Comments To The GAO Recommendations:
Recommendation 1: To improve DLA's ability to determine its customers'
needs, identify solutions for better meeting those needs, improve the
supply readiness of military units, and improve the efficiency and
effectiveness of depot maintenance repair activities, the GAO
recommended that the Secretary of Defense direct the Under Secretary of
Defense for Acquisition, Technology, and Logistics to require the
Director of DLA, as part of the agency's customer relationship
improvement efforts, to take the following actions:
* Develop a comprehensive plan for obtaining customer feedback that
includes but is not limited to the following actions:
- Work with the military services to arrive at a mutually agreed
determination of the military organizations that function as DLA
"customers." In doing so, both DLA and the services should identify
officials accountable for providing and receiving customer feedback.
- Develop a customer feedback program that uses a variety of approaches
such as those depicted in the best practices research discussed in this
report. In developing this program, pilot tests could be used to
determine which approaches meet agency and customer needs.
- Establish milestones for implementing the customer feedback program
and for identifying the office accountable for its implementation.
- Integrate all customer feedback into an overall assessment to provide
managers with a better understanding of customer perceptions and
concerns.
- Establish a process for developing actions in response to issues that
are identified from the customer feedback program and involve customers
in that process.
- Establish processes for providing customers with information on actions
that are being taken to address customer feedback issues.
* Improve the usefulness of its customer survey instruments by identifying
ways to improve customer response rates, such as the use of effective
follow-up procedures.
* Clarify guidance for customer support representatives to ensure that
they are responsible for routinely contacting customers to obtain
customer feedback. (pp. 32-33/GAO Draft Report)
DOD Response: Generally concur. DoD agrees that DLA should work with
the Military Services to identify DLA customers. The National Account
Managers established under DLA's Customer Relationship Management
program will identify the customer organizations in concert with their
negotiating partners. DoD does not concur with the GAO recommendation
that DLA's use of feedback mechanisms should be dictated by the best
practices research discussed in the report. DLA should continue to have
the latitude to use its customer satisfaction measurement resources in
the most efficient manner. DoD concurs that DLA should establish
milestones for implementing the customer feedback program and for
identifying the office accountable for its implementation. DLA has
established such milestones in its Balanced Scorecard Customer Quadrant
and has established the DLA Readiness and Customer Support office as
the accountable office. DoD concurs that DLA should strive to integrate
customer feedback into an overall assessment. DLA does this to the
extent possible in the DLA Readiness and Customer Support office. Once
fully implemented, the Customer Relationship Management program will
provide a comprehensive integration capability. DoD concurs that DLA
should establish a process for responding to issues that are identified
through the customer feedback program. DLA currently reacts to customer
feedback at both the field activity level and the Agency level. As DLA
proceeds with the Customer Relationship Management program, these
processes will be reengineered. DoD concurs that DLA should establish
processes for providing customers with information on actions that are
being taken to address issues identified by customers. DLA currently
involves and informs customers, but will seek ways to improve the
process as Balanced Scorecard implementation progresses. DoD agrees
that DLA should clarify guidance for customer support representatives
to ensure that they are responsible for routinely contacting customers
to obtain customer feedback.
Recommendation 2: The GAO recommended that the Secretary of Defense
direct the Secretaries of the Army, Navy, and Air Force to identify
specific organizations that will be responsible for working with DLA in
establishing a mutually agreed determination of those activities,
organizations, and individuals that function as DLA "customers" and for
working with DLA as it implements its customer feedback program. (p.
33/GAO Draft Report)
DOD Response: Concur. The Defense Logistics Executive Board will be
used as the forum to obtain Military Department input on the specific
organizations that will be responsible for working with DLA on customer
feedback issues.
[End of section]
Appendix III: GAO Contact and Staff Acknowledgements:
GAO Contact:
Kenneth R. Knouse, Jr. (202) 512-9280:
Acknowledgements:
Elizabeth G. Mead, Cary B. Russell, David R. Warren, Jeffrey A. Kans,
Jack Kriethe, David Schmitt, Patricia Albritton, Brian G. Hackett,
Latrealle Lee, and Stanley J. Kostyla also made significant
contributions to this report.
[End of section]
Footnotes:
[1] P.L. 106-398, sec. 917.
[2] Since the early 1990s, DLA has been striving to better define and
refine its understanding of "customer." Currently, the agency defines
its military customers, or war fighters, as those who purchase items,
and directly cause products to be bought or not bought, and the
commanders-in-chief of the military services. For this report, we did
not include DLA‘s interaction with its federal civilian customers.
[3] DLA‘s four supply centers are (1) Defense Supply Center, Columbus,
Ohio, which is responsible for land, maritime and missile support; (2)
Defense Energy Support Center, Fort Belvoir, Va., the lead center for
comprehensive energy solutions, such as contract support and the
management of petroleum-based fuels; (3) Defense Supply Center,
Richmond, Va., which is responsible for air, aviation, and space
support; and (4) Defense Supply Center, Philadelphia, Pa., the lead
center for troop support items, such as food, clothing, and medical
supplies.
[4] DLA performs five major business functions: distributing materiel
ordered from its inventory; purchasing fuels for DOD and the U.S.
government; storing strategic materiel; marketing surplus DOD materiel
for reuse, reutilization, or disposal; and providing numerous
information services, such as item cataloging, for DOD and the U.S. and
selected foreign governments.
[5] The Balanced Scorecard, introduced by Professor Robert Kaplan and
Dr. David Norton in 1992, is a strategic management system for
describing, implementing, and managing strategy at all levels of an
organization by linking objectives, initiatives, and measures to an
organization‘s strategic plan.
[6] Under 10 U.S.C. 193, the Joint Staff conducts a biennial Combat
Support Agency Review, including a review of DLA. The January 2001
review of DLA surveyed the unified commands and Joint Staff directors
with responsibility to the Commander, Joint Chiefs of Staff. The review
focused on the services that DLA provides to the unified commands.
[7] Although customers were satisfied with DLA‘s prime vendor program
in these instances, in recent years, the DOD Office of Inspector
General reported that the program has failed to demonstrate an
effective shift to commercial, industrial-base resources as an
integrated logistics solution or provide the best value for DLA
customers. As a result, the prime vendor program did not reduce total
logistics costs, improve financial accountability, streamline defense
infrastructure, or add value to the defense supply system.
[8] DLA places customer support representatives at selected locations
such as those with high business volume or readiness needs to monitor
the agency‘s overall success of its relations with its customers. The
representatives are to provide a corporate face to particular customer
sites.
[9] See Military Aircraft: Services Need Strategies to Reduce
Cannibalizations, GAO-02-86 (Washington, D.C.: Nov. 21, 2001).
[10] Often, when items are not immediately available, customers can
check excess property listings provided by DLA‘s Defense Reutilization
and Marketing Service to see if the needed parts are available
elsewhere.
[11] In an effort to reduce warehousing costs, DOD decided in 1989 to
consolidate military service and DLA warehousing functions. This
resulted in the collocation of both military service-owned and DLA-owned
parts in the same warehouse, referred to as a Distribution Depot.
[12] A DODAAC is a six-position numeric code that uniquely identifies a
unit, activity, or organization that has the authority to requisition
and/or receive materiel.
[13] See J. Anton and D. Perkins, Listening to the Voice of the
Customer: 16 Steps to a Successful Customer Satisfaction Measurement
Program, The Customer Service Group (New York: 1997).
[14] See M.S. Garver and R.L. Cook, "Best Practice Customer Value and
Satisfaction Cultures," Mid-American Journal of Business, vol. 16, no.
1 (2001); M.S. Garver, "Modeling Best Practices for Government
Agencies: Implementing Customer Satisfaction Programs" (Jan. 28, 2002);
Best Practices, LLC, "Achieving World-Class Customer Service: An
Integrated Approach" (copyright 1998-2001); and Federal Benchmarking
Consortium, Serving the American Public: Best Practices in Customer-
Driven Strategic Planning (Feb. 1997).
[15] Best practice companies used in the research met at least three
of the following criteria: "has won a quality award such as the Malcolm
Baldrige award; has been discussed as a best practices company in a
respected publication; has presented a best practice at a customer
value and satisfaction practitioner conference; is respected as a
leading edge company; met the researchers' best practices
characteristics such as innovation, uniqueness, and high data
utilization rates."
[16] See M.S. Garver, "Listening to Customers," Mid-American Journal
of Business, vol. 16, no. 2 (2001).
[17] Relationship surveys are viewed as traditional customer
satisfaction surveys and are administered to customers typically on a
quarterly basis; transaction surveys are typically short in length and
are used to measure the performance of a specific process. They are
administered immediately following a certain type of service encounter,
event, or interaction with the customer.
[End of section]
GAO‘s Mission:
The General Accounting Office, the investigative arm of Congress,
exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability
of the federal government for the American people. GAO examines the use
of public funds; evaluates federal programs and policies; and provides
analyses, recommendations, and other assistance to help Congress make
informed oversight, policy, and funding decisions. GAO‘s commitment to
good government is reflected in its core values of accountability,
integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through the Internet. GAO‘s Web site [hyperlink,
http://www.gao.gov] contains abstracts and full-text files of current
reports and testimony and an expanding archive of older products. The
Web site features a search engine to help you locate documents using
key words and phrases. You can print these documents in their entirety,
including charts and other graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document
files. To have GAO e-mail this list to you every afternoon, go to
[hyperlink, http://www.gao.gov] and select "Subscribe to daily E-mail
alert for newly released products" under the GAO Reports heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. General Accounting Office:
441 G Street NW, Room LM:
Washington, D.C. 20548:
To order by Phone:
Voice: (202) 512-6000:
TDD: (202) 512-2537:
Fax: (202) 512-6061:
To Report Fraud, Waste, and Abuse in Federal Programs Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Public Affairs:
Jeff Nelligan, managing director, NelliganJ@gao.gov:
(202) 512-4800:
U.S. General Accounting Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548: