Tax Administration
Planning for IRS's Enforcement Process Changes Included Many Key Steps but Can Be Improved
GAO ID: GAO-04-287 January 20, 2004
GAO-04-287, Tax Administration: Planning for IRS's Enforcement Process Changes Included Many Key Steps but Can Be Improved
This is the accessible text file for GAO report number GAO-04-287
entitled 'Tax Administration: Planning for IRS's Enforcement Process
Changes Included Many Key Steps but Can Be Improved' which was released
on February 19, 2004.
This text file was formatted by the U.S. General Accounting Office
(GAO) to be accessible to users with visual impairments, as part of a
longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to the Chairman and Ranking Minority Member, Committee on
Finance, U.S. Senate:
January 2004:
TAX ADMINISTRATION:
Planning for IRS's Enforcement Process Changes Included Many Key Steps
but Can Be Improved:
GAO-04-287:
GAO Highlights:
Highlights of GAO-04-287, a report to the Committee on Finance, U.S.
Senate
Why GAO Did This Study:
In recent years, the Internal Revenue Service (IRS) has experienced
declines in most of its enforcement programs, including declines in
audits and in efforts to collect delinquent taxes. Increasing
enforcement productivity is one strategy that can help reverse these
declines. To this end, IRS is currently planning and has begun
implementing enforcement process improvement projects.
GAO was asked to assess the extent to which the planning for the
projects followed steps consistent with both published GAO guidance
and the experiences of private sector and government organizations.
Specifically, GAO assessed the extent to which four judgmentally
selected projects followed the 20 planning steps summarized in the
figure.
What GAO Found:
Planning for the four enforcement process improvement projects GAO
reviewed included most of the 20-step framework developed to assess
the projects. This increases the likelihood that projects target the
right processes for improvement, choose the best target process from
among alternatives, effectively implement the project, accurately
assess project outcomes, and properly manage the change to the new
process. However, none of the projects completed all of the steps. For
example, some projects did not fully identify the causes of
productivity shortfalls, leaving a risk that the project did not fix
the right problem. In the course of this work, GAO found that IRS
managers do not have guidance about the steps to follow in planning
process improvement projects, increasing the possibility of omitting
steps.
A recurring issue in the four projects was that IRS's enforcement data
only partially adjust for the complexity and quality of cases worked.
This issue is also a problem for IRS enforcement productivity data
generally. Failing to adjust for both complexity and quality increases
the risk that trends in productivity will be misunderstood. For
example, a decline in the number of cases closed per employee at the
same time that case complexity is increasing may not be a real decline
in productivity. GAO recognizes that some options for improving
productivity data could be costly. However, costs could be mitigated
by using existing statistical methods and IRS complexity and quality
data.
What GAO Recommends:
GAO recommends that the Commissioner of Internal Revenue take actions
to (1) put in place a framework to guide the planning for process
improvement projects and (2) invest in better enforcement program
productivity data, recognizing the costs and benefits of doing so.
www.gao.gov/cgi-bin/getrpt?GAO-04-287.
To view the full product, including the scope and methodology, click
on the link above. For more information, contact James White at (202)
512-9110 or whitej@gao.gov.
[End of section]
Contents:
Letter:
Results in Brief:
Background:
SB/SE Process Improvement Projects Included Most Key Steps but
Productivity Measurement Could Be Improved:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendixes:
Appendix I: Process Improvement Project Framework:
Decision to Change:
Target Process Development:
Implementation:
Outcome Assessment:
Change Management:
Appendix II: Scope and Methodology:
Appendix III: GAO Assessment of Four Selected SB/SE Process Improvement
Projects:
Appendix IV: Descriptions of SB/SE Process Improvement Projects:
Appendix V: Comments from the Internal Revenue Service:
Table:
Table 1: Major Process Improvement Projects in IRS's Small Business/Self
Employed Operating Division:
Figures:
Figure 1: 20-Step Process Improvement Framework:
Figure 2: Key Steps Included in Selected SB/SE Process Improvement
Projects:
Figure 3: 20-Step Process Improvement Framework:
Figure 4: Key Steps Included in the Field Examination Reengineering
Project:
Figure 5: Key Steps Included in the Compliance Support Case Processing
Redesign Project:
Figure 6: Key Steps Included in the Collection Taxpayer Delinquent
Account Support Project:
Figure 7: Key Steps Included in the Collection Field Function
Consultative Initiative Project:
Abbreviations:
CQMS: Collection Quality Measurement System:
EQMS: Examination Quality Measurement System:
GAO: General Accounting Office:
IRS: Internal Revenue Service:
SB/SE: Small Business/Self Employed Division:
Letter January 20, 2004:
The Honorable Charles E. Grassley:
Chairman:
The Honorable Max Baucus:
Ranking Minority Member:
Committee on Finance:
United States Senate:
In recent years, as we have reported, the Internal Revenue Service
(IRS) has experienced declines in most of its enforcement programs,
including steep declines in audits and broad declines in its efforts to
collect delinquent taxes.[Footnote 1] Factors we have cited as
contributing to these declines include the growth in tax returns filed
each year, reallocation of enforcement resources to improve telephone
and other services to taxpayers, the addition of new taxpayer rights
and protections by the IRS Restructuring and Reform Act of
1998,[Footnote 2] decreased enforcement staffing, and decreased
enforcement productivity as measured by cases closed per staff
time.[Footnote 3] The declines have triggered concerns that taxpayers'
motivation to voluntarily comply with their tax obligations could be
undermined.
Increasing enforcement productivity is one strategy that could help
reverse these declines. To this end, IRS's Small Business/Self Employed
(SB/SE) operating division is currently planning and has begun
implementing 15 enforcement process improvement projects. These
projects aim to make incremental improvements to enforcement processes
and are distinct from the major changes expected to result from IRS's
long-term business systems modernization effort. Because of your
concern about the declines in IRS's enforcement programs, including the
reported declines in their productivity, you asked us to examine IRS's
planning of its enforcement process improvement projects. Specifically,
as agreed with your offices, our objective was to assess the extent to
which SB/SE's planning followed steps consistent with both GAO guidance
and the experiences of private sector and government organizations. Our
focus was on SB/SE's planning and development of its projects; we did
not evaluate the results of any projects because only 1 of the 15
projects had started implementation in August 2003, when we were
choosing projects to review, and that project was too new to be
evaluated.
To assess SB/SE's planning, we needed to identify criteria--the steps
to follow in planning process improvement projects. We developed a
20-step framework based on both GAO's Business Process Reengineering
Assessment Guide and discussions with private sector and other
government managers with experience planning process improvement
projects.[Footnote 4] GAO's Guide recognizes that the steps for
planning process improvement need to be adapted for the magnitude of
the projects and the particular circumstances of an organization. To do
this, we first held a roundtable meeting with the Private Sector
Council and two of its member companies, Northrop Grumman and
CNF.[Footnote 5] We also discussed process improvement planning with
managers from the tax departments of California, Florida, and
Minnesota; and the Minerals Management Service in the U.S. Department
of the Interior. We then used GAO's guidance and the experiences of the
above organizations to develop 20 key planning steps appropriate for
SB/SE's incremental improvement projects. The 20 key steps, organized
by project stages, are summarized in figure 1.
Figure 1: 20-Step Process Improvement Framework:
[See PDF for image]
[End of figure]
The first six steps are related to an organization's decision to change
a process. These steps help an organization to understand the extent
and causes of any weaknesses that need to be addressed. For example, in
step 1, an organization should begin investigating change by using
productivity data to define a baseline. Productivity measures generally
take the form of a ratio of outputs to inputs. The remaining steps in
this stage refine the organization's understanding of current
performance and where the organization wants to be.
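To make step 1 concrete, a productivity baseline of the kind described above can be sketched as a simple output-to-input ratio; the sketch below uses hypothetical figures, not IRS data:

```python
# Illustrative productivity baseline: outputs (cases closed) over
# inputs (full-time equivalent staff). All figures are hypothetical.
def productivity(cases_closed, fte_staff):
    """Cases closed per full-time equivalent employee."""
    return cases_closed / fte_staff

baseline = productivity(cases_closed=12_000, fte_staff=400)  # 30.0 cases per FTE
current = productivity(cases_closed=10_500, fte_staff=400)   # 26.25 cases per FTE

# Percent change against the baseline, the figure a manager would track.
percent_change = (current - baseline) / baseline * 100       # -12.5 percent
```

A baseline of this form is only a starting point; as the report discusses, the raw ratio must still be interpreted in light of case complexity and quality.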
The next six steps help an organization develop the new, or target,
process. These steps emphasize understanding and analyzing alternatives
and planning for implementation.
The next four steps concern an organization's implementation of the new
process. These steps help an organization through the most difficult
phase of process improvement, where ideas about the new process are
turned into actions. Two of these steps are related to pilot testing.
Pilot testing allows corrective actions to be taken if needed to
correct problems prior to full implementation. The other two steps
address employee responsibilities.
In the next stage, an organization makes plans to assess the outcome of
the new process. The single step in this stage ensures that an
organization makes plans early for evaluating the success of the new
process. It is important to develop assessment plans prior to full
project implementation in order to ensure that the data necessary for
evaluation are collected.
The last three steps are related to how an organization manages the
change to the new process. Successfully managing change reduces the
risk that improvement efforts will fail due to the natural resistance
to change within an organization.
Our 20-step framework emphasizes productivity measurement and analysis
both because the SB/SE projects focus on productivity gains and because
the managers we consulted, particularly our roundtable participants,
said that meaningful productivity data are an important foundation for
design and implementation of process improvement efforts.
We recognized in our review that there is some judgment involved in
defining these steps and that some steps may not be appropriate on
every project. GAO's Business Process Reengineering Assessment Guide
notes that the guide only provides a general framework for assessing
key issues. It says that it should not be regarded as a specific, rigid
set of steps for conducting a project because reengineering is far too
situational for such a rigid approach.[Footnote 6] The same caveat
applies to the process improvement 20-step framework we developed for
this review. Appendix I provides an expanded discussion of each
planning step we identified as appropriate for SB/SE's projects.
We judgmentally selected four projects to study in detail, including at
least one project in each of the three main enforcement areas that
SB/SE was revamping--audits (or examinations), collection, and compliance
support. We also looked for projects that were sufficiently far along
in their planning for us to expect to see either completed steps or
plans for the remaining steps.
For the four projects we selected, we assessed the degree to which IRS
followed our 20-step framework by first interviewing officials and
examining the extensive documentation they provided. We then returned
to officials responsible for each project and asked for additional
information, particularly in areas where our initial assessment was
that key steps were not taken. Where IRS officials showed us that
certain steps had been addressed, we then revised our initial
assessment. We also recognized the need for flexibility in the
application of our criteria, in that some of the steps we identified
may not necessarily be appropriate for every project. For instance, one
project team determined that a pilot test was not necessary; because no
pilot was run, the step of adjusting the process based on pilot results
was listed as "not applicable" in our assessment. We describe
the development of our framework and our assessment of selected SB/SE
projects in detail in appendix II.
Results in Brief:
Planning for the four projects we reviewed included most of the 20-step
framework we developed to assess SB/SE's planning of enforcement
process improvement projects. By including so many key planning steps
in its projects, SB/SE increased the likelihood that project teams
target the right processes for improvement, choose the best target
process from among alternatives, effectively implement the project,
accurately assess project outcomes, and properly manage the change to
the new process. However, none of the projects completed all of the
steps. For example, some projects did not fully identify the causes of
productivity shortfalls, leaving a risk that the projects would not fix
the right problem. In the course of this work, we found that SB/SE
managers do not have guidance about the steps to follow in planning
process improvement projects, increasing the possibility of omitting
steps. GAO's Business Process Reengineering Assessment Guide notes that
a framework, such as the one in this report, could help ensure that key
steps are followed.[Footnote 7]
A recurring issue in the four projects we examined in detail was that
SB/SE's enforcement data only partially adjust for the complexity and
quality of cases worked. This issue is also a problem for SB/SE
enforcement productivity data generally. Failing to adjust for both
complexity and quality increases the risk that trends in productivity
will be misunderstood. For example, a decline in the number of cases
closed per employee at the same time that case complexity is increasing
may not be a real decline in productivity--more complex cases may
require more time per case. We recognize that some options for
improving productivity data, such as collecting more data on complexity
and quality, could be costly. However, costs could be mitigated by
using existing statistical methods to analyze SB/SE's current
complexity and quality data.
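One way such a statistical adjustment might work--a hypothetical sketch with illustrative complexity weights, not a method the report or IRS prescribes--is to weight case closures by complexity before computing the per-employee ratio, so that a shift toward harder cases does not register as a spurious productivity decline:

```python
# Hedged sketch of complexity-adjusted productivity. The weights and
# case counts below are hypothetical, chosen only for illustration.

# Hypothetical complexity weights: a simple case counts as 1.0 unit of
# output, a complex case as 2.5 units.
WEIGHTS = {"simple": 1.0, "moderate": 1.5, "complex": 2.5}

def adjusted_productivity(closures_by_complexity, fte_staff):
    """Complexity-weighted cases closed per full-time equivalent employee."""
    weighted_output = sum(WEIGHTS[level] * count
                          for level, count in closures_by_complexity.items())
    return weighted_output / fte_staff

# Year 2 closes fewer raw cases, but the caseload has shifted toward
# more complex work.
year1 = {"simple": 800, "moderate": 150, "complex": 50}   # 1,000 closures
year2 = {"simple": 500, "moderate": 250, "complex": 150}  # 900 closures

raw_y1 = sum(year1.values()) / 40   # 25.0 cases per FTE
raw_y2 = sum(year2.values()) / 40   # 22.5 per FTE: looks like a decline

adj_y1 = adjusted_productivity(year1, 40)  # 28.75 weighted units per FTE
adj_y2 = adjusted_productivity(year2, 40)  # 31.25: output actually rose
```

The unadjusted ratio falls while the complexity-adjusted ratio rises, which is precisely the misreading the report warns against when data are not adjusted for the mix of cases worked.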
To improve the planning of future enforcement process changes, we are
recommending that the Commissioner of Internal Revenue ensure that SB/
SE (1) puts in place a framework to guide planning of future process
improvement projects and (2) invests in enforcement productivity data
that better adjust for complexity and quality, taking into
consideration the costs and benefits of doing so.
In commenting on a draft of this report, the Commissioner of Internal
Revenue agreed with our first recommendation and agreed in principle
with our second, but also raised concerns about cost and feasibility.
The Commissioner's comments are discussed later in this report. The
Commissioner's letter is reprinted in appendix V.
Background:
SB/SE is one of IRS's four business operating divisions. SB/SE is
responsible for enforcement, taxpayer education, and account services
for about 45 million taxpayers, including 33 million self-employed
taxpayers and 7 million small businesses with assets of less than $10
million. SB/SE also performs some collection functions for other IRS
operating divisions.
SB/SE managers told us that the reorganization of IRS in 2000--
including the creation of SB/SE--presented an opportunity for them to
examine enforcement-related processes from a new perspective. Prior to
this, the agency was organized around functional and geographic lines,
with separate groups responsible for activities such as processing
returns, audits, and collection in particular areas. The reorganization
eliminated or substantially modified this national, regional, and
district structure and established organizational units serving
particular groups of taxpayers with similar needs. Officials told us
that with the reorganization, they were now responsible for functions
that they had not controlled so directly before. They said that there
was general agreement among the managers of the newly created division
that there were opportunities to make processes more efficient and
effective, and that this led them to start several enforcement process
improvement projects. They also distinguished between enforcement
process improvement projects, which are generally incremental in their
approach, and more far-reaching efforts to modernize IRS and transform
processes through business systems modernization and other significant
changes. We noted in our recent Performance and Accountability Series
that IRS has made important progress in these larger efforts but its
transformation continues to be a work in progress.[Footnote 8]
Though many of the SB/SE projects include the word "reengineering" in
their titles, SB/SE managers agreed that process improvement projects
was a better description, given the scope of the changes these projects
were making. As described in GAO's Business Process Reengineering
Assessment Guide, reengineering entails fundamentally rethinking how an
organization's work should be done while process improvement efforts
focus on functional or incremental improvements.[Footnote 9] SB/SE
managers explained that they purposefully avoided technology-driven
changes of the sort under development in the IRS-wide business systems
modernization effort. They said that their goal was to make shorter
term, more SB/SE-focused changes in the meantime, while the more
sweeping changes, and their longer planning and implementation
horizons, were still years away from completion. In this report, we
refer to the 15 SB/SE efforts under way as of November 2003 as "process
improvement projects.":
We have reported on declining enforcement trends, finding in 2002 that
there were large and pervasive declines in six of eight major
compliance and collection programs we reviewed, with the only
exceptions in returns processing and in the automated underreporter
program. In addition to these declines, we reported on the large and
growing gap between collection workload and collection work completed
and the resultant increase in the number of cases where IRS has had to
defer collection action on delinquent accounts.[Footnote 10] In 2003,
we reported on the declining percentage of individual income tax
returns that IRS was able to examine each year, with this rate falling
from .92 percent to .57 percent between 1993 and 2002.[Footnote 11] We
also reported on enforcement productivity measured by cases closed per
full-time equivalent employees, finding that IRS's telephone and field
collection productivity declined by about 25 percent from 1996 through 2001 and
productivity in IRS's three audit programs--individual, corporate, and
other audit--declined by 31 to 48 percent.[Footnote 12]
Improving productivity by changing processes is a strategy SB/SE is
using to address these declining trends. As of November 2003, SB/SE had
15 ongoing process improvement projects under way, most of them in
three broad enforcement areas--audit, collection, and compliance
support. Audit projects entail changes to field and office examination
processes.[Footnote 13] Collection projects include changes to
automated collection programs, field collections, and other
programs.[Footnote 14] Compliance support is the term SB/SE uses to
describe processing functions related to audit and collection such as
updating IRS information systems for the results of enforcement work
and preparing examination closing letters and liens on taxpayer
property. Compliance support projects include changes to technical
services and case processing.
We selected four SB/SE process improvement projects to review in detail
for this report. Field Examination Reengineering includes changes to
preaudit processes to better identify specific issues on tax returns
for auditors to focus on, among other changes intended to improve
examination efficiency and reduce taxpayer burden. The Compliance
Support Case Processing Redesign project seeks to centralize data entry
into online information systems that monitor the status of active audit
and collection cases and their results from many different locations
with widely variable workload to just a few with more consistent,
predictable workload. The Collection Taxpayer Delinquent Account
Support Project involves the development of two computer models to
improve setting priorities for collections cases to assign to
collections staff. The Collection Field Function Consultative
Initiative seeks to improve timeliness on collections cases through
regular managerial involvement as cases are being worked. Brief
descriptions of all of SB/SE's projects can be found in appendix IV.
SB/SE Process Improvement Projects Included Most Key Steps but
Productivity Measurement Could Be Improved:
SB/SE process improvement project teams completed most of the steps we
identified as key to SB/SE's process improvement project planning, but
none of the projects we reviewed completed all of the key steps.
Guidance on project planning steps, such as our 20-step framework,
could help ensure that key steps are followed more consistently. Also,
SB/SE enforcement productivity data presented problems in that the data
available to SB/SE managers to assess the productivity of their
enforcement activities, identify processes that need improvement, and
assess the success of their process improvement efforts are only
partially adjusted for complexity and quality of cases worked.
Enforcement Process Improvement Projects Included Many Key Steps but
SB/SE Lacked a Planning Framework:
The planning for each of the four projects we reviewed included most of
the key steps in our process improvement framework, but none of the
projects included all of the steps. Figure 2 presents our findings,
organized by project stages, for each of the four projects we studied.
A full circle means a step was fully completed in project planning and
a partial circle means that only part of a step was completed. Our
basis for each "no" or "partial" finding is explained in appendix III.
Following figure 2, we discuss our findings in more detail with
selected examples from the four projects we reviewed.
Figure 2: Key Steps Included in Selected SB/SE Process Improvement
Projects:
[See PDF for image]
[A] SB/SE has not completed the implementation plan for this step in
the project.
[B] SB/SE has not completed the monitoring and evaluation plan for this
project.
N/A means a step was not applicable.
[End of figure]
The four SB/SE projects we reviewed largely included the productivity
baseline definition and process mapping steps under the "Decision to
Change" stage, where SB/SE had to determine whether any of its
processes should be improved. The Field Examination Reengineering
project team and both collection project teams had baseline data
showing that the time needed to complete casework was rising and all
four project teams had extensive flowcharts mapping the details of
current processes. By helping managers understand the strengths and
weaknesses of current processes, such information contributes to more
informed decisions about which processes to change.
However, SB/SE did not as consistently include the complexity and
quality of work being done in productivity baselines, compare
productivity data to external benchmarks, identify root causes of
productivity declines, or measure the gap between current and desired
productivity. Weaknesses in these steps leave SB/SE managers without
information that could be useful when making decisions about which
processes to change. For example, on three of the four projects,
productivity data were not adjusted for case complexity and only
partially adjusted for quality. This could cause productivity trends to
be misinterpreted, leaving SB/SE at risk of trying to redesign
processes that are already working well or missing opportunities to fix
processes with potential for improvement. Because GAO's Business
Process Reengineering Assessment Guide and our roundtable participants
stressed the importance of complete productivity data and because this
was a recurring issue we identified in our assessment of the four SB/SE
projects, we discuss the importance of adjusting for case complexity
and quality when measuring productivity in more detail in the next
section of this report.[Footnote 15]
Another example of not consistently following our key steps in the
"Decision to Change" stage is found in the Field Examination
Reengineering project. The project team sought the advice of many
large, noted organizations to benchmark its productivity. However, the
work did not lead to measuring how SB/SE's productivity compared to
others' because the team did not believe that operations in other
organizations were comparable. Without this benchmarking, the team did
not know whether and by how much it could improve productivity by
updating operations based on the experiences of other large
organizations. Both GAO's Business Process Reengineering Assessment
Guide and our roundtable participants stressed that although processes
may seem unique to government, they likely have counterparts at a
process level in the private sector. Moreover, GAO's Guide says that
looking at dissimilar organizations can actually lead to the most
fruitful improvements because it stimulates thinking about new
approaches.
During the "Target Process Development" stage, the projects we reviewed
consistently included the steps that prepare for implementation.
Planning on all four of the projects we studied included obtaining
executive support, assessing barriers to implementing changed
processes, and assessing resource needs and availability. The
Compliance Support Case Processing Redesign team, for example,
originally identified the need for a computer programming change to
implement part of their process redesign. When the programming change
could not be made immediately, they continued with a manual process in
order to keep the project moving forward.
However, SB/SE less consistently included key steps in this stage
related to designing the new process. For example, in the Collection
Taxpayer Delinquent Account Support project, SB/SE did not consider
alternatives to achieving the project's goal of identifying the best
cases to assign to collections staff. Because options were not
considered, the team ran the risk of missing a more effective approach
than the one they took. Another team did not design the new process
based on analysis of a gap between current and desired productivity. It
is important at this stage for projects to include fact-based
performance analysis to assess how to change processes that are in
greatest need of improvement in terms of cost, quality, and timeliness.
By analyzing the gap between an existing process's performance and
where that performance should be, projects can target those processes
that are most in need of improvement, analyze alternatives, and develop
and justify implementation plans. Using these steps can increase the
likelihood of determining the best new process.
During the "Implementation" stage, three of the four projects we
reviewed had completed implementation plans and all three included key
implementation steps. These steps focus on the challenge of turning
project concepts into a workable program. For example, in the
Collection Taxpayer Delinquent Account Support project, the team
clearly defined who was responsible for updating the existing computer
programs to select cases for priority collection action and who was
responsible for evaluating the implemented process. We also found that
three of the four teams conducted pilot tests and used their results to
modify the new processes prior to implementation--steps important for
ensuring that process problems are worked out prior to project
implementation.
SB/SE was less consistent, however, in establishing employee
performance expectations for the new processes. In the Field
Examination Reengineering project, SB/SE plans to implement changes to
audit planning steps in order to streamline audits and reduce demands
on taxpayers for additional information. SB/SE's plan includes
monitoring the deployment of the new process using measures such as the
percent of personnel trained. However, SB/SE's plan does not specify
performance expectations for employees or how it will measure whether
its auditors are using the new techniques properly.
Two projects had completed plans for outcome assessments at the time of
our review. One of these, the Collection Taxpayer Delinquent Account
Support project, included an evaluation plan using available data to
develop measures of how accurately the new models were working. The
other two projects were in the process of developing evaluation plans-
-an important step to ensure that the correct data are available and
collected once the change is implemented.
Three of the four projects incorporated change management principles
throughout. In the fourth, we agreed with SB/SE
managers that change management key steps were not a factor because the
changes to the method of prioritizing collection cases did not affect
collections staff. These are key steps because successful process
improvement depends on overcoming a natural resistance to change and
giving staff the training to implement the changes. The three project
teams where change management was a factor consistently completed all
of the key steps in the "Change Management" stage.
In the course of our discussions with SB/SE managers about the steps
that their projects did and did not include, we learned that SB/SE does
not have its own guidance or framework that describes the steps to be
followed in planning process improvement projects. SB/SE managers said
that projects had been planned and carried out without such a
framework. Contractors provided substantial assistance in designing
SB/SE's process improvement projects, and managers told us that they
relied in large part on the contractor staffs' expertise and experience
in planning the projects.
A framework laying out the steps to be followed is an important
internal control for projects such as these because it provides top
managers assurance that the steps that the organization has determined
to be important are either taken on each project or that project
managers have explained why they should be omitted. GAO's Business
Process Reengineering Assessment Guide notes that an established
framework is important for projects in that it defines in detail the
activities the project team needs to complete and alerts the team to
key issues that it must address.[Footnote 16] Without a process
improvement framework and a consistent set of steps to follow, IRS runs
the risk of future projects also missing key steps. This, in turn,
exacerbates the risk of projects not addressing appropriate process
problems, developing a less than optimal target process, ineffectively
implementing the project, inaccurately assessing project outcomes, or
mismanaging the change to the new process. A framework such as the one
we developed for this report is an important internal control tool for
SB/SE managers to guard against these risks. The internal control is
needed whether process improvement is planned by SB/SE staff or
contractors. Such a framework may also prove useful in other IRS units
besides SB/SE. As with the 20-step framework we used to assess SB/SE's
approach, however, any such guidelines should allow for appropriate
managerial discretion in cases where certain steps are not relevant.
IRS Enforcement Productivity Data Only Partially Adjusted for
Complexity and Quality:
The data available to SB/SE managers to assess the productivity of
their enforcement activities, identify processes that need improvement,
and assess the success of their process improvement efforts are only
partially adjusted for complexity and quality of cases worked.
Productivity measures the efficiency with which resources are used to
produce outputs. Specific productivity measures take the form of ratios
of outputs to inputs such as cases closed or dollars collected per
staff year. The accurate measurement of enforcement productivity
requires data about the quantity of outputs produced and inputs used
that are accurate and consistent over time and that link the outputs
directly to the inputs used to produce them.[Footnote 17] The accurate
measurement of productivity also requires good data on the relative
complexity or difficulty of the cases and the quality of the work done
by IRS staff. Case complexity can vary with the type of tax (employment
vs. income), the type of taxpayer (individual vs. business), and the
type and sources of income and expenses. A measure of productivity like
cases closed per staff year that shows an increase may not indicate a
real gain in efficiency if the mix of cases worked has shifted to less
difficult cases or the quality of the work has declined. This problem
of adjusting for quality and complexity is not unique to SB/SE process
improvement projects--the data available to process improvement project
managers are the same data used throughout SB/SE to measure
productivity and otherwise manage enforcement operations.
SB/SE managers used data on the number of cases completed and the time
it takes to complete them to measure output. These data were usually
only partially adjusted for quality, and only one project adjusted for
complexity. Opportunities to make more such adjustments were missed.
An example of a complete adjustment for complexity is the Compliance
Support Case Processing Redesign team's use of a proxy for complexity.
The project illustrates both the shortcomings of SB/SE's productivity
data and the feasibility of some adjustments using other currently
available information. The team wanted to measure the work needed to
enter examination and collection case data into the information system,
holding complexity constant, but direct measures of complexity were not
available. While developing their new process, the team knew that more
complex cases were to be assigned to higher-grade clerks.[Footnote 18]
The team used the grade of the clerk to adjust output for complexity.
Although not a direct measure of relative complexity, the grade level
of the clerks provided managers a means to adjust for complexity and
better identify performance increases that were due to changes in
productivity by holding complexity constant. Such an adjustment
increases the credibility of the team's estimate that IRS would save up
to 385 positions from the proposed redesign.
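As a purely illustrative sketch (not IRS code), the kind of grade-based complexity adjustment the team made could be expressed as follows; the grade labels, weights, and case counts are all hypothetical assumptions:

```python
# Purely illustrative sketch, not IRS code: adjusting output for case
# complexity using clerk grade as a proxy, as the redesign team did.
# Grade labels, weights, and case counts below are hypothetical.

# Assumption: higher-grade clerks work the more complex cases, so each
# of their closures counts as more "standard" units of output.
GRADE_WEIGHTS = {"GS-4": 1.0, "GS-5": 1.25, "GS-6": 1.75}

def adjusted_output(closures_by_grade):
    """Sum closures weighted by the complexity proxy for each grade."""
    return sum(GRADE_WEIGHTS[g] * n for g, n in closures_by_grade.items())

def productivity(closures_by_grade, staff_years):
    """Complexity-adjusted closures per staff year."""
    return adjusted_output(closures_by_grade) / staff_years

# Both periods close 1,000 cases, so a raw count shows no change, but
# the adjusted measure credits period 2's shift toward complex cases.
period_1 = {"GS-4": 800, "GS-5": 150, "GS-6": 50}
period_2 = {"GS-4": 500, "GS-5": 300, "GS-6": 200}
print(productivity(period_1, 10.0))  # 107.5
print(productivity(period_2, 10.0))  # 122.5
```

Holding complexity constant in this way is what allows a manager to tell an apparent gain (a shift to easier cases) from a real one.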
SB/SE has systems in place that measure quality against current
standards but do not account adequately for changes in standards of
quality. The Exam Quality Measurement System (EQMS) and the Collection
Quality Measurement System (CQMS) use samples of audit and collection
cases, respectively, to determine if IRS standards were followed and
compute scores that summarize the quality of the case.[Footnote 19]
Generally, the scoring is done on a numerical scale. For example, EQMS
uses quality scores that range from 0 to 100. To SB/SE's
credit, most of the projects that we reviewed used EQMS and CQMS scores
in an attempt to control for quality changes. Unfortunately, these
scores may not adequately reflect changes in standards of quality. For
example, the IRS Restructuring and Reform Act of 1998 placed additional
documentation requirements for certain collection actions on SB/SE
collections staff, such as certifications that they had verified that
taxes were past due and that sanctions were appropriate given the
taxpayers' circumstances. SB/SE has changed the standards used in EQMS
and CQMS to reflect the new requirements but has not changed its
quality scale to account for the new, higher level of quality implied
by the new standards. As a result, two exams with the same quality
scores, one done before passage of the act and one after, may not have
the same level of quality. If the way that SB/SE computes its quality
scores does not adequately reflect such changes in quality standards,
an increase in staff time needed to complete the additional requirement
may be misinterpreted as a decline in productivity.
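To illustrate the measurement issue, a quality-adjusted output measure might rescale scores when standards change. This is a hypothetical sketch, not an IRS system, and the stringency factor is an invented assumption:

```python
# Hypothetical sketch, not an IRS system: combining case counts with
# EQMS/CQMS-style quality scores and rescaling when standards change.
# The stringency factor below is an invented assumption.

# Assumption: after new requirements (such as the post-1998 documentation
# rules) raise the bar, a case scoring 90 under the new standard embodies
# more quality work than a case scoring 90 under the old one.
STANDARDS_FACTOR = {"pre_1998": 1.0, "post_1998": 1.1}

def quality_adjusted_output(cases_closed, mean_score, era):
    """Scale raw closures by the 0-100 quality score and by a factor
    reflecting the stringency of the standards in force."""
    return cases_closed * (mean_score / 100.0) * STANDARDS_FACTOR[era]

# Same closures and the same scores, but the later period met stricter
# standards, so its quality-adjusted output is higher.
print(quality_adjusted_output(1000, 90.0, "pre_1998"))             # 900.0
print(round(quality_adjusted_output(1000, 90.0, "post_1998"), 2))  # 990.0
```

Without some such rescaling, the extra staff time needed to meet the stricter standard would look like a productivity decline.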
Opportunities exist to improve SB/SE's enforcement productivity data.
Statistical methods that are widely used in both the public and private
sectors can be used to adjust SB/SE productivity measures for quality
and complexity. In particular, by using these methods, managers can
distinguish productivity changes that represent real efficiency gains
or losses from those that are due to changes in quality standards.
These methods could be implemented using data currently available at
SB/SE. The cost of implementation would be chiefly the staff time
required to adapt the statistical models to SB/SE. Although the
computations are complex, the methods can be implemented using existing
software.[Footnote 20] We currently have under way a separate study
that illustrates how these methods can be used to create better
productivity measures at IRS. We plan to report the results of that
study later in 2004.
We recognize that better incorporating the complexity and quality of
enforcement cases in enforcement productivity data could entail costs
to SB/SE. Collecting additional data on complexity and quality may
require long-term planning and investment of additional resources.
However, as discussed in the previous paragraph, there are options
available now to mitigate such costs. Existing statistical methods
could be used in the short term with currently available data on case
complexity and quality to improve productivity measurement. In
addition, IRS's ongoing business systems modernization effort may
provide additional opportunities for collecting data.
Our roundtable participants stressed the benefits of productivity
analysis. They said that an inadequate understanding of productivity
makes it harder to distinguish processes with a potential for
improvement from those without such potential. GAO's Business Process
Reengineering Assessment Guide also highlighted the importance of being
able to identify processes that are in greatest need of
improvement.[Footnote 21]
Conclusions:
SB/SE deserves recognition for embracing process improvement and for
including so many key steps in planning the projects. To the extent
that IRS succeeds in improving enforcement productivity through these
projects, resources will be made available for providing additional
services to taxpayers and addressing the declines in tax enforcement
programs.
While the SB/SE projects we reviewed included most of the key steps in
our framework, putting guidance in place for future projects to follow
would help ensure that all key steps are included and improve project
planning. The 20-step framework that we developed for this report is an
example of such guidance.
More complete productivity data--input and output measures adjusted for
the complexity and quality of cases worked--would give SB/SE managers a
more informed basis for decisions on how to improve processes. We
recognize that better productivity data will mean additional costs for
SB/SE and that, therefore, SB/SE will have to weigh these costs against
the benefits of better data. GAO currently has under way a separate
study illustrating how data on complexity and quality could be combined
with output and input data to create better productivity measures. This may
prove useful to SB/SE managers as they evaluate the current state of
their productivity measures. We will report the results of that review
later in 2004.
Recommendations for Executive Action:
We recommend that the Commissioner of Internal Revenue ensure that
SB/SE take the following two actions:
* Put in place a framework to guide planning of future SB/SE process
improvement projects. The framework that GAO developed for this report
is an example of such a framework.
* Invest in enforcement productivity data that better adjust for
complexity and quality, taking into consideration the costs and
benefits of doing so.
Agency Comments and Our Evaluation:
The Commissioner of Internal Revenue provided written comments on a
draft of this report in a January 14, 2004, letter, which is reprinted
in appendix V. The Commissioner agreed with our recommendation that IRS
develop a framework to guide future improvement projects. He noted that
SB/SE used outside experts to help direct the projects we discuss in
our report and that the expertise gained from those projects puts the
organization in a position to create a framework for future projects.
In regard to our second recommendation, the Commissioner agreed in
principle with the value of adding to current enforcement productivity
data, but also expressed concerns about cost and feasibility. His
letter also discusses initiatives in progress to improve program
management and monitoring in the short term, as well as his intent to
explore the use of statistical methods to improve enforcement program
productivity measurement and to ensure that they are included in
modernization projects. The careful consideration of costs and benefits
and steps to improve measures in the long term are at the heart of our
recommendation, and we encourage his ongoing commitment to these
efforts.
The Commissioner's letter also notes that employee performance goals--
one of the steps in our framework--must not violate legal restrictions
on the use of certain enforcement data to evaluate employee
performance. We agree and clarified language in our report to make it
clear that our framework step concerns employee performance
expectations, not using enforcement data to evaluate employees or
otherwise imposing production goals or quotas.
In addition to commenting on our recommendations, IRS provided
supplemental data on the results of some reengineering projects.
Reviewing project results was outside the scope of our work, and time
did not permit us to verify the supplemental data that IRS provided.
We conducted our work from September 2002 through November 2003 in
accordance with generally accepted government auditing standards.
As agreed with your offices, unless you publicly announce its contents
earlier, we plan no further distribution of this report until 30 days
after its date. At that time, we will send copies of this report to the
Secretary of the Treasury, the Commissioner of Internal Revenue, and
other interested parties. This report is available at no charge on
GAO's Web site at [Hyperlink, http://www.gao.gov].
If you or your staffs have any questions, please contact me at (202)
512-9110 or David Lewis, Assistant Director, at (202) 512-7176. We can
also be reached by e-mail at [Hyperlink, whitej@gao.gov]
or [Hyperlink, lewisd@gao.gov], respectively. Key
contributors to this assignment were Tara Carter, Kevin Daly, Leon
Green, Landis Lindsey, and Amy Rosewarne.
Signed by:
James R. White:
Director, Tax Issues Strategic Issues Team:
[End of section]
Appendixes:
Appendix I: Process Improvement Project Framework:
The 20-step process improvement framework we identified is broken out into four
broad stages, from deciding to make changes to assessing the results of
the changes. A fifth stage deals with managing the changes being made
and takes place throughout the latter part of the project. Figure 3
places the stages in their chronological order, with the change
management stage shown taking place simultaneously with other stages.
Figure 3: 20-Step Process Improvement Framework:
[See PDF for image]
[End of figure]
Within each of the stages of this framework are specific key steps that
we developed based on GAO guidance and what we learned from managers in
other organizations about the steps they took to ensure that they were
embarking on the right projects, designing and implementing the
projects appropriately, and accurately assessing their projects'
results. The sections below describe the nature and purpose of the key
steps that fall under the different stages. We recognize that some
steps may not be appropriate for some projects and that managers need
to apply judgment in using this or any other process improvement
framework. Development of this framework is described in appendix II.
Decision to Change:
Organizations that base the decision to redesign processes on accurate
productivity data and a clear understanding of current processes
increase the likelihood that a project will avoid misdiagnosing a
problem or designing a less than optimal outcome target. Six key steps
are important to accomplishing this.
Identify Productivity Baseline:
Baseline data are information on the current process that provide the
metrics against which to compare improvements and that can be used in
benchmarking.
Productivity measures the efficiency with which an organization uses
resources, or inputs, to produce outputs. Specific productivity
measures generally take the form of a ratio of outputs to inputs. By
establishing a baseline using such measures, a process improvement can
be measured in terms of real efficiency gains. For example, the
baseline could be used to measure the effect of a successful process
improvement as additional output produced by the organization with the
same or fewer inputs.
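A minimal sketch of such a baseline, using hypothetical numbers:

```python
# Minimal sketch of a productivity baseline as an output-to-input
# ratio and of measuring an improvement against it. All numbers are
# hypothetical.

def cases_per_staff_year(cases_closed, staff_years):
    """Baseline productivity metric: outputs divided by inputs."""
    return cases_closed / staff_years

baseline = cases_per_staff_year(12_000, 100.0)  # before the redesign
after = cases_per_staff_year(12_600, 100.0)     # same inputs, more output

gain = (after - baseline) / baseline
print(f"{gain:.1%}")  # 5.0%
```
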
Include Complexity and Quality in Productivity Measures:
Productivity measures may give misleading information if they do not
account for the relative complexity and quality of outputs and inputs.
A measure of productivity like cases closed per staff year that shows
an increase may not indicate a real gain in efficiency if the mix of
cases has shifted to less difficult cases or the quality of the work
has declined. Besides accounting for complexity and quality, the
organization must also choose the appropriate indicators of its outputs
and inputs and measure them accurately. Organizations like IRS that are
service providers have outputs that often consist of complex,
interrelated activities that, in many cases, may require multiple
indicators of outputs to accurately measure productivity. The specific
data needed depend on the characteristics of particular IRS processes.
For example, the number and type of output indicators appropriate for
services that have direct contact with taxpayers, such as audits, may
be larger and more varied (to reflect the full impact of these services
on taxpayers) than those appropriate for other services with less or no
direct contact, such as forms processing. However, factors like
complexity and quality are necessary for accurate productivity
measurement for any process in IRS, regardless of how the specific
quantitative measures are defined.
Compare Current Productivity to Internal and External Benchmarks:
A benchmark is a measurement or standard that serves as a point of
reference by which process performance is measured. During the
"Decision to Change" stage, benchmarking is the solution-building
component of process improvement through which an organization compares
data used to measure existing internal processes with external data on
similar processes in other organizations, or in other components of the
same organization, to identify process improvement needs and outcome
targets. Through benchmarking, an organization can identify gaps
between its own process performance and that of other organizations or
of components within the same organization.
Benchmarking is a key tool for performance improvement because it
provides "real world" models and reference points for setting ambitious
improvement goals.
Map Current Process:
A process map is a graphical representation depicting the inputs,
outputs, constraints, responsibilities, and interdependencies of the
core processes of an organization. Acceptable modeling tools and other
analysis techniques include flowcharting, tree diagrams, fishbone
diagrams, and business activity maps. It is important that a process
map defines what the components of each process are, as well as the
process's boundaries, dependencies, and interconnections with other
processes. If initial process mapping is done at a high level, more
detailed modeling is necessary to identify all of the current process's
activities and tasks, staff roles and responsibilities, and links and
dependencies with other processes, customers, and suppliers.
Performance data (e.g., costs, time, throughput) for all activities and
tasks should be included on the map, or made available elsewhere. The
people who actually do the work as well as the process owner should
validate the mapping. Regulations, policies, laws, and assumptions
underlying the processes should be identified.
Identify Causes of Weak Performance:
Causal factors are the conditions that initiate the occurrence of an
undesired activity or state. Causal factors that are within the span of
control of an organization should be addressed during the Target
Process Development stage. Causal factors that are beyond the span of
control of an organization should be isolated when identifying a
problem. Examples of causal factors are legal requirements, mix of
inputs, quality of inputs, and staff constraints.
Measure Gap Between Current and Desired Productivity:
An empirical basis for the decision to make a process change is an
important step leading towards an improvement that is optimal and
attainable. An empirical basis can be established by using productivity
data to define the gap between where the organization currently is and
where it wants to be.
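For example (hypothetical figures only):

```python
# Hypothetical figures only: the gap is the shortfall between current
# productivity and the desired (e.g., benchmarked) level.
current = 120.0  # cases closed per staff year today
desired = 135.0  # target level from benchmarking

gap = desired - current
gap_pct = gap / current
print(gap, f"{gap_pct:.1%}")  # 15.0 12.5%
```
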
Target Process Development:
After deciding to undergo a process improvement, an organization can
increase the likelihood of determining the best new process by using
productivity data, assessing implementation barriers, and developing
feasible alternatives.
Understand the Best Practices of Others:
Solutions can be adapted from best practices found in other
organizations. Best practices are processes and procedures that high-
performing organizations use to achieve results. An organization should
evaluate the pros and cons of each best practice and, if possible, apply
its own productivity standards. Ideally, this form of benchmarking
should be done with an external organization. Many processes that seem
unique to the government actually have counterparts in the private
sector, especially in generic areas such as claims processing, loan
management, inventory management, etc. Also, it is important to note
that the other organizations do not have to be particularly similar, or
even do similar work. For example, Xerox used L.L. Bean to improve
order fulfillment. Looking at processes in dissimilar organizations can
actually lead to the most fruitful improvements because it stimulates
new thinking about traditional approaches to doing work.
Analyze Alternatives:
Alternatives are different process designs that would likely result in
the same or a similar outcome. An organization's analysis of
alternative processes should consider benefits, costs, and risks.
Performance results that each alternative could be expected to achieve
should be determined. This can be done using methods such as
prototyping, limited pilot testing, and modeling and/or computer
simulation. In addition to performance, alternatives can be scored by
any number of factors, including feasibility, budget, political appeal,
implementation time, payback time, and risk. The team should explore
each alternative thoroughly enough to convincingly demonstrate its
potential to achieve the desired performance goals and fully describe
the types of technical and organizational changes necessary to support
each goal and, if possible, test key assumptions.
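A hedged sketch of such multi-factor scoring; the factor weights and scores below are invented for illustration:

```python
# Hedged sketch of scoring alternative process designs on multiple
# factors; the factor weights and scores are invented for illustration.

# Assumption: weights reflect the organization's priorities and sum to 1.
WEIGHTS = {"performance": 0.4, "feasibility": 0.2, "cost": 0.2, "risk": 0.2}

# Each alternative scored 1-10 per factor; higher is better (for cost
# and risk, higher means cheaper or less risky).
alternatives = {
    "redesign_A": {"performance": 8, "feasibility": 6, "cost": 5, "risk": 4},
    "redesign_B": {"performance": 6, "feasibility": 9, "cost": 8, "risk": 7},
}

def weighted_score(scores):
    return sum(WEIGHTS[f] * s for f, s in scores.items())

ranked = sorted(alternatives, key=lambda a: weighted_score(alternatives[a]),
                reverse=True)
print(ranked[0])  # the alternative with the best weighted score
```

The weights make the trade-offs explicit, so a team can show why a less ambitious but more feasible design won out.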
Design New Process to Close Productivity Gap:
The selection of a target process from among alternatives needs an
empirical basis in the form of some sort of quantitative analysis. The
decision to improve and forming the target process should be linked by
an analysis of productivity data that shows how the new process can
close the gap between the productivity baseline and the desired
outcome.
Obtain Executive Support:
Executive support should come in the form of an executive steering
committee--a group headed by the organization's leader to support and
oversee the process improvement effort from start to finish. Executive
involvement is important because executives are in a position to build
credible support among customers and stakeholders, mobilize the talent
and resources for a reengineering project, and authorize the actions
necessary to change agencywide operations. An executive steering
committee's roles include defining the scope of the improvement
project, allotting resources, ensuring that project goals align with
the agency's strategic goals and objectives, integrating the project
with other improvement efforts, monitoring the project's progress, and
approving the reengineering team's recommendations. While carrying out
these responsibilities the steering committee must also keep
stakeholders apprised of the reengineering team's efforts.
Assess Barriers to Implementing Changed Process:
Implementation barriers are obstacles that the organization will need
to overcome to implement a new process. Examples of implementation
barriers include political issues, entrenched workplace attitudes or
values, an insufficient number of employees with the skills required
for the redesigned roles, collective bargaining agreements,
incompatible organization or physical infrastructure, current laws and
regulations, and funding constraints. The impact of these barriers and
the costs of addressing them (such as staff training, hiring, and
relocation) need to be factored into the process selection decision. If
the reengineering team determines that the risks and costs of
implementing a preferred new process appear too great, they may need to
pursue one of the less ideal, but more feasible alternatives.
Assess Resource Needs and Availability:
GAO guidance and the other organizations we consulted stress the
importance of ensuring, before taking on a process improvement project,
the availability of staff and other resources necessary to complete
design and implementation of the changed process. Without adequate resources,
an organization undertaking a change runs the risk of an incompletely
implemented project.
Implementation:
A carefully designed process improvement project needs a similarly well
thought-out implementation in order to be successful.
Conduct Pilot Tests:
Pilot tests are trial runs of the redesigned process. Pilot testing is
a tool used to move the organization successfully to full
implementation. Pilot testing allows the organization to (1) evaluate
the soundness of the proposed process in actual practice, (2) identify
and correct problems with the new design, and (3) refine performance
measures. Also, successful pilot testing will help strengthen support
for full-scale implementation from employees and stakeholders.
Adjust Target Process Based on Pilot Results:
Postpilot adjustments are corrective actions taken to address trouble
spots prior to full implementation. Trouble spots can be pinpointed
through the formal evaluation of pilot projects designed to determine
the efficiency and effectiveness of the new process.
Define Roles and Responsibilities:
Process owners are the individuals with the responsibility for the
process being improved. Designating process owners is necessary to
ensure accountability.
Establish Employee Expectations for New Process:
New employee and/or team performance expectations should be established
to account for changes in roles and career expectations caused by the
new process. Measurable indicators that are currently being used to
track and assess employee or team progress should be analyzed to
determine if adjustments will be required after the new process is
implemented. In the case of IRS enforcement activities, the agency must
ensure that the expectations do not violate the legal prohibition on
using tax enforcement results to evaluate employee performance or
imposing or suggesting production quotas or goals.[Footnote 22] In
2002, we reported on IRS's progress towards improving its performance
management system; these changes were brought on, in part, by this
requirement.[Footnote 23]
Outcome Assessment:
Careful assessment of the results of a process improvement project is
important in that it may lead to further changes in the process being
addressed and may suggest lessons for other projects.
Develop Plans to Monitor and Evaluate New Process:
An evaluation plan is a way to collect and analyze data in order to
determine how well a process is meeting its performance goals and
whether further improvements are needed. Good performance measures
generally include a mix of outcome, output, and efficiency measures.
Outcome measures assess whether the process has actually achieved the
intended results, such as an increase in the number of claims
processed. Efficiency measures evaluate such things as the cost of the
process and the time it takes to deliver the output of the process (a
product or service) to the customer. The data needed to conduct outcome
assessments later on need to be identified during project planning to
ensure that they are available and collected once implementation
begins.
Change Management:
Change management focuses on the adjustments that occur in the culture
of an organization as a result of a redesigned process. Research
suggests that the failure to adequately address--and often even
consider--a wide variety of people and cultural issues is at the heart
of unsuccessful organizational transformations. Similarly, in process
improvement efforts, redesigning a process involves not only the
technical or operational aspects of change but also overcoming a
natural resistance to it. Successfully managing change reduces the risk that
improvement efforts will fail due to a natural resistance to change
within an organization.
Establish a Change Management Strategy:
An organization needs to establish a change management strategy that
addresses cultural changes, builds consensus among customers and
stakeholders, and communicates the planning, testing, and
implementation of all aspects of the transition to the new process.
Change management activities focus on (1) defining and instilling new
values, attitudes, norms, and behaviors within an organization that
support new ways of doing work and overcome resistance to change, (2)
building consensus among customers and stakeholders on specific changes
designed to better meet their needs, and (3) planning, testing, and
implementing all aspects of the transition from one organization
structure or process to another. Executive involvement is important for
successful change management. Executive support helps strengthen upper
management's support for the project and serves to reinforce the
organization's commitment to the proposed changes. In a roundtable
meeting held by GAO to obtain the perspectives of the private sector,
one organization mentioned that providing continuous feedback to its
employees is a critical element of a change management program. Its
representatives also described the importance of consistently updating
those employees who would be directly affected by a change initiative. Keeping
employees informed of decisions and recognizing their contributions are
important elements of developing positive employee attitudes toward
implementing process improvement initiatives. Ongoing communication
about the goals and progress of the reengineering effort is crucial,
since negative perceptions could be formed and harden at an early
stage, making the implementation of the new process more difficult to
achieve. If change management is delayed it will be difficult to build
support and momentum among the staff for implementing the new process,
however good it might be.
Establish a Transition Team:
A transition team is a group of people tasked with managing the
implementation phase of process improvement projects. A transition team
should include the project sponsor, the process owner, members of the
process improvement project team, and key executives, managers, and
staff from the areas directly affected by changeover from the old
process to the new. Agency executives and the transition team should
develop a detailed implementation plan that lays out the road to the
new process, including a fully executable communication plan. Unless the
transition team sets up a viable mechanism to keep employees and
external stakeholders informed, the process owners responsible for
managing the project cannot effectively convey the project's goals and
implementation strategy.
Develop Workforce Training Plans:
Training and redeploying the workforce is often a major challenge and
generally requires substantial preparation time. When a process is
redesigned and new information systems are introduced, many of the
tasks workers perform are radically changed or redistributed. Some
positions may be eliminated or cut back, while others are created or
modified. Workers may need to handle a broader range of
responsibilities, rely less on direct supervision, and develop new
skills.
[End of section]
Appendix II: Scope and Methodology:
We began development of a process improvement framework by reviewing
previously developed GAO guidance related to business process
reengineering.[Footnote 24] We also reviewed guidance that GAO has
recently issued on assessment frameworks for other major management
areas.[Footnote 25]
GAO's Business Process Reengineering Assessment Guide recognizes that
the steps for planning process improvement need to be adapted for the
magnitude of the projects and the particular circumstances of an
organization. To supplement the GAO business process reengineering
guidance, we held a half-day roundtable meeting with the Private Sector
Council and two of its member companies, Northrop Grumman (a $25
billion defense enterprise) and CNF (a $5 billion transportation
services company).[Footnote 26] We also discussed process improvement
planning with public sector managers with experience in revamping
complex processes. Through reviews of publicly available information and
discussions with SB/SE staff, we found that the tax agencies in the
states of California, Minnesota, and Florida had gone through
substantial process improvement efforts in recent years. Similarly, the
Department of the Interior's Minerals Management Service had carried
out substantial process improvement projects. We interviewed officials
from these organizations and reviewed documents that they provided. We
then used all of this information to adapt GAO's guidance to a 20-step
framework appropriate to the SB/SE projects.
We judgmentally selected four projects to study in detail from the 15
projects SB/SE had under way. Our goal in selecting projects for
detailed review was to cover at least one project in each of the three
main enforcement areas that IRS was revamping--audit, collection, and
compliance support. We also looked for projects that were sufficiently
far along that we considered it reasonable to expect to see either
completed steps or plans for remaining steps for most of the project.
We selected one project each in the audit and compliance support areas.
We found that two projects under way in the collections area were
sufficiently far along, so we selected both of them for our
detailed review.
For the four projects we selected, we used the documentation previously
provided to us to identify evidence that SB/SE managers had taken or
were in the process of taking the key process improvement project steps
we identified. We then discussed our initial findings with IRS
officials responsible for the four projects and they provided
additional evidence, both orally and in writing, concerning the
elements we had identified as present or not in our initial document
review. We then revised our initial assessments based on the additional
evidence that the officials provided. Our assessments also included
review by a GAO project design specialist, in addition to our usual
quality control procedures.
We also recognized the need for flexibility in the application of our
criteria, in that not all of the steps we identified necessarily make
sense for every project. Where a particular step did not logically
apply to a particular project, we listed it as "not applicable" in our
assessment. For instance, the Collection Taxpayer Delinquent Account
Support project we reviewed in detail did not change processes that
staff were asked to carry out, so we rated the step about developing a
training plan as "not applicable." Where a step was not fully completed
but the project team did a number of elements of the step, we assessed
that step as "partial" in our matrix. We did not evaluate the success
so far or the likelihood of success for any of the projects we
reviewed. We also did not evaluate the effectiveness with which project
steps were completed. For example, we did not evaluate the quality of
the pilot tests.
To determine the usefulness of IRS productivity data as a basis for
determining the direction and eventual success of SB/SE process
improvement efforts, we reviewed the literature on productivity
measurement in tax agencies and in the public sector generally. We also
reviewed studies on productivity measurement in service industries with
functions similar to IRS.
[End of section]
Appendix III: GAO Assessment of Four Selected SB/SE Process Improvement
Projects:
The following four figures provide summaries of the evidence we used to
make specific assessments of four selected SB/SE process improvement
projects.
Figure 4: Key Steps Included in the Field Examination Reengineering
Project:
[See PDF for image]
[End of figure]
Figure 5: Key Steps Included in the Compliance Support Case Processing
Redesign Project:
[See PDF for image]
[A] SB/SE has not completed the implementation plan for this step in
the project.
[End of figure]
Figure 6: Key Steps Included in the Collection Taxpayer Delinquent
Account Support Project:
[See PDF for image]
[End of figure]
Figure 7: Key Steps Included in the Collection Field Function
Consultative Initiative Project:
[See PDF for image]
[A] SB/SE has not completed the monitoring and evaluation plan for the
new process.
[End of figure]
[End of section]
Appendix IV: Descriptions of SB/SE Process Improvement Projects:
SB/SE management capitalized on the opportunity presented by the IRS
reorganization that created its operating division and viewed declining
productivity trends as an impetus for change. SB/SE had 15 distinct
process improvement efforts under way as of November 2003, many with
multiple subprojects. Table 1 provides descriptive information on the
15 projects.
Table 1: Major Process Improvement Projects in IRS's Small Business/
Self Employed Operating Division:
Examination: Project: Office Examination & Field Examination;
Problem identified by IRS:
27 percent increase in calendar time associated with a field audit;
75 percent increase in the hours per Office Examination;
IRS's process improvement goals:
Improve examination efficiency, effectiveness, and consistent
treatment of taxpayers;
Improve examiner's ability to be flexible;
Reduce taxpayer burden through more focused audits;
Increase collaboration between examiner, manager, taxpayer, and
representative;
Status: Full implementation in progress;
Target Completion Date:
Office Exam: Phase 1 - February 2004;
Office Exam: Phase 2 - May 2004;
Field Exam: Phase 1 - in progress;
Field Exam: Phase 2 - February 2004.
Project: Exam Life Cycle;
Problem identified by IRS:
SB/SE examination life cycle from date tax return is filed to
examination closing averages 780 days;
IRS's process improvement goals:
Identify short-term and long-term recommendations to dramatically
reduce examination life cycle;
Status: Recommendations pending;
Target Completion Date:
Short-term recommendations and preliminary long-term recommendations:
March 2004;
Implementation of short-term recommendations and final long-term
recommendations: April 2004.
Compliance Support: Project: Case Processing;
Problem identified by IRS:
Nonprofessional operations dispersed across 86 posts of duty;
Lack of flexibility in changing staff priorities and duties;
Inconsistent processes;
IRS's process improvement goals:
Centralize case processing to four SB/SE campuses;
Implement electronic case closure processing;
Status: Full implementation pending;
Target Completion Date:
Implementation - Between January 2005 and January 2006, with 90
percent completed by June 2005.
Compliance Support: Project: Technical Services;
Problem identified by IRS:
Low-volume, high-complexity programs are handled across all areas,
limiting development of staff expertise;
IRS's process improvement goals:
Realign workload with staff expertise;
Reassign work from professionals to paraprofessionals;
Centralize national and multiarea programs;
Automate responses to frequently asked questions;
Develop program knowledge and subject expertise with specialization;
Status: Full implementation in progress;
Target Completion Date:
Implementation complete - April 2004.
Compliance Support: Project: Collection Operations;
Problem identified by IRS:
Resource constraints limit ability to accomplish work within
established time frames;
Paper-intensive processes;
IRS's process improvement goals:
Identify opportunities for specialization and consolidation of work
processes;
Develop and implement automation initiatives that will reduce paper
processing;
Status: Recommendations pending;
Target Completion Date:
Recommendations due - April 2004.
Collection: Project: Taxpayer Delinquent Accounts Support Project;
Problem identified by IRS:
SB/SE can only work on 40 percent of its top priority delinquent tax
accounts;
Total dollars of tax accounts continues to rise;
IRS's process improvement goals:
Determine the optimal balance between workload and inventory for
telephone and field collection;
Identify characteristics of cases that provide for productive
disposition;
Develop and implement models that predict payment characteristics;
Status: Full implementation complete;
Target Completion Date:
Implementation completed - January 2003.
Collection: Project: Collection Field Function;
Problem identified by IRS:
Average calendar time to complete a collection case is 354 days;
IRS's process improvement goals:
Decrease calendar time and improve quality;
Status: Training and implementation partially completed;
Target Completion Date:
Final phase of training and implementation - March 2004.
Collection: Project: Tax Delinquent Investigations/Nonfiler
Initiative;
Problem identified by IRS:
Appropriate selection criteria must be developed to identify nonfiler
cases;
IRS's process improvement goals:
Identify characteristics of cases that provide for productive nonfiler
cases;
Status: Modeling implementation complete;
Target Completion Date:
Implementation due - July 2004.
Collection: Project: Automated Collection System Operating Model;
Problem identified by IRS:
The volume of phone calls from taxpayers to telephone collection
continues to rise, resulting in capacity issues;
IRS's process improvement goals:
Determine optimal balance in telephone collection between IRS-initiated phone calls to taxpayers and taxpayer calls to IRS;
Status: Site implementation complete;
Target Completion Date:
Site implementation complete - December 2003.
Collection: Project: Organizational Structure;
Problem identified by IRS:
Additional compliance resources are needed for high-risk accounts;
IRS's process improvement goals:
Determine if Taxpayer Education and Communication resources can be
shifted to compliance with minimal impact on achieving mission;
Status: Implementation pending;
Target Completion Date:
Transition plan to be developed by January 2004;
Implementation complete by December 2004.
Collection: Project: Installment Agreement;
Problem identified by IRS:
Currently, 36 percent of installment agreements and 49 percent of
installment agreement dollars default within 5 years;
IRS's process improvement goals:
Segment taxpayers with installment agreements based on probability of
default;
Develop alternative treatment process to leverage current enforcement
tools based on taxpayer risk characteristics to reduce default rates
and increase dollars collected;
Develop protocols for applying the appropriate agreement structures to
taxpayer segments;
Status: Full implementation pending;
Target Completion Date:
Final phase of implementation to be completed in September 2004.
Collection: Project: Collection Cycle Time;
Problem identified by IRS:
Calendar days (cycle time) to process collection work from beginning
to end need to be reduced;
IRS's process improvement goals:
Develop recommendations to decrease cycle time;
Status: Recommendations pending;
Target Completion Date:
Final short-term recommendations - March 2004;
Final long-term recommendations and implementation of short-term
recommendations - April 2004.
Collection: Project: Business Master File Tolerance Modeling;
Problem identified by IRS:
Tolerance levels for business taxpayers in submissions processing and
accounts management need to be set consistently and effectively in
coordination with compliance levels;
IRS's process improvement goals:
Develop a revenue-sensitive model for setting business taxpayer
tolerances;
Status: Implementation in progress; Adjustments pending;
Target Completion Date:
Accounts management tolerances in place in August 2003;
Submission processing phase I - January 2004;
Additional changes based on a new model - January 2005.
Collection: Project: Workload Driver Model;
Problem identified by IRS:
Analysis is needed to identify causes of the increased volume of work
in SB/SE;
IRS's process improvement goals:
Identify unknown causes of the increased volume of work received in
SB/SE Customer Account Services using data-mining techniques and
regression analysis, and develop improvements to mitigate the impact;
Status: Recommendations pending;
Target Completion Date:
Initial recommendations - July 2004.
Source: IRS.
[End of table]
[End of section]
Appendix V: Comments from the Internal Revenue Service:
DEPARTMENT OF THE TREASURY
INTERNAL REVENUE SERVICE
WASHINGTON, D.C. 20224:
COMMISSIONER:
January 14, 2004:
Mr. James R. White
Director, Tax Issues
Strategic Issues Team
United States General Accounting Office
Washington, D.C. 20548:
Dear Mr. White:
I reviewed your report entitled, "Planning For IRS's Enforcement
Process Changes Included Many Key Steps But Can Be Improved"
(GAO-04-287), and I appreciate your recognition of our efforts to reengineer
our processes. As your report states, we are currently involved in 15
enforcement process improvement projects including a redesign of the
Office Audit program and various Automated Collection System (ACS)
projects. We are also developing a strategy around a corporate Small
Business/Self Employed (SB/SE) ACS inventory that will provide us with
greater flexibility to work priority inventory. Your report focuses on
four of our efforts: Field Examination Reengineering, Compliance
Support Case Processing Redesign, Collection Taxpayer Delinquent
Account Project, and the Collection Field Function Consultative
Initiative. We expect that all of these projects will improve our
overall efficiency and effectiveness as well as allow us to provide
better customer service and maximize our limited resources.
I agree with your recommendation to formalize a framework for future
improvement projects. At the standup of our new organization, we
recognized the need for extensive redesign and process improvements.
Not only were we looking to implement long-range changes to be made in
conjunction with our modernization efforts, but for improvements that
could be made immediately, working within the current enterprise
structures. To ensure we utilized the best methodologies for our
reengineering efforts, we engaged the services of a number of private
firms with extensive experience and recognition as leaders in this
field. We linked these organizations with key executives in SB/SE to
transfer expertise. Using "lessons learned" and the knowledge we now
have from working with these contractors, we are in a position to
create a framework for future reengineering efforts.
Your second recommendation, to invest in enforcement productivity data
that better adjusts for complexity and quality, involves taking many of
our existing measures, or newly created measures, and combining them
into one comprehensive measurement. While I agree in principle that
additional enforcement program productivity data may be beneficial, I
am not certain that existing statistical methods or software can
readily capture the amount of information and data required for this
purpose. The type of data and decision analytics software required to
conduct the type of analysis you are suggesting would be a huge and
very costly undertaking. However, we will continue to explore the
recommendation for use of statistical methods and ensure that these
methods are included in our modernization projects. We recognize the
need for these methods to combine robust outputs, inputs, quality, and
complexity to derive productivity indexes to enhance our effectiveness.
There are several initiatives currently in progress that will enhance
our program management and monitoring over the next fiscal year. For
example, an effort is underway to evaluate our workload delivery and
identification methods to enhance efficiencies and effectiveness. In
addition, in our examination program, we are implementing the use of
the Examination Operational Automation Database, as a system designed
to track data from examination adjustments by issue. This data will be
used to enhance the ability to identify specific areas of noncompliance
based on examination results.
As we implement new processes, we will continue to look at our current
measures. Our measures must be consistent with our balanced measures
approach which considers customer satisfaction and employee
satisfaction, as well as business results. Your report reflects
concerns that we do not specify performance goals for employees.
Specific employee goals are not used to ensure we are not violating the
legislative mandates in the Taxpayer Bill of Rights and the Internal
Revenue Service Restructuring and Reform Act of 1998, which placed
limitations on the types of enforcement data we can use to evaluate
employees. Instead, we set performance goals for our programs, not
individual employees. These goals provide specific information that is
valid down to the Area and certain Territory offices and allows us to
closely monitor business results.
I would also like to comment on the individual reengineering projects
reviewed in this audit. Your report recognizes our efforts to put the
most productive collection work in the hands of our revenue officers.
The successes with the Taxpayer Delinquent Account Project led to
additional modeling for Taxpayer Delinquent Investigations inventory.
As part of this modeling effort we included a research plan that will
monitor the results of the models, evaluate what the models produce,
how they perform and enable us to make informed decisions on updates as
well as refinements to the models. These modeling efforts will allow us
to identify the most productive work and make the best use of our
limited field and ACS resources.
The Collection Field Function (CFf) Consultation Initiative Project
improved our ability to contact and follow up with taxpayers at an
earlier point in the collection process. Groups testing this process
demonstrated a 6.8 percent greater reduction in the percentage of aged
inventory from Fiscal Year 2002 to Fiscal Year 2003 than did control
groups in the same Area. As part of our assessment of this project we
conducted Customer Satisfaction surveys. Based upon opinions expressed
in these surveys we believe this effort will improve satisfaction of
taxpayers as well as enhance CFf productivity. Test groups also
demonstrated a 13 percent greater improvement in
dispositions per revenue officer from Fiscal Year 2002 to Fiscal Year
2003 than did control groups. The initiative also enabled CFf group
managers to become more engaged with employees and cases earlier in the
collection process, which will improve work quality by mandating a
discussion of each case soon after initial contact with the taxpayer.
An additional benefit of the project has been to identify and institute
improvements to the automated Entity Case Management Information system
that will enhance the CFf group manager's ability to monitor in-process
collection workload through better functionality.
With respect to our efforts in the Field Examination Reengineering
Project, these process changes are pervasive and will impact every
examination case and employee. In the pilot sites we realized
improvements in business results, customer satisfaction, and employee
satisfaction. Once the process changes are implemented in all Areas, we
expect to see the same improvements nationally. As part of our annual
planning process we used these anticipated improvements to develop our
Fiscal Year examination plan.
Finally, our Case Processing Redesign will help us cope with increasing
workloads by rapidly consolidating operations and standardizing
processes. Through the redesign, we will establish consistent quality,
improve service to the taxpaying public, and save resources. Once the
redesign is fully implemented, a 300 to 400 Full Time Equivalent (FTE)
savings is projected. These FTEs will be used to enhance field
compliance and enforcement activities.
We expect that all of these efforts will have a positive effect on our
efficiency and effectiveness and will enable us to provide better
customer service. If you have any questions, please contact me or Tom
Hull, Director, Compliance, SB/SE at:
(202) 283-2180.
Sincerely,
Signed for:
Mark W. Everson:
[End of section]
(440159):
FOOTNOTES
[1] U.S. General Accounting Office, Compliance and Collection:
Challenges for IRS in Reversing Trends and Implementing New
Initiatives, GAO-03-732T (Washington, D.C.: May 7, 2003) and U.S.
General Accounting Office, IRS Modernization: Continued Progress
Necessary for Improving Service to Taxpayers and Ensuring Compliance,
GAO-03-796T (Washington, D.C.: May 20, 2003).
[2] P.L. 105-206.
[3] U.S. General Accounting Office, Tax Administration: Impact of
Compliance and Collection Program Declines on Taxpayers, GAO-02-674
(Washington, D.C.: May 22, 2002).
[4] U.S. General Accounting Office, Business Process Reengineering
Assessment Guide, GAO/AIMD-10.1.15 (Washington, D.C.: April 1997).
[5] The Private Sector Council is a nonprofit, nonpartisan public
service organization created to help the federal government improve its
efficiency, management, and productivity through cooperative sharing of
knowledge between the public and private sectors. It is comprised of
member companies--businesses from across North America in industries
such as telecommunications, defense, finance, and energy. Corporate
executives from member companies provide their time and expertise at no
cost to the government.
[6] GAO/AIMD-10.1.15.
[7] GAO/AIMD-10.1.15.
[8] U.S. General Accounting Office, Major Management Challenges and
Program Risks: A Governmentwide Perspective, GAO-03-95 (Washington,
D.C.: January 2003).
[9] GAO/AIMD-10.1.15.
[10] GAO-02-674.
[11] U.S. General Accounting Office, Tax Administration: IRS Should
Continue to Expand Reporting on Its Enforcement Efforts, GAO-03-378
(Washington, D.C.: Jan. 31, 2003).
[12] GAO-02-674.
[13] Field examinations are the most complex audits and are done at the
taxpayer's location. In an office examination, the taxpayer comes to an
IRS office with his or her records and meets with an auditor.
[14] IRS employs a number of means to collect overdue taxes from
taxpayers who owe them. These include letters, phone calls, office
appointments, and visits to business locations.
[15] GAO/AIMD-10.1.15.
[16] GAO/AIMD-10.1.15.
[17] We did not evaluate the accuracy or consistency of the data that
IRS used to measure these quantities, or whether the specific outputs
and inputs that IRS chose for its productivity measures were the most
appropriate. These evaluations were beyond the scope of our report.
However, for a discussion of the issues involved in choosing output and
input indicators for productivity measures, see app. I.
[18] SB/SE staff have general schedule grades, with higher grades
equating to more education and/or experience and to more complex job
responsibilities.
[19] The Internal Revenue Manual describes standards for both the
collection and examination system. Standards for collection
requirements include determining if the casework isolated the right
issues at the right time, the right actions were taken timely and
efficiently, the right legal procedures were followed, and the case was
closed correctly. The exam system uses eight standards to define
quality, each defined by elements representing components that are
present in a quality examination. Each exam is scored on each of the
eight standards and the total score is the sum of the scores for each
standard.
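The scoring scheme just described can be sketched in a few lines. This is
an illustration only: the standard names and point values below are
placeholders, not the actual Internal Revenue Manual definitions; what it
shows is the stated rule that each exam is scored on eight standards and
the total is the sum of the eight scores.

```python
# Hypothetical sketch of the exam quality scoring scheme: each
# examination is scored on eight standards, and the total quality score
# is the sum of the per-standard scores. Standard names are placeholders.

NUM_STANDARDS = 8

def exam_quality_score(standard_scores):
    """Return the total score: the sum across the eight standards."""
    if len(standard_scores) != NUM_STANDARDS:
        raise ValueError("an exam is scored on exactly eight standards")
    return sum(standard_scores.values())

# Hypothetical exam awarded 1 point on each standard.
sample_exam = {f"standard_{i}": 1 for i in range(1, NUM_STANDARDS + 1)}
print(exam_quality_score(sample_exam))  # 8
```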
[20] The statistical methods use data on outputs, inputs, quality, and
complexity to derive a composite productivity index. This index can be
further analyzed (or "decomposed") to adjust for the effect of factors
like the IRS Restructuring and Reform Act of 1998. Examples of these
methods include stochastic frontier and data envelopment analysis.
These methods have been applied extensively in both the public and
private sectors. For a survey of studies, see L. Seiford, "Data
Envelopment Analysis: The Evolution of the State of the Art, 1978-
1995," Journal of Productivity Analysis, 1996, 7, pp. 99-137. For an
example of an application of these methods to the banking industry, see
D. Wheelock and P. Wilson, "Technological Progress, Inefficiency, and
Productivity Change in U.S. Banking, 1984-1993," Journal of Money,
Credit and Banking, 31(2), May 1999, pp. 212-234. For an application to
public school productivity measurement, see J. Ruggiero and D.
Vitaliano, "Assessing the Efficiency of Public Schools Using Data
Envelopment Analysis and Frontier Regression," Contemporary Economic
Policy, July 1999, 17(3), pp. 321-31.
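The effect of the adjustment footnote 20 describes can be illustrated
without specialized software. The sketch below uses entirely hypothetical
case counts, complexity weights, quality scores, and FTE figures; it
shows how weighting outputs by complexity and quality can reverse the
apparent direction of a raw cases-per-FTE trend. Stochastic frontier and
data envelopment analysis are the more rigorous ways to derive such a
composite index.

```python
# Illustrative (hypothetical data): a complexity- and quality-adjusted
# productivity index. Raw cases closed per FTE can fall even as real
# productivity rises, if the case mix shifts toward more complex work.

def adjusted_index(cases, fte):
    """Weight each group of closed cases by its complexity and quality
    score, then divide the weighted output by staff input (FTEs)."""
    weighted_output = sum(c["count"] * c["complexity"] * c["quality"]
                          for c in cases)
    return weighted_output / fte

# Year 1: mostly simple cases.
year1 = [{"count": 800, "complexity": 1.0, "quality": 0.90},
         {"count": 200, "complexity": 2.5, "quality": 0.85}]
# Year 2: fewer total closures, but a more complex case mix.
year2 = [{"count": 500, "complexity": 1.0, "quality": 0.90},
         {"count": 400, "complexity": 2.5, "quality": 0.88}]

raw1, raw2 = 1000 / 100, 900 / 100  # unadjusted cases per FTE
adj1 = adjusted_index(year1, fte=100)
adj2 = adjusted_index(year2, fte=100)

print(f"raw: {raw1:.2f} -> {raw2:.2f}")       # raw index falls
print(f"adjusted: {adj1:.2f} -> {adj2:.2f}")  # adjusted index rises
```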
[21] GAO/AIMD-10.1.15.
[22] IRS Restructuring and Reform Act of 1998, section 1204, 26 U.S.C.
§7804 note (2000).
[23] U.S. General Accounting Office, Performance Management Systems:
IRS's Systems for Frontline Employees and Managers Align with Strategic
Goals but Improvements Can Be Made, GAO-02-804 (Washington, D.C.: July
12, 2002).
[24] GAO/AIMD-10.1.15.
[25] U.S. General Accounting Office, Human Capital: A Guide for
Assessing Strategic Training and Development Efforts in the Federal
Government, GAO-03-893G (Washington, D.C.: July 2003); U.S. General
Accounting Office, Homeland Security: Management Challenges Facing
Federal Leadership, GAO-03-260 (Washington, D.C.: Dec. 20, 2002), p.
60; U.S. General Accounting Office, Information Technology: A Framework
for Assessing and Improving Enterprise Architecture Management (Version
1.1), GAO-03-584G (Washington, D.C.; April 2003).
[26] The Private Sector Council is a nonprofit, nonpartisan public
service organization created to help the federal government improve its
efficiency, management, and productivity through cooperative sharing of
knowledge between the public and private sectors. It is comprised of
member companies--businesses from across North America in industries
such as telecommunications, defense, finance, and energy. Corporate
executives from member companies provide their time and expertise at no
cost to the government.
GAO's Mission:
The General Accounting Office, the investigative arm of Congress,
exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability
of the federal government for the American people. GAO examines the use
of public funds; evaluates federal programs and policies; and provides
analyses, recommendations, and other assistance to help Congress make
informed oversight, policy, and funding decisions. GAO's commitment to
good government is reflected in its core values of accountability,
integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through the Internet. GAO's Web site ( www.gao.gov ) contains
abstracts and full-text files of current reports and testimony and an
expanding archive of older products. The Web site features a search
engine to help you locate documents using key words and phrases. You
can print these documents in their entirety, including charts and other
graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document
files. To have GAO e-mail this list to you every afternoon, go to
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order
GAO Products" heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. General Accounting Office
441 G Street NW,
Room LM,
Washington, D.C. 20548:
To order by Phone:
Voice: (202) 512-6000:
TDD: (202) 512-2537:
Fax: (202) 512-6061:
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470:
Public Affairs:
Jeff Nelligan, Managing Director, NelliganJ@gao.gov, (202) 512-4800
U.S. General Accounting Office, 441 G Street NW, Room 7149,
Washington, D.C. 20548: