This is the accessible text file for GAO report number GAO-07-521
entitled 'Vocational Rehabilitation: Improved Information and Practices
May Enhance State Agency Earnings Outcomes for SSA Beneficiaries' which
was released on May 23, 2007.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
United States Government Accountability Office:
GAO:
May 2007:
Report to Congressional Requesters:
Vocational Rehabilitation:
Improved Information and Practices May Enhance State Agency Earnings
Outcomes for SSA Beneficiaries:
GAO-07-521:
GAO Highlights:
Highlights of GAO-07-521, a report to congressional requesters
Why GAO Did This Study:
State vocational rehabilitation (VR) agencies, under the Department of
Education (Education), play a crucial role in helping individuals with
disabilities prepare for and obtain employment, including individuals
receiving disability benefits from the Social Security Administration
(SSA). In a prior report (GAO-05-865), GAO found that state VR agencies
varied in the rates of employment achieved for SSA beneficiaries. To
help understand this variation, this report analyzed SSA and Education
data and surveyed state agencies to determine the extent to which (1)
agencies varied in earnings outcomes over time; (2) differences in
state economic conditions, client demographic traits, and agency
strategies could account for agency performance; and (3) Education's
data could be used to identify factors that account for differences in
individual earnings outcomes.
What GAO Found:
Our analysis of data on state agency outcomes for SSA beneficiaries
completing VR found that state agencies varied widely across different
outcome measures for the years of our review. For example, from 2001 to
2003 average annual earnings levels among those SSA beneficiaries with
earnings during the year after completing VR varied across state
agencies from about $1,500 to nearly $17,000.
Figure: Distribution of State Agency Average Annual Earnings for SSA
Beneficiaries with Earnings during the Year after VR:
[See PDF for Image]
Source: GAO analysis of SSA data.
Note: Earnings are in 2004 dollars.
[End of figure]
After controlling for a range of factors, we found that much of the
differences in state agency earnings outcomes could be explained by
state economic conditions and the characteristics of the agencies'
clientele. Together state unemployment rates and per capita income
levels accounted for roughly one-third of the differences between state
agencies in the proportion of SSA beneficiaries that had earnings
during the year after VR. The demographic profile of SSA clients being
served at an agency--such as the proportion of women beneficiaries--also
accounted for some of the variation in earnings outcomes.
We also found that after controlling for other factors, a few agency
practices appeared to yield positive earnings results. For example,
state agencies with a higher proportion of state-certified counselors
had more SSA beneficiaries with earnings during the year after
completing VR.
However, we were unable to determine what factors might account for
differences in earnings outcomes at the individual level. This was due
in part to Education's data, which lacked information on important
factors that research has linked to work outcomes, such as detailed
data on the severity of clients' disabilities. Although Education
collects extensive client-level data, some key data are self-reported
and not always verified by state agencies.
What GAO Recommends:
GAO recommends that Education promote certain promising practices
identified in our analysis, reassess the data it collects on clients,
and consider economic factors when measuring state agency performance.
Education generally agreed with our recommendations, but disagreed that
economic factors should be incorporated into performance measures. It
considers these factors during monitoring and believes its approach to
be effective. We maintain that these factors are critical to measuring
agencies' relative performance.
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-521].
To view the full product, including the scope and methodology, click on
the link above. For more information, contact Denise Fantone at (202)
512-4997 or fantoned@gao.gov.
[End of section]
Contents:
Letter:
Results in Brief:
Background:
State VR Agencies Consistently Showed Very Different Rates of Success
for SSA Beneficiaries Who Completed VR Programs:
State Economic Conditions and SSA Beneficiary Characteristics Account
for Much of the Difference in State VR Agency Success Rates:
A Few Agency Practices Appeared to Yield Better Earnings Outcomes,
while the Results of Other Practices Were Inconclusive:
Limitations in Education's Data May Have Hampered Analyses of
Individual Earnings Outcomes:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Scope and Methodology:
Section 1: Data Used, Information Sources, and Data Reliability:
Section 2: Study Population and Descriptive Analyses:
Section 3: Econometric Analyses:
Section 4: Limitations of Our Analyses:
Appendix II: Comments from the Department of Education:
Appendix III: Comments from the Social Security Administration:
Appendix IV: GAO Contacts and Staff Acknowledgments:
Related GAO Products:
Tables:
Table 1: Explanatory Variables from the TRF Subfile:
Table 2: Explanatory Variables from Education's RSA-2 Data:
Table 3: State Economic and Demographic Explanatory Variables and Their
Sources:
Table 4: Explanatory Variables from the VR Agency Survey Data:
Table 5: Dependent Variables Used in the Analyses:
Table 6: Coefficients for Multivariate Models Estimating the Effects of
State and Agency Characteristics on Three VR Outcomes, and the
Proportion of Variance Explained (R-Squared) by Each Model:
Figures:
Figure 1: Distribution of State VR Agencies by Percentage of SSA
Beneficiaries with Earnings during the Year after VR:
Figure 2: Distribution of State VR Agency Average Annual Earnings for
SSA Beneficiaries with Earnings during the Year after VR:
Figure 3: Distribution of State VR Agencies by Percentage of SSA
Beneficiaries Leaving the Rolls:
Figure 4: Range across State VR Agencies of the Percentage of SSA
Beneficiaries with Earnings during the Year after VR by Year:
Figure 5: Range of State VR Agency Average Earnings for SSA
Beneficiaries by Year:
Figure 6: Range across State VR Agencies of the Percentage of SSA
Beneficiaries with Earnings during the Year after VR by Agency Type:
Figure 7: Range of State VR Agency Average Earnings for SSA
Beneficiaries by Agency Type:
Figure 8: Range of State VR Agency Average Rates of SSA Beneficiaries
Leaving the Rolls by Agency Type:
Abbreviations:
CPI-U: Consumer Price Index for All Urban Consumers:
CSPD: Comprehensive System of Personnel Development:
DI: Disability Insurance:
GSP: gross state product:
IPE: individual plan of employment:
MEF: Master Earnings File:
OLS: ordinary least squares:
SSA: Social Security Administration:
SSI: Supplemental Security Income:
TRF: Ticket Research File:
VR: vocational rehabilitation:
WIA: Workforce Investment Act:
United States Government Accountability Office:
Washington, DC 20548:
May 23, 2007:
The Honorable Charles B. Rangel:
Chairman:
The Honorable Jim McCrery:
Ranking Minority Member:
Committee on Ways and Means:
House of Representatives:
The Honorable Michael R. McNulty:
Chairman:
The Honorable Sam Johnson:
Ranking Minority Member:
Subcommittee on Social Security:
Committee on Ways and Means:
House of Representatives:
The Honorable Sander M. Levin:
House of Representatives:
State vocational rehabilitation (VR) agencies, under the auspices of
the Department of Education (Education), play a crucial role in helping
individuals with disabilities prepare for and obtain employment. In
fiscal year 2005, state VR agencies received $2.6 billion to provide
people with disabilities a variety of supports such as job counseling
and placement, diagnosis and treatment of impairments, vocational
training, and postsecondary education. The VR program serves about 1.2
million people each year, and over a quarter of those who complete VR
are beneficiaries of the Disability Insurance (DI) or Supplemental
Security Income (SSI) programs administered by the Social Security
Administration (SSA). This proportion has increased steadily since
2002. As our society ages, the number of SSA disability beneficiaries
is expected to grow, along with the cost of providing SSA disability
benefits, and it will be increasingly important to manage this growth
by optimizing the ability of VR programs to help and encourage SSA
beneficiaries to participate in the workforce.
In 2005, GAO reported that state VR agencies varied substantially in
terms of the employment rates they achieved for their clients,
particularly for SSA beneficiaries who, according to research, attain
lower employment and earnings outcomes than other VR clients.[Footnote
1] Depending on the state agency, as many as 68 percent and as few as 9
percent of SSA beneficiaries exited VR with employment. In addition,
GAO found that Education's management of the VR program was lacking in
several respects and recommended that Education revise its performance
measures to account for economic differences between states, make
better use of incentives for state VR agencies to meet performance
goals, and create a means for disseminating best practices among state
VR agencies. Education agreed with these recommendations but has yet to
implement them.
As a follow-up to our 2005 report, you asked us to determine what may
account for the wide variations in state VR agency outcomes with
respect to SSA beneficiaries. Therefore, we examined the extent to
which (1) differences in VR agency outcomes for SSA beneficiaries
continued over several years and across different outcome measures, (2)
differences in VR agency outcomes were explained by state economies and
demographic traits of the clientele served, (3) differences in VR
agency outcomes were explained by specific policies and strategies of
the VR agencies, and (4) Education's data allowed for an analysis of
factors that account for differences in individual-level (as opposed to
agency-level) outcomes.
To perform our work, we used several data sources: (1) a newly
available longitudinal dataset that includes administrative data from
Education and SSA on SSA beneficiaries who completed the VR program
between 2001 and 2003,[Footnote 2] (2) original survey data collected
by GAO from 78 of the 80 state VR agencies, (3) data from Education on
yearly spending information by service category for each VR agency, and
(4) data from the Census Bureau, Bureau of Labor Statistics, and other
data sources regarding state demographic and economic characteristics.
We conducted reliability assessments of these data and found them to be
sufficiently reliable for our analyses.
We took several steps to analyze these data. To answer our questions,
we analyzed outcomes by state agency using three different earnings
outcomes: (1) the percentage of beneficiaries with earnings during the
year after VR, (2) the average beneficiary's annual earnings level
during the year after VR, and (3) the percentage of beneficiaries that
left the disability rolls by the close of 2005.[Footnote 3] For
objective one, we conducted descriptive statistical analyses of the
data. For objectives two, three, and four, we conducted econometric
analyses that controlled for a variety of explanatory factors.[Footnote
4] We also identified and interviewed academic and agency experts in an
effort to determine what variables to include in our models. As is the
case with most statistical analyses, our work was limited by certain
factors, such as the unavailability of certain information and the
inability to control for unobservable characteristics and those that
are not quantifiable. Our results only describe earnings outcomes of
SSA beneficiaries included in our study and cannot be generalized
beyond that population. We conducted our review from December 2005
through April 2007 in accordance with generally accepted government
auditing standards. See appendix I for a more detailed description of
our scope and methods.
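The agency-level econometric approach described above can be illustrated with a minimal ordinary least squares (OLS) sketch. The data below are hypothetical, and the report's actual models (see appendix I) controlled for many more explanatory variables than this single regressor:

```python
# Illustrative sketch only: a one-variable least-squares fit of the kind
# used (with many more explanatory variables) in the report's agency-level
# econometric models. All data below are hypothetical.

def ols_simple(x, y):
    """Fit y = a + b*x by ordinary least squares; return (a, b, r_squared)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    b = sxy / sxx                       # slope
    a = my - b * mx                     # intercept
    r_squared = sxy ** 2 / (sxx * syy)  # share of variance explained
    return a, b, r_squared

# Hypothetical agency-level observations: state unemployment rate (%)
# vs. percent of SSA beneficiaries with earnings in the year after VR.
unemployment = [4.0, 5.0, 6.0, 7.0, 8.0]
pct_with_earnings = [60.0, 52.0, 50.0, 41.0, 37.0]

intercept, slope, r2 = ols_simple(unemployment, pct_with_earnings)
print(f"slope={slope:.2f}, R^2={r2:.2f}")  # prints: slope=-5.70, R^2=0.97
```

In this toy fit, the negative slope mirrors the report's finding that agencies in states with higher unemployment had fewer beneficiaries with earnings; in the report's multivariate models, unemployment and per capita income together explained roughly one-third of the variation across agencies.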
Results in Brief:
When we analyzed state agency outcomes for SSA beneficiaries who
completed VR between 2001 and 2003, we found that differences in agency
outcomes continued over several years and across several outcome
measures--i.e., rates of beneficiaries with earnings, earnings levels,
and departures from the disability rolls. The proportion of
beneficiaries with earnings during the year after their completion of
the VR program ranged from as low as 0 percent in one state agency to
as high as 75 percent in another. Similarly, average annual earnings
levels among those SSA beneficiaries with earnings varied across state
agencies from about $1,500 to nearly $17,000 in the year following VR.
Additionally, the proportion of SSA beneficiaries who left the
disability rolls varied greatly among agencies, with departure rates
ranging anywhere from 0 to 20 percent.
After controlling for certain economic, demographic, and agency
factors, we found that state economic conditions and the
characteristics of agencies' clientele accounted for much of the
differences in average earnings outcomes across state agencies.
Specifically, state unemployment rates and state per capita income
levels accounted for a substantial portion--as much as one-third--of
the differences between state agencies' VR outcomes for SSA
beneficiaries. For example, significantly fewer SSA beneficiaries had
earnings during the year after VR in those states with higher
unemployment rates and lower per capita incomes. Despite the
significant effect that state economies have on state agency outcomes,
Education currently does not consider such factors when analyzing state
agency outcomes and assessing their performance. Variations in the
demographic profile of SSA client populations also accounted for some
of the differences in earnings outcomes among agencies. For example,
state VR agencies serving a higher percentage of women beneficiaries
had significantly fewer SSA clients with earnings during the year after
VR.
We also found, after controlling for the same factors, that a few
agency practices helped explain differences in state agency outcomes
for SSA beneficiaries--and some were associated with positive outcomes.
For example, agencies with a higher proportion of state-certified VR
counselors--a certification now mandated by Education--had more SSA
beneficiaries exiting the VR program with earnings. Further, agencies
with closer ties to the business community also achieved higher average
annual earnings for SSA beneficiaries and higher rates of departures
from the disability rolls. Currently, Education promotes ties to the
business community through an employer network. Our findings also show
that agencies that received a greater degree of support and cooperation
from other public programs or that spent a greater proportion of their
service expenditures on training of VR clients had higher average
annual earnings for SSA beneficiaries completing VR.
We were unable to account for differences in individual beneficiary
outcomes, which might further explain differences in state agency
outcomes, in part because of limitations in Education's data. Our
statistical models were able to explain a greater percentage of the
differences in earnings outcomes when we analyzed state agency earnings
outcomes compared to individual earnings outcomes (i.e., as much as 77
percent compared to 8 percent). With so little variation explained by
our analyses of individual-level outcomes, we decided not to report our
individual-level analyses. Education's data lack information that we
believe is critical to assessing earnings outcomes, and this may have
hindered our ability to explain the variation in individual earnings
outcomes. Specifically, although Education collects extensive client-
level data, it does not systematically collect data that research has
linked to work outcomes, such as detailed information on the severity
of the client's disability--data that some state agencies independently
collect for program purposes. Knowing the severity of a disability can
indicate whether a person is physically or mentally limited in his or
her ability to perform work, a fact that may influence the person's
earnings outcomes. Further, other key data are self-reported and may
not be verified by state agencies.
We are recommending that Education consider the implications of the
results of our analyses in its management of the VR program.
Specifically, Education should further promote certain agency practices
that we found show an effect on state agency outcomes and reassess the
client-level data it collects through its state agencies. We also
continue to believe that, as we recommended in our 2005 report,
Education should consider economic factors, such as unemployment rates,
when evaluating state agency performance.
We received written comments on a draft of this report from Education
and SSA. While Education generally agreed with the substance of our
recommendations, it disagreed on when economic conditions and state
demographics should be considered in assessing performance. Instead of
using this information to help set performance measures, the department
said that it takes these factors into account when it monitors agency
performance results and believes that its approach is more effective.
We continue to believe that incorporating this contextual information
in assessing performance measures is essential to provide the state
agencies with a more accurate picture of their relative performance.
Although Education stated that it was open to our recommendation on
improving data quality, it suggested that validating self-reported
information would be a potential burden to state agencies and suggested
other approaches, such as conducting periodic studies. Our
recommendation that Education explore cost-effective ways to validate
self-reported data was based on the experience of some VR agencies that
have obtained data successfully from official sources and not relied
solely on self-reported information.
SSA stated that our report has methodological flaws that introduced
aggregation bias and false correlations, and suggested that we should
have focused on individual-level analysis or reported the results of
both individual and aggregate-level analysis. We used aggregated
data--a widely used means of analysis--because our primary objective was to
understand better the wide variation in outcomes for state VR agencies
that serve SSA beneficiaries rather than the outcomes for individuals.
We used appropriate statistical techniques to guard against bias and
false correlations. Both Education and SSA provided additional
comments, which we have addressed or incorporated, as appropriate.
Education's and SSA's comments are reprinted in appendixes II and III
respectively, along with our detailed responses.
Background:
Challenges Facing the Social Security Disability Program:
In 2005, the Social Security Administration provided income support to
more than 10 million working age people with disabilities. This income
support is provided in the form of monthly cash benefits under two
programs administered by the Social Security Administration--the
Disability Insurance program and the Supplemental Security Income
program. Some individuals, known as concurrent beneficiaries, qualify
for both programs. The federal government's cost of providing these
benefits was almost $101 billion in 2005.
Over the last decade, the number of disability beneficiaries has
increased, as has the cost of both the SSI and DI programs. This
growth, in part, prompted GAO in 2003 to designate modernizing federal
disability programs as a high-risk area--one that requires attention
and transformation to ensure that programs function in the most
economical, efficient, and effective manner possible. GAO's work found
that federal disability programs were not well positioned to provide
meaningful and timely support for Americans with disabilities. For
example, despite advances in technology and the growing expectations
that people with disabilities can and want to work, SSA's disability
programs remain grounded in an outmoded approach that equates
disability with incapacity to work. In 1999, GAO testified that even
relatively small improvements in return-to-work outcomes offer the
potential for significant savings in program outlays. GAO estimated
that if an additional 1 percent of working age SSA disability
beneficiaries were to leave the disability rolls as a result of
returning to work, lifetime cash benefits would be reduced by an
estimated $3 billion.
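The savings estimate above can be checked with simple arithmetic. The beneficiary count and the roughly $3 billion figure come from the report; the implied per-person amount is an illustrative inference, not a figure the report states:

```python
# Back-of-envelope check of the figures cited above. The beneficiary count
# and the $3 billion estimate come from the report; the implied per-person
# figure is an illustrative inference, not a number the report states.

working_age_beneficiaries = 10_000_000          # report's approximate 2005 figure
additional_leavers = working_age_beneficiaries // 100  # the 1 percent scenario
lifetime_savings = 3_000_000_000                # roughly $3 billion, per the report

implied_per_person = lifetime_savings / additional_leavers
print(f"{additional_leavers:,} leavers -> ${implied_per_person:,.0f} each")
# prints: 100,000 leavers -> $30,000 each
```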
SSA has had a long-standing relationship with Education's VR program,
whereby SSA may refer beneficiaries to the VR program for assistance in
achieving employment and economic independence.[Footnote 5] As part of
this relationship, SSA reimburses VR state agencies for the cost of
providing services to beneficiaries who meet SSA's criteria for
successful rehabilitation (i.e., earnings at the substantial gainful
activity level for a continuous 9-month period). To further motivate
beneficiaries to seek VR assistance and expand the network of VR
providers, Congress enacted legislation in 1999 that created SSA's
Ticket to Work (Ticket) Program.[Footnote 6] Administered by SSA, the
Ticket program was intended to (1) increase the number of beneficiaries
participating in VR by removing disincentives to work, and (2) expand
the availability of VR services to include private VR providers. Under
the Ticket program, beneficiaries receive a document, known as a
ticket, which can be used to obtain VR and employment services from an
approved provider such as a state VR agency. Thus far, only a small
fraction of SSA beneficiaries have used the program to obtain VR
services, and private VR providers have not participated heavily: over
90 percent of SSA beneficiaries participating in the Ticket program
still receive services from state VR agencies.
Despite programs such as Ticket, SSA beneficiaries who wish to
participate in the workforce still face multiple challenges. As we have
previously reported, some SSA beneficiaries will not be able to return
to work because of the severity of their disability.[Footnote 7] But
those who do return to work may face other obstacles that potentially
deter or prevent them from leaving the disability rolls, such as (1)
the need for continued health care, (2) lack of access to assistive
technologies that could enhance their work potential, and (3)
transportation difficulties.
Description of Education's Vocational Rehabilitation Program:
The Vocational Rehabilitation Program is the primary federal government
program helping individuals with disabilities to prepare for and obtain
employment. Authorized by Title I of the Rehabilitation Act of 1973,
the VR program is administered by the Rehabilitation Services
Administration, a division of the Department of Education, in
partnership with the states. The Rehabilitation Act contains the
general provisions states should follow in providing VR services. Each
state and territory designates a single VR agency to administer the VR
program--except where state law authorizes a separate agency to
administer VR services for blind individuals. Twenty-four states have
two separate agencies, one that exclusively serves blind and visually
impaired individuals (known as blind agencies) and another that serves
individuals who are not blind or visually impaired (known as general
agencies). Twenty-six states, the District of Columbia, and five
territories have a single combined agency that serves both blind and
visually impaired individuals and individuals with other types of
impairments (known as combined agencies). In total, there are 80 state
VR agencies.[Footnote 8]
Although Education provides the majority of the funding for state VR
agencies, state agencies have significant latitude in the
administration of VR programs. Within the framework of legal
requirements, state agencies have adopted different policies and
approaches to achieve earnings outcomes for their clients. For example,
although all state VR agencies are required to have their VR counselors
meet Comprehensive System of Personnel Development (CSPD) standards,
states have the ability to define the CSPD certification standard for
their VR counselors. Specifically, under the CSPD states can establish
certification standards for VR counselors based on the degree standards
of the highest licensing, certification, or registration requirement in
the state, or based on the degree standards of the national
certification. For example, if an agency bases its certification
standard on the national standard, VR counselors are required to have a
master's degree in vocational counseling or another closely related
field, hold a certificate indicating they meet the national
requirement, or take certain graduate-level courses. Regardless of the
individual state's definition of the certification standard, research
has shown that VR agencies are concerned about meeting their needs for
state-certified counselors because many experienced VR counselors may
retire in the coming years, and a limited supply of qualified VR
counselors is entering the labor market.[Footnote 9]
VR agencies also vary in their locations within state government and
their operations. Some are housed in state departments of labor or
education, while others are free-standing agencies or commissions.
Similarly, while all VR agencies are partners in the state workforce
investment system, as mandated in the Workforce Investment Act (WIA) of
1998, VR agencies vary in the degree to which they coordinate with other
programs participating in this system.[Footnote 10] For example, some
VR agencies have staff colocated at WIA one-stop career centers, while others
do not.
By law, each of the 80 VR agencies is required to submit specific
information to Education regarding individuals who apply for, and are
eligible to receive, VR services. Some of the required information
includes (1) the types and costs of services the individuals received;
(2) demographic factors, such as impairment type, gender, age, race,
and ethnicity; and (3) income from work at the time of application to
the VR program. Education also collects additional information such as
(1) the weekly earnings and hours worked by employed individuals, (2)
public support received,[Footnote 11] (3) whether individuals sustained
employment for at least 90 days after receiving services,[Footnote 12]
and (4) summary information on agency expenditures in a number of
categories from each state VR agency.
Education also monitors the performance of state VR agencies, and since
2000, Education has used two standards for evaluating their
performance. One assesses the agencies' performance in assisting
individuals in obtaining, maintaining, or regaining high-quality
employment. The second assesses the agencies' performance in ensuring
that individuals from minority backgrounds have equal access to VR
services. Education also publishes performance indicators that
establish what constitutes minimum compliance with these performance
standards. Six performance indicators were published for the employment
standard, and one was published for the minority service standard. To
have passing performance, state VR agencies must meet or exceed
performance targets in four of the six categories for the first
standard, and meet or exceed the performance target for the second
standard.
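The pass/fail rule described above can be summarized in a short sketch. The function below is a hypothetical illustration of the logic only, not Education's actual implementation:

```python
def passing_performance(employment_indicators_met: int,
                        minority_indicator_met: bool) -> bool:
    """Hypothetical sketch of the compliance rule described above: an
    agency passes if it meets or exceeds targets on at least 4 of the 6
    employment indicators AND meets the minority service indicator."""
    return employment_indicators_met >= 4 and minority_indicator_met

# An agency meeting 5 employment targets and the minority target passes;
# one meeting only 3 employment targets does not.
print(passing_performance(5, True), passing_performance(3, True))
```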
In 2005, GAO reported that Education could improve performance of this
decentralized program through better performance measures and
monitoring.[Footnote 13] Specifically, we recommended that Education
account for additional factors such as the economies and demographics
of the states' populations in its performance measures, or its
performance targets, for individual state VR agencies to address these
issues. We also noted that whatever system of performance measures
Education chooses to use, without consequences or incentives to meet
performance standards, state VR agencies will have little reason to
achieve the targets Education has set for them. We recommended that
Education consider developing new consequences for failure to meet
required performance targets and incentives for encouraging good
performance. Although Education agreed with our recommendations, it has
not yet adopted them; it is considering them as part of the development
of its VR strategic performance plan.
Earlier this year, GAO reported on national-level earnings outcomes for
SSA beneficiaries who completed VR from 2000 to 2003.[Footnote 14]
Among other findings, this report estimated that as a result of work,
some DI and concurrent beneficiaries saw a reduction in their DI
benefits--for an overall annual average benefit reduction of $26.6
million in the year after completing VR compared to the year before VR.
Further, we reported that 10 percent of SSA beneficiaries who exited VR
in 2000 or 2001 were able to leave the disability rolls at some point.
However, almost one quarter of those who left had returned by 2005 for
at least 1 month.
State VR Agencies Consistently Showed Very Different Rates of Success
for SSA Beneficiaries Who Completed VR Programs:
Before controlling for factors that might explain differences in
outcomes among state VR agencies, our analysis of state agency outcomes
over a 3-year period showed very different rates of success for SSA
beneficiaries. This was the case in terms of the proportion of
beneficiaries with earnings, earnings levels, and departures from the
disability rolls. The wide range in average earnings outcomes among
agencies was generally consistent from 2001 through 2003 and within
each of the three types of agencies--referred to as blind, general, and
combined agencies.
Proportion with Earnings, Earnings Levels, and Departures from the
Disability Rolls for SSA Beneficiaries Differed Substantially among
State Agencies:
Between 2001 and 2003, VR agencies varied widely in outcomes for SSA
beneficiaries who completed their VR programs. On average across
agencies, 50 percent of beneficiaries had earnings during the year
following VR, but the proportion varied substantially among agencies,
from 0 to 75 percent. (See fig. 1.)
Figure 1: Distribution of State VR Agencies by Percentage of SSA
Beneficiaries with Earnings during the Year after VR:
[See PDF for image]
Source: GAO analysis of SSA data.
Note: n = 234, average = 50 percent. The 234 observations result from
78 VR agencies providing data for 3 years (2001 through 2003).
[End of figure]
Similarly, while the average across agencies of annual earnings for SSA
beneficiaries who had earnings was $8,140, agency averages ranged from
about $1,500 to nearly $17,000. (See fig. 2.)
Figure 2: Distribution of State VR Agency Average Annual Earnings for
SSA Beneficiaries with Earnings during the Year after VR:
[See PDF for image]
Source: GAO analysis of SSA data.
Note: n = 232, average = $8,140. The number in figure 2 differs from
that in figure 1 because two agencies did not have any beneficiaries
with reported earnings in fiscal year 2002. All earnings are in 2004
dollars.
[End of figure]
Agencies also differed in the proportion of SSA beneficiaries who had
left the disability rolls by 2005, with departure rates ranging from 0
to 20 percent. The average departure rate was 7
percent. (See fig. 3.)
Figure 3: Distribution of State VR Agencies by Percentage of SSA
Beneficiaries Leaving the Rolls:
[See PDF for image]
Source: GAO analysis of SSA data.
Note: n = 234, average = 7 percent.
[End of figure]
Trends Were Similar over Time and by Agency Type:
In general, the range of earnings outcomes across agencies was similar
over the 3 years we examined. While the average percentage of SSA
beneficiaries with earnings during the year after VR declined slightly
over this period from 53 percent in 2001 to 48 percent in 2003, the
spread in the percentage of beneficiaries with earnings remained widely
dispersed across agencies for all 3 years, as shown in figure 4.
Figure 4: Range across State VR Agencies of the Percentage of SSA
Beneficiaries with Earnings during the Year after VR by Year:
[See PDF for image]
Source: GAO analysis of SSA data.
[End of figure]
Likewise, the range of average earnings among agencies was similar for
all 3 years, as shown in figure 5.[Footnote 15]
Figure 5: Range of State VR Agency Average Earnings for SSA
Beneficiaries by Year:
[See PDF for image]
Source: GAO analysis of SSA data.
Note: Two agencies did not have any beneficiaries with reported
earnings in fiscal year 2002. All earnings are in 2004 dollars.
[End of figure]
There were also wide differences in performance within the three types
of agencies that serve different types of clientele--known as blind,
general, and combined agencies. Specifically, among blind agencies, the
percentage of SSA beneficiaries with earnings during the year after VR
ranged from 23 to 67 percent, with an average of 46 percent. Among
general agencies, the percentage of SSA beneficiaries with earnings
after VR varied from 37 to 74 percent, with an average of 55 percent,
and for combined agencies the percentage varied from 0 to 75 percent,
with an average of 49 percent. (See fig. 6.)
Figure 6: Range across State VR Agencies of the Percentage of SSA
Beneficiaries with Earnings during the Year after VR by Agency Type:
[See PDF for image]
Source: GAO analysis of SSA data.
[End of figure]
Average annual SSA client earnings among blind agencies varied the
most--from $4,582 to $16,805, with an average of $10,699 per year. SSA
client earnings among the combined agencies varied from $1,528 to
$10,889, with an average of $7,088 per year. General agencies showed
the least variation in earnings among their SSA clients--from $4,654 to
$9,424--but had the lowest average ($6,867). (See fig. 7.)
Figure 7: Range of State VR Agency Average Earnings for SSA
Beneficiaries by Agency Type:
[See PDF for image]
Source: GAO analysis of SSA data.
Note: Two combined agencies did not have any beneficiaries with
reported earnings in fiscal year 2002. All earnings are in 2004
dollars.
[End of figure]
Finally, for rates of departure from the SSA disability rolls by 2005,
blind agencies ranged from 0 to 16 percent, with an average of 6.7
percent; general agencies varied from 4 to 15 percent, with an average
of 7.5 percent; and combined agencies varied from 0 to 20 percent, with
an average of 7 percent. (See fig. 8.)
Figure 8: Range of State VR Agency Average Rates of SSA Beneficiaries
Leaving the Rolls by Agency Type:
[See PDF for image]
Source: GAO analysis of SSA data.
[End of figure]
State Economic Conditions and SSA Beneficiary Characteristics Account
for Much of the Difference in State VR Agency Success Rates:
After controlling for a range of factors, we found that much of the
differences in state VR agency success rates could be explained by
state economic climates and the characteristics of the SSA beneficiary
populations at the VR agencies. Specifically, among a range of possible
factors we considered, the economic conditions of the state appeared to
explain up to one-third of the differences between state agency
outcomes for SSA beneficiaries.[Footnote 16] Additionally, differences
in the characteristics of the clientele accounted for some of the
variation in performance among VR agencies.
Differences in Agency Outcomes Were Largely Due to a State's Economic
Conditions:
When we controlled for a variety of factors using multivariate
analysis, we found that state economic conditions accounted for a
substantial portion of the differences in VR outcomes across state
agencies. Not surprisingly, we found that fewer SSA beneficiaries had
earnings during the year after completing VR in states with high
unemployment rates after controlling for other factors. Moreover, our
analysis showed that for each 1 percent increase in the unemployment
rate, the percentage of SSA beneficiaries who had earnings during the
year after completing VR decreased by over 2 percent.[Footnote 17]
Across agencies, unemployment rates ranged from 2.3 to 12.3 percent
between 2001 and 2003, with an average of 4.7 percent.
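The relationship between unemployment rates and the share of beneficiaries with earnings comes from multivariate regression. The sketch below is a simplified, single-variable illustration of that kind of estimate using invented data; GAO's actual models controlled for many more factors, and all numbers here are fabricated for illustration:

```python
import numpy as np

# Invented agency-year data with a built-in slope of -2.0, mirroring the
# reported relationship: each 1-point rise in the unemployment rate is
# associated with roughly a 2-point drop in the percentage of SSA
# beneficiaries with earnings. These values are illustrative only.
rng = np.random.default_rng(0)
unemployment = rng.uniform(2.3, 12.3, size=234)                 # percent
pct_with_earnings = 60.0 - 2.0 * unemployment + rng.normal(0, 3, 234)

# Ordinary least squares fit:
#   pct_with_earnings = intercept + slope * unemployment
X = np.column_stack([np.ones_like(unemployment), unemployment])
coef, *_ = np.linalg.lstsq(X, pct_with_earnings, rcond=None)
intercept, slope = coef
print(slope)  # close to the built-in -2.0
```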
We also found that after controlling for other factors, VR agencies in
states with lower per capita incomes saw fewer SSA beneficiaries who
had earnings, lower earnings levels, and fewer departures from the
disability rolls in the year after VR. Across states, per capita
incomes ranged from approximately $4,400 to $46,000, with an average of
approximately $28,000. Together, state unemployment rates and per
capita incomes explained over one-third of the differences between
state agencies in the proportion of SSA beneficiaries that had earnings
during the year after VR and the proportion that left the
rolls.[Footnote 18]
Agency officials commented that difficult economic environments result
in lower earnings outcomes because a state's economy has a direct
impact on an agency's ability to find employment for individuals. Our
findings are also consistent with past research that has found labor
market conditions to be among the most influential determinants of
agency performance.[Footnote 19] Education, however, does not currently
consider state economic conditions when evaluating agency
performance.[Footnote 20] Although Education agreed with our prior
recommendation to consider economic and demographic characteristics
when evaluating agency performance, Education is currently considering
it as part of the development of its VR strategic performance plan and
has not yet adopted this recommendation.
Demographic Characteristics and the Types of Disabilities of Clientele
Also Accounted for Some of the Disparities in State Agency Performance:
After controlling for a variety of factors, certain characteristics of
the clientele served by state agencies accounted for some of the state
agency differences in earnings outcomes for SSA beneficiaries. The
factors we examined included demographic characteristics, types of
disabilities, and the proportion of SSA beneficiaries served by each
state agency.[Footnote 21]
Demographic Differences:
Several clientele characteristics influenced state agency earnings
outcomes.[Footnote 22] In particular, after controlling for other
factors, state agencies that served a higher proportion of women
beneficiaries had fewer beneficiaries with earnings during the year
after completing VR. According to our analysis, a 10 percent increase
in the percentage of women served by a VR agency resulted in a 5
percent decrease in the percentage of SSA beneficiaries with earnings.
Research shows that among low-income adults with disabilities, women
have lower employment rates than men.[Footnote 23]
Further, we found that after controlling for other factors, state
agencies serving a larger percentage of SSA beneficiaries between 46
and 55 years old when they applied for the VR program saw fewer SSA
beneficiaries leave the disability rolls.[Footnote 24] For every 10
percent increase in the percentage of beneficiaries in this age group,
the percentage of SSA beneficiaries leaving the rolls decreased by
approximately 1 percent.
Differences in Types of Disabilities:
When we considered the influence of various types of medical
impairments on earnings outcomes, we found that some state agency
outcomes were related to the proportion of SSA beneficiaries who had
mental or visual impairments. Average earnings and departures from the
disability rolls for SSA beneficiaries were lower in agencies that
served a larger percentage of individuals with mental impairments,
after controlling for other factors. Specifically, our analysis
indicated that a 10 percent increase in the proportion of the
beneficiary population with a mental impairment resulted in a decrease
of almost 1 percent in the proportion of SSA beneficiaries who left the
rolls. Some SSA beneficiaries may not leave the disability rolls
because, as research has shown, they fear a loss of their public
benefits or health coverage.[Footnote 25] This is particularly true for
individuals with mental impairments.
Agencies with a higher proportion of blind or visually impaired
beneficiaries had fewer departures from the disability rolls after
controlling for other factors. We found that a 10 percent increase in
the proportion of individuals with a visual impairment resulted in a
0.5 percent decrease in the proportion of beneficiaries leaving the
rolls. Some SSA
beneficiaries with visual impairments are classified as legally blind.
As such, they are subject to a higher earnings threshold than those who
are not legally blind before their benefits are reduced or ceased. Our
analysis also showed that holding other factors equal,
blind agencies--those serving only clientele with visual impairments--
had fewer SSA beneficiaries with earnings during the year after
completing VR than agencies that served a lower proportion of
beneficiaries with visual impairments.[Footnote 26]
Proportion of SSA Beneficiaries Served:
Differences in the proportion of SSA beneficiaries served by an agency
also affected earnings outcomes for SSA beneficiaries. Specifically,
agencies with a greater proportion of SSA beneficiaries had more
beneficiaries with earnings during the year after VR, but saw lower
earnings levels for their SSA beneficiaries, holding other factors
constant. VR state agency officials and experts with whom we consulted
were unable to provide an explanation for this result.[Footnote 27]
We also found that after controlling for other factors, agencies with a
higher proportion of SSA beneficiaries who were DI beneficiaries had
lower average annual earnings among SSA beneficiaries and a lower
percentage of beneficiaries leaving the rolls. The earnings result
might be explained by differences in the work incentive rules between
the two programs. Specifically, the work incentive rules are more
favorable for SSI beneficiaries who want to increase their earnings
while not incurring a net income penalty.[Footnote 28] The lower rates
of departures from the rolls among agencies with a greater proportion
of DI beneficiaries might be due to the limited time frames of our
study and the fact that DI beneficiaries are allowed to work for a
longer period of time before their benefits are ceased.[Footnote 29]
A Few Agency Practices Appeared to Yield Better Earnings Outcomes,
while the Results of Other Practices Were Inconclusive:
When we analyzed outcomes at the agency level, a few agency practices
appeared to yield some positive results, albeit in different ways.
Specifically, after controlling for other factors, we found that state
agencies with a higher proportion of state-certified VR counselors or
stronger relationships with businesses or other public agencies
appeared to have better earnings outcomes. Further, agencies that
devoted a greater proportion of their service expenditures to training
of VR clients had higher average annual earnings for SSA beneficiaries
completing VR, holding other factors equal. On the other hand, our
multivariate analyses suggest that agencies using in-house benefits
counselors saw fewer beneficiaries with earnings following VR, but
these results may not be conclusive because the benefits counseling
program has changed considerably since the time period of our study.
Agencies with State-Certified Counselors or Strong Relationships with
Businesses or Other Public Agencies Appeared to Have Better Earnings
Outcomes:
State VR agencies that reported employing a higher percentage of
counselors meeting the state certification standards had higher rates
of beneficiaries with earnings among those beneficiaries who completed
VR between 2001 and 2003, holding other factors constant. On average,
62 percent of counselors at an agency met their state's certification
requirements, but the range was from 0 to 100 percent. According to our
analysis, for every 10 percent increase in the percentage of counselors
meeting state requirements, the percentage of SSA beneficiaries with
earnings during the year after VR increased by 0.5 percent. This
appeared to be consistent with research indicating that more highly
qualified VR counselors are more likely to achieve successful earnings
outcomes.[Footnote 30] While the certification requirements vary by
state, agency officials reported that counselors with master's degrees
in vocational rehabilitation are more likely to be versed in the
history of the VR program and the disability rights movement and are
likely to be more attuned to the needs of their clients than those
without specialized degrees.
VR agencies that had stronger relationships with the business community
had higher average earnings among SSA beneficiaries during the year
after completing VR and higher rates of departures from the disability
rolls, holding other factors equal. These were agencies that reported
interacting with the business community more frequently by sponsoring
job fairs, hosting breakfasts, attending business network meetings,
meeting with local businesses, meeting with local chambers of commerce,
and interacting with civic clubs. To support these practices, Education
has helped establish the Vocational Rehabilitation Employer Business
and Development Network, which aims to connect the business community
to qualified workers with disabilities through the efforts of staff
located at each of the VR agencies who specialize in business
networking.[Footnote 31] VR agency officials with whom we spoke said
that through interaction with the business community, they could dispel
myths about the employability of people with disabilities, and they
could tailor services for their clients to the specific needs of
different businesses.
In addition to business outreach, our multivariate analysis indicated
that agencies that reported receiving a greater degree of support and
cooperation from more than one public program--such as from state
social services, mental health, and education departments--also showed
higher average earnings among SSA beneficiaries. One VR agency official
commented that people with disabilities need multiple supports and
services and therefore are more effectively served through partnerships
between government programs.[Footnote 32] Another VR official said that
coordination with other programs facilitated the provision of a
complete package of employment-related services. For example, VR might
provide employment training to an individual, while the department of
labor might provide transportation services to get the person to work.
Although many agencies said they were successful in coordinating with
other programs, some reported difficulties. For example, they cited
barriers to coordinating with WIA one-stops such as inability to share
credit for successful earnings outcomes, staff that are not trained to
serve people with disabilities, and inaccessible equipment,
particularly for those with visual or hearing impairments.
Agency Expenditures on Training Appeared to Yield Positive Outcomes:
Additionally, agencies with a greater proportion of their service
expenditures spent on training of VR clients--including postsecondary
education, job readiness and augmentative skills, and vocational and
occupational training--had higher average annual earnings for SSA
beneficiaries completing VR, holding other factors equal.[Footnote 33]
The average percentage of service expenditures devoted to training of
VR clients was 47 percent, but this ranged from 3 to 84 percent across
agencies. Research has shown that the receipt of certain types of
training services, such as business and vocational training, leads to
positive earnings outcomes.[Footnote 34]
Effect of Using In-house Benefits Counselors Is Unclear:
Our analysis suggests that after controlling for other factors,
agencies with in-house benefits counselors--counselors who advise VR
clients on the impact of employment on their benefits--had lower rates
of SSA beneficiaries with earnings during the year after completing VR
than agencies without them. Over the years we studied, only 14 percent
of state agencies reported using in-house benefits counselors. However,
this was a period of transition for the benefits counseling program.
There was wide variation in how this service was provided, and clients
in states that did not have on-site benefits counselors may have
received benefits counseling from outside the agency. According to one
researcher, the benefits counseling program has become more
standardized since that period. In fact, other empirical research shows
that benefits counselors have had a positive effect on
earnings.[Footnote 35]
VR Officials in Some Agencies Credited Other Practices with Yielding
Results:
Some agency officials credited certain other practices with yielding
positive results, but we were not able to corroborate their ideas with
our statistical approach. For example, VR agency officials cited the
following practices as beneficial: (1) collaborative initiatives
between the state VR agency and other state agencies aimed at helping
specific client populations, such as individuals with mental
impairments or developmental disabilities; (2) unique applications of
performance measures, such as measuring performance at the team level
rather than the individual counselor level; and (3) improved use of
computer information systems, such as real-time access to the status of
individual employment targets. Although we were able to examine many
state practices with our survey data, there were not enough agencies
employing these practices for us to determine whether these practices
led to improved earnings outcomes for SSA beneficiaries among state VR
agencies.
Limitations in Education's Data May Have Hampered Analyses of
Individual Earnings Outcomes:
Although we were able to explain much of the variation in earnings
outcomes among state agencies, we could explain only a small amount of
the variation in outcomes among individual SSA
beneficiaries. Specifically, while our models accounted for between 66
and 77 percent of the variation in agency-level earnings outcomes, our
models using the individual-level data had low explanatory power,
accounting for only 8 percent of variation in earnings levels across
individuals and rarely producing reliable predictions for achieving
earnings or leaving the rolls. With so little variation explained in
individual-level outcomes, we could not be confident that our
individual-level analyses were sufficiently reliable to support
conclusions. As a result, we chose not to report on these analyses.
Other researchers told us they have experienced similar difficulties
using Education's client database to account for individual differences
in earnings outcomes among VR clients.
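The explanatory power discussed above is the regression R-squared statistic: the share of outcome variance a model accounts for. A minimal sketch with synthetic numbers (not GAO's data) shows how a model can explain most of the variance at one level of analysis yet little at another:

```python
import numpy as np

def r_squared(actual, predicted):
    """Share of variance in `actual` explained by `predicted` (R-squared)."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ss_residual = np.sum((actual - predicted) ** 2)
    ss_total = np.sum((actual - actual.mean()) ** 2)
    return 1.0 - ss_residual / ss_total

rng = np.random.default_rng(1)
signal = rng.normal(size=500)  # the part of the outcome a model captures

# Analogy to the agency-level models: outcomes track predictions closely.
strong = r_squared(signal + rng.normal(0.0, 0.6, 500), signal)
# Analogy to the individual-level models: outcomes are mostly noise.
weak = r_squared(signal + rng.normal(0.0, 3.0, 500), signal)
print(strong > weak)  # far more variance explained at the first level
```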
Education's data lack information that we believe is critical to
assessing earnings outcomes, and not having this information may have
hindered our ability to explain differences in individual earnings
outcomes.[Footnote 36] Specifically, Education does not collect certain
information on VR clients that research has linked to work outcomes,
such as detailed information on the severity of the disability and
historical earnings data. Research indicates that both of these factors
are, or could be, important to determining employment success for
people with disabilities.[Footnote 37] Knowing the severity of a
client's disability can indicate the extent to which a person is
physically or mentally limited in the ability to perform work, a fact
that may influence the person's earnings outcomes. While Education's
client data include information indicating whether a disability is
significant--which is defined by the Rehabilitation Act--the data do
not include more detailed information on the severity of the
disability, such as the number and extent of functional
limitations.[Footnote 38] Additionally, Education does not collect
information on a client's historical earnings, which may provide a
broader understanding of the client's work experience and likelihood of
returning to work. States may be able to obtain earnings data from other
official sources, such as other state and federal agencies.
Another limitation of Education's data is its reliance on self-reported
client information that the VR agency may not validate. For example,
one agency official said that clients are asked
to report their earnings at the time of application--information that
Education is legally required to collect--and that these data may not
be accurate. Reliable information on a client's earnings at the time of
application to VR is essential for evaluating the impact of the VR
program on earnings. However, some clients may misreport their
earnings. One researcher reported, for example, that VR clients
sometimes report net as opposed to gross earnings. Instead of relying
on self-reported information, agencies may be able to obtain or
validate this information from official sources. Specifically, some
state VR agencies have agreements with other state and federal agencies
to obtain earnings data on their clients. For example, agency officials
from one state told us that they match their data against earnings data
from the Department of Labor, while another agency relies on data from
their state's Employment Development Department. However, in some cases
state agencies are required to pay for these data.
Conclusions:
The federal-state vocational rehabilitation program is still the
primary avenue for someone with a disability to prepare for and obtain
employment. Given the growing size of the disability rolls and the
potential savings associated with moving beneficiaries into the
workforce, it is important to make the nation's VR program as effective
as possible to help people with disabilities participate in the
workforce.
Our findings indicate that it will be difficult to maximize the
effectiveness of the VR program with assessments of state agency
performance that do not account for important factors, such as the
economic health of a state; without such context, comparisons across
agencies will be misleading.
Without credible indicators, VR agencies do not have an accurate
picture of their relative performance, and Education may continue its
reluctance to use sanctions or incentives to encourage compliance. Our
findings underscore the recommendation that we made in 2005 that
Education consider economic factors in assessing the performance of
state vocational rehabilitation agencies.
Moreover, our study points to deficiencies in Education's data that may
hinder more conclusive analyses of individual-level earnings outcomes.
Without data on the severity of a client's disability or information on
historical earnings, VR programs may not be able to conduct valuable
analysis to explain differences in individual outcomes. With the
growing emphasis on the role of VR in helping people with disabilities
enter the workforce, the need for such analyses--and data that can be
used to conduct them--is likely to increase.
Despite the deficiencies in Education's data, our findings show that
certain agency practices may improve VR success across the country and
give weight to current efforts by Education to promote such practices.
The fact that agencies with stronger ties to the business community
have achieved higher earnings among their SSA beneficiaries underscores
the value of outreach efforts such as Education's initiative to
promote business networks. Our findings also demonstrate the value of
having VR counselors meet state certification standards and having
agencies collaborate with more than one supportive public agency to
help their clients. Our study also suggests that other practices, such
as state agencies devoting more resources to targeted training services
for VR clients, may have positive benefits.
Recommendations for Executive Action:
To improve the effectiveness of Education's program evaluation efforts
and ultimately the management of vocational rehabilitation programs, we
recommend that the Secretary of Education:
1. Further promote agency practices that show promise for helping more
SSA disability beneficiaries participate in the workforce. Such a
strategy should seek to increase:
* the percentage of VR staff who meet state standards and
certifications established under the CSPD,
* partnership or involvement with area business communities, and
* collaboration with other agencies that provide complementary
services.
2. Reassess Education's collection of VR client data through
consultation with outside experts in vocational rehabilitation and the
state agencies. In particular, it should:
* consider the importance of data elements that are self-reported by
the client and explore cost-effective approaches for verifying these
data, and
* consider collecting additional data that may be related to work
outcomes, such as more detailed data on the severity of the client's
disability and past earnings history, collaborating whenever possible
with other state and federal agencies to collect this information.
3. In a 2005 report, we recommended that Education revise its
performance measures or adjust performance targets for individual state
VR agencies to account for additional factors. These include the
economic conditions of states, as well as the demographics of a state's
population. We continue to believe that Education should adopt this
recommendation, especially in light of our findings on the impact of
state unemployment rates, per capita incomes, and demographic factors
on earnings outcomes.
Agency Comments and Our Evaluation:
We received written comments on a draft of this report from Education,
which oversees the VR program, and SSA, from which we received data
that were used to evaluate its Ticket to Work program. Education
commended our use of multiple data sources and said that it opens up
new analytical possibilities in evaluating how VR programs serve SSA
beneficiaries, including identifying low-performing and high-
performing VR programs. However, Education also questioned whether the
statistical relationships we found can be applied to how it administers
a state-operated formula grant program. We continue to believe our
findings have important implications for improving what data are
collected and how VR services are delivered. While Education generally
agreed with the substance of our recommendations, it disagreed on when
economic conditions and state demographics should be considered in
assessing agency performance. Instead of using this information to help
set performance measures, the department said that it takes these
factors into account when it monitors agency performance results and
believes that its approach is effective. We believe that incorporating
this contextual information into assessing performance is essential to
provide the state agencies with a more accurate picture of their
relative performance. Although Education stated that it was open to our
recommendation on improving data quality, it suggested that validating
self-reported information would be a potential burden to state agencies
and suggested other approaches, such as conducting periodic studies.
Our recommendation that Education explore cost-effective ways to
validate self-reported data was based on the experience of some VR
agencies that have obtained data successfully from official sources and
not relied solely on self-reported information. We made additional
technical changes as appropriate based on Education's comments. See
appendix II for a full reprinting of Education's comments and our
detailed responses.
SSA stated that our report has methodological flaws that introduced
aggregation bias and false correlations, and suggested that we should
have focused on individual-level analysis or reported the results of
both individual and aggregate-level analyses. We used aggregated
data--a widely used means of analysis--because our primary objective was to
understand better the wide variation in outcomes for state VR agencies
that serve SSA beneficiaries rather than the outcomes for individuals.
Further, we used appropriate statistical techniques to guard against
bias due to the clustering of individual cases within agencies (see app.
I for a more detailed discussion). Because we used aggregated data, we
did not attempt to infer the effects of individual behavior or
individual outcomes. Additionally, SSA had concerns about the
implications of our analysis of state economic factors on agency-level
outcomes. Our findings related to the influence of state economic
characteristics were highly statistically significant as well as
corroborated by previous research, and we believe these results have
important implications for VR agency performance measures. SSA provided
additional comments, which we addressed or incorporated, as
appropriate. See appendix III for a full reprinting of SSA's comments
as well as our detailed responses.
Copies of this report are being sent to the Secretary of Education, the
Commissioner of SSA, appropriate congressional committees, and other
interested parties. The report is also available at no charge on GAO's
Web site at http://www.gao.gov. If you have any questions about this
report, please contact me at (202) 512-7215. Other major contributors
to this report are listed in appendix IV.
Signed by:
Denise M. Fantone:
Acting Director, Education, Workforce, and Income Security Issues:
[End of section]
Appendix I: Scope and Methodology:
To understand the variation in state agency outcomes for Social
Security Administration (SSA) disability beneficiaries completing the
vocational rehabilitation (VR) program, we conducted two sets of
analyses. First, we used descriptive analyses to compare agency
performance with three measures of earnings outcomes from 2001 to 2003.
Second, using agency and survey data, we conducted econometric analyses
of the three measures of earnings outcomes to determine how state and
agency characteristics related to state agency performance.
We developed our analyses in consultation with GAO methodologists, an
expert consultant, and officials from SSA and the Department of
Education (Education).[Footnote 39] To choose the appropriate variables
for our analyses, we reviewed pertinent literature and consulted with
agency officials and academic experts.
This appendix is organized in four sections: Section 1 describes the
data that were used in our analyses and our efforts to ensure data
reliability. Section 2 describes the study population, how the
dependent variables used in the analyses were constructed, and the
descriptive analyses of those variables. Section 3 describes the
econometric analyses. Section 4 explains the limitations of our
analyses.
Section 1: Data Used, Information Sources, and Data Reliability:
This section describes each of the datasets we analyzed, the variables
from each dataset that were used in our analyses, and the steps that
were taken to assess the reliability of each dataset.
To conduct our analyses, we used several data sources: (1) a newly
available longitudinal dataset that includes information from several
SSA and Education administrative databases on all SSA disability
beneficiaries who completed the VR program from 2001 through 2003; (2)
data from Education on yearly spending information by service category
for each state VR agency; (3) data from the Census Bureau, the Bureau
of Labor Statistics, and other data sources regarding state demographic
and economic characteristics; and (4) original survey data collected by
GAO from state VR agencies. To perform our analyses, we used variables
from each of the above datasets by merging, by agency and year, each of
the datasets into one large data file.
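The merge described above can be sketched with pandas; the table and column names below are illustrative placeholders, since the actual layouts of the TRF subfile, RSA-2, state, and survey files are not reproduced in this report.

```python
import pandas as pd

# Hypothetical mini-versions of the four data sources, each keyed by
# agency and year; column names are assumptions for illustration only.
trf = pd.DataFrame({"agency": ["AL-G", "AL-G"], "year": [2001, 2002],
                    "pct_female": [48.0, 50.5]})
rsa2 = pd.DataFrame({"agency": ["AL-G", "AL-G"], "year": [2001, 2002],
                     "pct_admin_spend": [12.1, 11.8]})
state = pd.DataFrame({"agency": ["AL-G", "AL-G"], "year": [2001, 2002],
                      "unemployment_rate": [5.3, 5.9]})
survey = pd.DataFrame({"agency": ["AL-G", "AL-G"], "year": [2001, 2002],
                       "pct_cspd": [70.0, 72.0]})

# Merge all four sources, by agency and year, into one analysis file
# with a single row per agency-year and all variables combined.
merged = trf
for df in (rsa2, state, survey):
    merged = merged.merge(df, on=["agency", "year"], how="inner")
```

Merging on both keys keeps each year's agency record aligned with that same year's expenditure, economic, and survey data.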
Education and SSA Beneficiary Data:
We obtained a newly available longitudinal dataset--a subfile of SSA's
Ticket Research File (TRF)--which contains information from several SSA
and Education administrative databases on all SSA disability
beneficiaries who completed the federal-state VR program between 1998
and 2004.[Footnote 40] SSA merged this dataset with its Master Earnings
File (MEF), which contains information on each beneficiary's annual
earnings from 1990 through 2004. The combined data provide information
about each beneficiary's disability benefits, earnings, and VR
participation.[Footnote 41] See section 2 of this appendix for a
description of how these data were used to create our dependent
variables on earnings outcomes.
We were interested in how earnings outcomes were affected by
differences across agencies, including differences in characteristics
of the individuals served by the different agencies. Table 1 shows
information from the TRF subfile on characteristics of our study
population that we included among our explanatory variables.[Footnote
42]
Table 1: Explanatory Variables from the TRF Subfile:
State agency demographic characteristics: Percentage of beneficiaries
between the ages of 18 and 25.
State agency demographic characteristics: Percentage of beneficiaries
between the ages of 26 and 35.
State agency demographic characteristics: Percentage of beneficiaries
between the ages of 36 and 45.
State agency demographic characteristics: Percentage of beneficiaries
between the ages of 46 and 55.
State agency demographic characteristics: Percentage of beneficiaries
between the ages of 56 and 64.
State agency demographic characteristics: Percentage of female
beneficiaries.
State agency demographic characteristics: Percentage of white
beneficiaries.
State agency demographic characteristics: Percentage of African-
American beneficiaries.
State agency demographic characteristics: Percentage of Native-
American beneficiaries.
State agency demographic characteristics: Percentage of Asian and
Pacific Islander beneficiaries.
State agency demographic characteristics: Percentage of Hispanic
beneficiaries.
State agency demographic characteristics: Percentage of multiracial
beneficiaries.
State agency medical characteristics: Percentage of beneficiaries who
are blind or have visual impairments.
State agency medical characteristics: Percentage of beneficiaries with
sensory impairments.
State agency medical characteristics: Percentage of beneficiaries with
physical impairments.
State agency medical characteristics: Percentage of beneficiaries with
mental impairments.
State agency program participation: Percentage of beneficiaries
receiving Supplemental Security Income.
State agency program participation: Percentage of beneficiaries
receiving Disability Insurance.
State agency program participation: Percentage of concurrent
beneficiaries (receiving both SSI and DI).
State agency program participation: Proportion of SSA beneficiaries
served by an agency[A].
Source: SSA and Education data.
[A] To construct this variable, additional information was obtained
from Education on the total number of clients completing the VR
program.
[End of table]
To determine the reliability of the TRF subfile, we:
* reviewed SSA and Education documentation regarding the planning for
and construction of the TRF subfile,
* conducted our own electronic data testing to assess the accuracy and
completeness of the data used in our analyses, and:
* reviewed prior GAO reports and consulted with GAO staff knowledgeable
about these datasets.
On the basis of these steps, we determined that despite the limitations
outlined in section 4, the data that were critical to our analyses were
sufficiently reliable for our use.
VR Agency Administrative Data:
To determine whether differences in agency size and expenditure
patterns affected earnings outcomes, we obtained information on state
VR agency expenditures for the years 2000 through 2002 from the RSA-2
data, an administrative dataset compiled by Education. The RSA-2 data
contain aggregated agency expenditures for each of the 80 state VR
agencies as reported in various categories, such as administration and
different types of services. Table 2 shows the variables that were
derived from the RSA-2 data.
Table 2: Explanatory Variables from Education's RSA-2 Data:
Agency structure: Type of agency: (1) general, (2) blind, and (3)
combined agencies.
Agency structure: Number of people receiving services (proxy for size).
Agency structure: Total expenditures on services (proxy for size).
Agency expenditures: Percentage of all service expenditures spent on
assessment.
Agency expenditures: Percentage of all service expenditures spent on
diagnosis/treatment.
Agency expenditures: Percentage of all service expenditures spent on
training services for VR clients.
Agency expenditures: Percentage of all service expenditures spent on
maintenance.
Agency expenditures: Percentage of all service expenditures spent on
transportation.
Agency expenditures: Percentage of all service expenditures spent on
personal assistance services.
Agency expenditures: Percentage of all service expenditures spent on
placement services.
Agency expenditures: Percentage of all service expenditures spent on
post employment services.
Agency expenditures: Percentage of all service expenditures spent on
other services.
Agency expenditures: Percentage of total service expenditures (not
including assessment, counseling, guidance, and placement) spent on
assessment[A].
Agency expenditures: Percentage of total service expenditures (not
including assessment, counseling, guidance, and placement) spent on
diagnosis/treatment[A].
Agency expenditures: Percentage of total service expenditures (not
including assessment, counseling, guidance, and placement) spent on
training services for VR clients[A].
Agency expenditures: Percentage of total service expenditures (not
including assessment, counseling, guidance, and placement) spent on
maintenance[A].
Agency expenditures: Percentage of total service expenditures (not
including assessment, counseling, guidance, and placement) spent on
transportation[A].
Agency expenditures: Percentage of total service expenditures (not
including assessment, counseling, guidance, and placement) spent on
personal assistance services[A].
Agency expenditures: Percentage of total service expenditures (not
including assessment, counseling, guidance, and placement) spent on
placement[A].
Agency expenditures: Percentage of total service expenditures (not
including assessment, counseling, guidance, and placement) spent on
post employment services[A].
Agency expenditures: Percentage of total service expenditures (not
including assessment, counseling, guidance, and placement) spent on
other services[A].
Agency expenditures: Percentage of total expenditures spent on
administration.
Agency expenditures: Percentage of total expenditures spent on services
provided directly by VR personnel.
Agency expenditures: Percentage of total expenditures spent on
purchased services.
Agency expenditures: Percentage of total expenditures spent on services
purchased from public vendors.
Agency expenditures: Percentage of total expenditures spent on services
purchased from private vendors.
Agency expenditures: Percentage of total expenditures spent on services
to individuals with disabilities.
Agency expenditures: Percentage of total expenditures spent on services
to groups with disabilities.
Source: Education data.
[A] These total expenditures include those optional services that are
provided to clients based on their specific needs. They do not include
assessment, counseling, guidance, and placement services provided
directly by VR personnel since these services are generally provided to
all VR clients.
[End of table]
To determine the reliability of the RSA-2 data, we:
* reviewed relevant agency documentation and interviewed agency
officials who were knowledgeable about the data, and:
* conducted our own electronic data testing to assess the accuracy and
completeness of the data used in our analyses.
On the basis of these steps, we determined that the data that were
critical to our analyses were sufficiently reliable for our use.
State Economic and Demographic Data:
We were interested in how differences in state characteristics affected
earnings outcomes of SSA beneficiaries completing VR at different VR
agencies. The state characteristics we considered included economic
conditions (unemployment rates, per capita income, and gross state
product growth rates), population characteristics (including size,
density, and percentage living in rural areas and on Disability
Insurance), and availability of the Medicaid Buy-in program. Data on
state characteristics were downloaded from several sources, including
federal agencies and research institutes. The research institutes from
which we obtained data included Cornell University Institute for Policy
Research and Mathematica Policy Research, Inc., both authorities in
social science research. Table 3 summarizes the state data that were
collected and the sources for those data.
Table 3: State Economic and Demographic Explanatory Variables and Their
Sources:
Variable: Annual state unemployment rates;
Data source: Department of Labor, Bureau of Labor Statistics.
Variable: Gross state product (GSP) growth rate;
Data source: Department of Commerce, Bureau of Economic Analysis.
Variable: Annual per capita income;
Data source: Department of Commerce, Bureau of Economic Analysis.
Variable: Annual population;
Data source: Department of Commerce, Census Bureau.
Variable: Population density;
Data source: Department of Commerce, Census Bureau.
Variable: Percentage of rural population;
Data source: Department of Commerce, Census Bureau.
Variable: Medicaid Buy-In participation;
Data source: Cornell University Institute for Policy Research and
Mathematica Policy Research, Inc. (primary sources).
Variable: Ticket to Work program implementation;
Data source: Mathematica Policy Research, Inc.
Source: Various data sources listed in table.
[End of table]
For each of these data sources we reviewed documentation related to the
agency's or research organization's efforts to ensure the accuracy and
integrity of their data. On the basis of these reviews, we concluded
that the data were sufficiently reliable for the purposes of our
review.
VR Agency Survey Data:
We were also interested in how differences in the VR agencies
themselves affected earnings outcomes. To obtain information about the
policies, practices, and environment of each state VR agency, we
conducted a detailed survey of all state agencies. The survey was
intended to collect information that may be relevant to explaining
earnings outcomes of SSA beneficiaries who exited the VR program
between federal fiscal years 2001 through 2003. Specifically, we
collected information on the structure of the VR program, staffing and
turnover rates, performance measures, service portfolios, and the
extent of integration with outside partners such as other state and
federal agencies and the business community.[Footnote 43] In developing
our survey, we identified relevant areas of inquiry by conducting a
review of the literature on state VR agency performance and consulting
with state agency officials and outside researchers.
For the final survey, we sent e-mail notifications asking state agency
officials to complete either a Web-based version of the survey (which
was accessible to those with visual impairments) or a Microsoft Word
version of the survey by August 4, 2006. We closed the survey on August
22, 2006. We obtained survey responses from 78 of the 80 state VR
agencies, for a response rate of 98 percent.
Because this was not a sample survey, it has no sampling errors.
However, the practical difficulties of conducting any survey may
introduce errors, commonly referred to as nonsampling errors. For
example, difficulties in interpreting a particular question or sources
of information available to respondents can introduce unwanted
variability into the survey results. We took steps in developing the
questionnaire, collecting the data, and analyzing them to minimize such
nonsampling error. For example, we pretested the content and format of
our survey with officials from 17 state agencies to determine if it was
understandable and the information was feasible to collect, and we
refined our survey as appropriate. When the data were analyzed, an
independent analyst checked all computer programs. Since the data were
collected with a Web-based and Word format survey, respondents entered
their answers directly into the electronic questionnaire, thereby
eliminating the need to key data into a database, minimizing another
potential source of error.
The variables that we analyzed from the survey data are presented in
table 4. These included the structure of the agency (stand-alone
agencies, umbrella agencies with and without autonomy over staff and
finances, and others), agency staffing, agency management, indicators
of the existence of performance targets and incentives, specialized
caseloads, case management systems and system components, and
integration with outside partners and the business community. Since we
had data on each of the earnings outcomes and most of the state and
agency characteristics for each of the 3 years, we included in our
analysis an indicator for year.
Table 4: Explanatory Variables from the VR Agency Survey Data:
Agency structure.
Agency structure 1--indicates whether agency is (1) part of an umbrella
agency with autonomy over its own staff and finances, (2) part of an
umbrella agency without autonomy over its own staff and finances, (3) a
stand-alone agency, and (4) other type of agency.
Agency structure 2--indicates whether agency is part of an umbrella
agency.
Agency structure 3--indicates whether agency is in an umbrella agency
that was a part of (1) social services, (2) education, (3) labor (4)
human services, (5) a stand-alone, or (6) other type of agency.
Agency staffing.
Percentage of service delivery sites staffed full-time[A].
Percentage of service delivery sites staffed part-time[A].
Percentage of service delivery sites shared with social services[A].
Percentage of service delivery sites shared with education[A].
Percentage of service delivery sites shared with labor[A].
Percentage of service delivery sites shared with human services[A].
Percentage of service delivery sites shared with other agencies[A].
Indicates whether the VR program experienced a hiring freeze in a given
fiscal year.
Indicates whether the VR program experienced a large number of
retirements in a given fiscal year.
Indicates whether the VR program experienced a large influx of new
hires in a given fiscal year.
Indicates whether the VR program experienced downsizing through layoffs
in a given fiscal year.
Indicates whether the VR program experienced unusual changes in
staffing in a given fiscal year.
Indicates whether VR counselors were affiliated with a union in a given
fiscal year.
Agency management.
Number of clients per VR counselor[A].
Number of counselors employed (proxy for agency size)[A].
Indicates whether the director had authority over developmental
disability services.
Indicates whether the director had authority over independent living
services.
Indicates whether the director had authority over disability
determination services.
Indicates whether the director had authority over other programs or
services.
Percentage of counselors who left VR agency (turnover)[A].
Percentage of counselors meeting comprehensive system of personnel
development (CSPD) standards[A].
Percentage of senior managers who left VR agency (turnover)[A].
Length of time director has held his/her position (director tenure)[A].
Length of time director has been with the VR agency (director
experience)[A].
Length of time the director has held his/her position as a percentage
of his/her time at the agency[A].
Indicates whether the agency operated under an order of selection.
Indicates whether the program had a wait list.
Length of wait list.
Indicates whether the program had a wait list and, if so, its length.
Performance targets/incentives.
Scale indicating number of reported specific and numerical targets
including SSA reimbursements, individual plans for employment (IPE)
initiated, client referrals, contacts with businesses, client
satisfaction, and other client employment outcomes by year.
Indicates whether counselors had performance expectations with
numerical targets based on successful VR employment outcomes (status 26
closures).
Nature of performance expectations.
Indicates whether counselors had numerical targets in their performance
expectations.
Average number of status 26 case closures required for satisfactory
performance[A].
Indicates whether there were performance expectations that contained
numerical targets for SSA reimbursements.
Indicates whether there were performance expectations that contained
numerical targets for the number of IPEs initiated.
Indicates whether there were performance expectations that contained
numerical targets for the number of client referrals.
Indicates whether there were performance expectations that contained
numerical targets for the number of contacts made with businesses for
job development.
Indicates whether there were performance expectations that contained
numerical targets for client satisfaction rates.
Indicates whether there were performance expectations that contained
numerical targets for any other outcomes.
Indicates whether there were monetary performance incentives to VR
counselors.
Indicates how frequently a VR program reported on agencywide
performance.
Specialized caseloads.
Indicates whether there were in-house benefits counselors.
Number of benefits counselors[A].
Indicates whether there were job development specialists.
Number of job development specialists[A].
Scale measuring the number of types of specialized caseloads covered,
including transitioning high school students, mental health,
developmental disabilities, traumatic brain/spinal cord injuries,
hearing impairments, visual impairments (not counted for blind-serving
agencies), or other groups.
Percentage of counselors with specialized caseloads serving
transitioning high school students[A].
Percentage of counselors with specialized caseloads serving clients
with mental health issues[A].
Percentage of counselors with specialized caseloads serving clients
with developmental disabilities[A].
Percentage of counselors with specialized caseloads serving clients
with traumatic brain/spinal cord injuries[A].
Percentage of counselors with specialized caseloads serving clients
with hearing impairments[A].
Percentage of counselors with specialized caseloads serving clients
with visual impairments[A].
Percentage of counselors with specialized caseloads serving any other
group of clients[A].
Case management system.
Scale indicating the sophistication of the case management system
including the ability of the case management system to collect
Education data, collect fiscal data, generate IPEs, generate client
letters, produce state-level management reports, and produce counselor-
level management reports.
Indicates whether an agency used an automated case management system.
Indicates whether the automated case management system was new if an
agency used one.
Indicates whether an agency used an automated case management system
and if so, whether the system was new.
Indicates whether case management system could collect RSA-911 data.
Indicates whether case management system could collect fiscal data.
Indicates whether case management system could generate IPEs.
Indicates whether case management system could generate client letters.
Indicates whether case management system could generate state level
management reports.
Indicates whether case management system could generate reports at VR
counselor level.
Integration with outside partners.
Indicates whether any VR staff worked full-time or part-time at
Workforce Investment Act (WIA) one-stops.
Total number of staff (both full- and part-time) that worked at a WIA
site.
Indicates whether VR program purchased any services from public or
private vendors.
Indicates how many purchased services had fee for service arrangements.
Indicates how many purchased services had contracts with outcome-based
performance measures.
Indicates how many purchased services had vendor fees tied to meeting
performance measures.
Indicates how many purchased services had renewal of their contracts
tied to meeting performance measures.
Indicates how many purchased services were evaluated by VR to see
whether performance measures were met at contract end.
Indicates how many purchased services were evaluated by VR by group or
type of vendor.
Scale indicating the average support level received from different
types of programs including WIA one-stops, social service departments,
mental health departments, education systems, Medicaid program,
Medicare program, substance abuse departments, and developmental
disabilities programs.
Indicates the extent to which a VR program received support from the
state WIA one-stop system.
Indicates the extent to which a VR program received support from state
social services.
Indicates the extent to which a VR program received support from the
state mental health department.
Indicates the extent to which a VR program received support from the
state education system.
Indicates the extent to which a VR program received support from the
state Medicaid program.
Indicates the extent to which a VR program received support from the
state Medicare program.
Indicates the extent to which a VR program received support from the
state substance abuse department.
Indicates the extent to which a VR program received support from the
state developmental disabilities program.
Indicates the extent to which a VR program received support from
another state program.
Integration with business community.
Scale indicating agency's level of integration with the business
community, including the average frequency with which the agency
sponsors job fairs, attends business network meetings, meets with local
businesses, meets with chambers of commerce, interacts with civic
clubs, and hosts employer breakfasts.
Frequency with which agency sponsored job fairs.
Frequency with which agency representatives attended job fairs.
Frequency with which agency representatives attended meetings of
business networks.
Frequency with which agency met with local businesses.
Frequency with which agency met with local chambers of commerce.
Frequency with which agency representatives interacted with civic
clubs.
Frequency with which agency hosted employer breakfasts.
Frequency with which agency representatives participated in other
business outreach.
Source: GAO survey data.
[A] Indicates variables that were categorized.
[End of table]
To determine whether the survey data were sufficiently reliable for our
analysis, we collected and analyzed additional data. Specifically, we
included questions in the survey that were designed to determine
whether each state VR agency uses certain practices to monitor the
quality of computer-processed data that were used to complete the
survey.[Footnote 44] From these questions, we developed a variable to
indicate whether a particular agency might have unreliable data. To
determine whether there was a relationship between agencies with data
reliability issues and the earnings outcomes we were studying, we
included this variable in our three models of earnings outcomes
(described below).
We found two issues associated with the survey data that are related to
our findings. First, net of other effects, agencies that reported
having a data reliability issue had significantly lower rates of SSA
beneficiaries departing the disability rolls.[Footnote 45] Although we
suspect that data quality issues do not have a direct effect on the
rates of SSA beneficiaries departing the rolls, poor data quality might
be correlated with some other characteristic that we were not able to
measure (e.g., agency efficiency), which may have an impact on the rate
of departures from the rolls. Second, 11 agencies did not report the
percentage of CSPD-certified counselors (a variable that we found to be
significantly related to the percentage of SSA beneficiaries with
earnings during the year after completing VR) for at least 1 year. For
these agencies, the percentage of counselors was imputed using the mean
derived from agencies that did report. Statistical tests were conducted
to ensure that the observations for which data were imputed did not
have significantly different rates of having earnings than those for
which the data were not missing.
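A minimal sketch of this mean imputation, using illustrative values rather than the actual survey responses:

```python
import statistics

# Illustrative reported CSPD-certification percentages; None marks
# agencies that did not report for a given year.
pct_cspd = [80.0, 65.0, None, 90.0, None, 75.0]

# Impute missing values with the mean of the agencies that did report.
reported = [v for v in pct_cspd if v is not None]
mean_pct = statistics.mean(reported)
imputed = [v if v is not None else mean_pct for v in pct_cspd]

# A flag for imputed observations lets later tests check whether they
# differ systematically from observations with reported data.
was_imputed = [v is None for v in pct_cspd]
```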
Section 2: Study Population and Descriptive Analyses:
Study Population:
In consultation with SSA officials and contractors as well as Education
officials, we selected as our study population working age individuals
who (1) were either receiving Disability Insurance (DI) only,
Supplemental Security Income (SSI) only, or both DI and SSI benefits
concurrently; and (2) exited VR after having completed VR
services.[Footnote 46] To use the most recent data available, we
further refined this population to include those beneficiaries who:
* Began receiving VR services no earlier than 1995 and who completed VR
after having received services in fiscal years 2001 through 2003.
* Had received a DI or SSI benefit payment at least once during the 3
months before application for VR services. Beneficiaries were defined
as concurrent if they received both DI and SSI benefits for at least 1
month in the 3 months before VR application. We selected a 3-month
window to account for the fact that many beneficiaries, SSI
beneficiaries in particular, fluctuate in their receipt of benefits for
any given month.
We excluded from our study population those disability beneficiaries
who:
* Completed VR after 2003, because we lacked at least 1 year of post-VR
earnings data.
* Applied for or started VR services, but did not complete VR.
* Began receiving disability benefits after receiving VR services
because these beneficiaries may have differed in certain important
characteristics from those receiving benefits before VR participation.
* Reached age 65 or died at any point in their VR participation or
during the time frame of our study. We excluded the beneficiaries who
died or reached age 65 because they would have left the disability
rolls for reasons unrelated to employment. For example, beneficiaries
who reach age 65 convert to SSA retirement benefits.
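The inclusion and exclusion rules above can be expressed as a single filter over beneficiary records; the field names and example records below are illustrative assumptions, not the TRF subfile's actual layout.

```python
# Illustrative beneficiary records (hypothetical field names).
beneficiaries = [
    {"id": 1, "vr_start": 1996, "vr_exit": 2002, "completed": True,
     "benefit_in_3mo_window": True, "age_at_exit": 45, "died": False},
    {"id": 2, "vr_start": 1994, "vr_exit": 2002, "completed": True,
     "benefit_in_3mo_window": True, "age_at_exit": 30, "died": False},
    {"id": 3, "vr_start": 1999, "vr_exit": 2004, "completed": True,
     "benefit_in_3mo_window": True, "age_at_exit": 50, "died": False},
    {"id": 4, "vr_start": 2000, "vr_exit": 2003, "completed": False,
     "benefit_in_3mo_window": True, "age_at_exit": 40, "died": False},
    {"id": 5, "vr_start": 1998, "vr_exit": 2001, "completed": True,
     "benefit_in_3mo_window": True, "age_at_exit": 66, "died": False},
]

def in_study_population(b):
    return (b["vr_start"] >= 1995             # began services 1995 or later
            and 2001 <= b["vr_exit"] <= 2003  # exited in FY 2001-2003
            and b["completed"]                # completed VR services
            and b["benefit_in_3mo_window"]    # DI/SSI paid pre-application
            and b["age_at_exit"] < 65         # not converted to retirement
            and not b["died"])

study = [b["id"] for b in beneficiaries if in_study_population(b)]
```

Of the five illustrative records, only the first satisfies every inclusion rule; each of the others trips exactly one exclusion.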
Computation of Dependent Variables:
Using the Ticket Research File (TRF) subfile combined with data from
SSA's Master Earnings File (MEF), we computed three measures of
earnings outcomes for the 2001 through 2003 exit cohorts for each state
VR agency: (1) the percentage of beneficiaries who had earnings during
the year after receiving VR services, (2) the average amount they
earned,[Footnote 47] and (3) the percentage that left the disability
rolls by 2005. The data sources for our three earnings outcomes or
dependent variables are shown in table 5.
Table 5: Dependent Variables Used in the Analyses:
Dependent variable: Percentage of beneficiaries with earnings during
the year after VR;
Dataset from which variable was derived: MEF.
Dependent variable: Average annual earnings for SSA beneficiaries among
those with earnings during the year after exiting VR;
Dataset from which variable was derived: MEF.
Dependent variable: Percentage of beneficiaries that left the rolls by
2005;
Dataset from which variable was derived: TRF subfile.
Source: SSA data.
[End of table]
To adjust for inflation, all of our earnings figures were computed in
2004 dollars using the Consumer Price Index for All Urban Consumers
(CPI-U). The CPI-U, maintained by the Bureau of Labor Statistics,
represents changes in prices of all goods and services purchased for
consumption by urban households. The CPI-U can be used to adjust for
the effects of inflation, so that comparisons can be made from one year
to the next using standardized dollars. We standardized the value of
average annual earnings to 2004 dollars because this was the most
recent year for which earnings data were available at the time of our
analysis.
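The adjustment described above can be sketched as follows. This is a minimal illustration of rescaling nominal earnings to constant 2004 dollars by the ratio of CPI-U index values; the index figures below are published BLS annual averages (1982-84 = 100), shown here for illustration and not necessarily the exact values used in the analysis.

```python
# Illustrative annual-average CPI-U values (1982-84 = 100) from BLS.
CPI_U = {2001: 177.1, 2002: 179.9, 2003: 184.0, 2004: 188.9}

def to_2004_dollars(nominal_earnings, year):
    """Convert nominal earnings from `year` into constant 2004 dollars
    using the ratio of the 2004 CPI-U to the CPI-U of the earnings year."""
    return nominal_earnings * CPI_U[2004] / CPI_U[year]
```

For example, earnings of $10,000 reported for 2001 would be scaled up by the factor 188.9/177.1 before being compared with 2003 earnings.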
Departures from the Disability Rolls:
To determine whether disability beneficiaries left the rolls before
2005, we used data from the TRF subfile that indicated the month in
which a beneficiary left the rolls because of work. We included all
beneficiaries who left the rolls after their VR application date.
Concurrent beneficiaries were considered to have left the rolls only if
they stopped receiving benefits from both programs.
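The departure rule just described can be expressed as a simple filter. The record fields below are hypothetical stand-ins, not the actual TRF subfile variable names.

```python
def left_rolls_for_work(rec):
    """Flag a beneficiary as having left the rolls due to work before 2005.

    A departure counts only if it occurred after the VR application date;
    concurrent (DI and SSI) beneficiaries count only if they stopped
    receiving benefits from both programs. Field names are illustrative.
    """
    if rec["work_exit_month"] is None:
        return False
    if rec["work_exit_month"] <= rec["vr_application_month"]:
        return False
    if rec["concurrent"] and not (rec["di_ceased"] and rec["ssi_ceased"]):
        return False
    # "YYYY-MM" month strings compare correctly in lexical order.
    return rec["work_exit_month"] < "2005-01"
```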
Descriptive Analyses:
To depict the variation in agency performance on earnings outcomes for
SSA beneficiaries completing VR from 2001 to 2003, we performed two
descriptive analyses. First, we developed distributions of each
earnings outcome. Second, we computed the means and ranges of these
outcomes by year and agency type. With data from 78 agencies over 3
years (from persons who exited the state VR programs from 2001 to
2003), we had 234 cases in our data file.[Footnote 48] Both sets of
analyses are presented in the findings section of the report.
Section 3: Econometric Analyses:
To identify key factors related to the earnings outcomes of SSA
beneficiaries completing VR programs, we used econometric methods to
analyze data from various sources related to VR agencies and the SSA
beneficiaries who exited them from 2001 through 2003. Our econometric
analyses focused on the differences across agencies for the three
different dependent variables: (1) the percentage of beneficiaries who
had earnings during the year after leaving VR; (2) among those with
earnings, the average beneficiary earnings level during the year after
leaving VR; and (3) the percentage of beneficiaries that left the
disability rolls as a result of finding work by the end of 2005.
We began our econometric analysis with ordinary least squares (OLS) and
logistic regression models to analyze differences in outcomes based on
individual characteristics. That is, we started with as many
observations as there were individuals in our study population, each
observation being assigned the characteristics of the agency as well as
of the individual. Given that our data were multilevel (i.e., included
information on both individual- and agency-level characteristics), we
used statistical techniques to assess the feasibility of using ordinary
least squares and logistic regression at the individual level rather
than hierarchical modeling techniques.[Footnote 49] As a result of
these analyses, we chose to use robust standard errors to account for
clustering in agencies rather than hierarchical modeling techniques.
However, preliminary analyses using the individual-level data to model
binary outcomes and each individual's earnings revealed that the
regression and logistic models frequently failed statistical tests when
compared to a null model with no explanatory variables, and accounted
for only a small fraction of the variability in the outcomes of
interest to us.[Footnote 50]
Because our econometric models using individual-level data explained
very little variation in earnings outcomes (i.e., low predictive
power), we proceeded to model outcomes at the agency level.
Specifically, we combined data on the aggregate characteristics of
individuals within agencies (such as the percentage of female
beneficiaries or Disability Insurance recipients within an agency) with
agency-level data on structure, expenditures, and policies and
practices. In other words, rather than assess whether individuals
differed in the likelihood of getting a job or leaving the rolls or had
different earnings, we analyzed whether the agencies' earnings outcomes
varied as a function of the characteristics of the agencies, the
aggregate characteristics of beneficiaries within each agency, and the
characteristics of the states the agencies were located in.[Footnote
51] Our dependent variables thus contained, for each agency in a given
fiscal year, the average earnings level among those with jobs, the
percentage at each agency who had earnings during the year after
completing VR, and the percentage of those leaving the rolls due to
work.
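The individual-to-agency aggregation described above can be sketched with a small table. The column names are hypothetical stand-ins for fields in the merged TRF/MEF file; each agency-year row carries the three outcome measures plus a beneficiary count later used as a weight.

```python
import pandas as pd

# Toy individual-level records; column names are illustrative.
indiv = pd.DataFrame({
    "agency": ["A", "A", "B", "B", "B"],
    "year": [2001] * 5,
    "earnings": [0.0, 12000.0, 8000.0, 0.0, 5000.0],
    "left_rolls": [0, 1, 1, 0, 0],
})

# Collapse to one case per agency-year with the three dependent variables.
agency_year = (
    indiv.groupby(["agency", "year"])
    .agg(
        pct_with_earnings=("earnings", lambda s: (s > 0).mean()),
        avg_earnings=("earnings", lambda s: s[s > 0].mean()),
        pct_left_rolls=("left_rolls", "mean"),
        n_beneficiaries=("earnings", "size"),
    )
    .reset_index()
)
```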
As with our descriptive analysis, we had 234 cases in our data file, a
number that was fairly small relative to the large number of agency
characteristics whose effects we wanted to estimate.[Footnote 52] We
could not, as a result, fit models that estimated the effects of all of
the characteristics of interest simultaneously to determine which were
statistically significant. We therefore chose to proceed by first
estimating, in a series of bivariate regression models, which state and
clientele characteristics (or characteristics of the types of SSA
beneficiaries served in each agency) were significant. After obtaining
preliminary estimates, we aggregated sets of significant state and
clientele characteristics into single models for each of the three
outcomes, and reassessed the significance of their net effects when
they were estimated simultaneously in a multivariate regression
model.[Footnote 53] We next tested the stability and magnitude of
statistically significant coefficients for the state and clientele
characteristics under different model specifications, and proceeded to
introduce the agency characteristics (e.g., structure, management,
expenditures, etc.) one at a time into these base models with the
significant state and case mix characteristics. After identifying the
individually significant agency characteristics, we used an iterative
procedure to reassess agency-level effects: we tested model stability,
examined which variables remained significant when others were
included, and retested selected state, case mix, and agency
characteristics that had been marginally significant in prior
models.[Footnote 54] In all cases we used robust regression procedures
to account for the clustering of cases within agencies (i.e., the lack
of independence within agencies over time), and weighted the cases in
our analyses according to either the total number of beneficiaries in
each agency in each year (for models of having earnings or leaving the
rolls) or the total number of beneficiaries with earnings in each
agency in each year (for models of average earnings).
Ultimately, we obtained the models shown in table 6. Each of the models
consisted of 7 to 9 characteristics that jointly accounted for between
66 and 77 percent of the variability in each dependent variable.
Although certain characteristics were significant in some
specifications for each outcome, the limited degrees of freedom allowed
us to include only those variables that were most consistently
significant and stable across models. In the models that
estimated factors affecting the percentage of SSA beneficiaries who had
earnings and factors affecting average earnings, state characteristics
accounted for a substantial portion of the explained variance. Although
state characteristics were also important in the model estimating the
percentage getting off the rolls by 2005, the year in which
beneficiaries exited the agency accounted for the greatest portion of
the variance explained, reflecting the fact that beneficiaries who
exited VR earlier had more time to leave the rolls by 2005.
Table 6: Coefficients for Multivariate Models Estimating the Effects of
State and Agency Characteristics on Three VR Outcomes, and the
Proportion of Variance Explained (R-Squared) by Each Model:
Significant explanatory variables for percentage of beneficiaries with
earnings during the year after VR (R-squared = 0.66):
Unemployment rate;
Effect coefficient: -2.22;
Robust standard error: .358;
P-value: