Registered Apprenticeship Programs
Labor Can Better Use Data to Target Oversight
GAO ID: GAO-05-886, August 29, 2005
This is the accessible text file for GAO report number GAO-05-886
entitled 'Registered Apprenticeship Programs: Labor Can Better Use Data
to Target Oversight' which was released on September 13, 2005.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to Congressional Requesters:
United States Government Accountability Office:
GAO:
August 2005:
Registered Apprenticeship Programs:
Labor Can Better Use Data to Target Oversight:
GAO-05-886:
GAO Highlights:
Highlights of GAO-05-886, a report to congressional requesters:
Why GAO Did This Study:
Between 2002 and 2012 nearly 850,000 jobs will open in the construction
industry; experts predict that there will not be enough skilled workers
to fill them. This has heightened concerns about program outcomes and
program quality in the nation's apprenticeship system and the
Department of Labor's oversight of it. GAO assessed (1) the extent to
which Labor monitors registered apprenticeship programs in the states
where it has direct oversight, (2) its oversight activities in states
that do their own monitoring, and (3) the outcomes for construction
apprentices in programs sponsored by employers and unions in relation
to programs sponsored by employers alone.
What GAO Found:
Labor's monitoring of programs it directly oversees has been limited.
We found that in 2004 Labor reviewed only 4 percent of programs in the
23 states where it has direct oversight. According to federal program
directors in those states, limited staff constrained their ability to
do more reviews. Also, Labor has focused in recent years on registering
new programs and recruiting apprentices. Although Labor collects much
data about the programs it oversees, it has not employed its database
to generate information indicative of program performance, such as
completion rates, that might allow it to be more efficient in its
oversight.
Labor does not regularly review council-monitored states or collect
data from them that would allow for a national picture of
apprenticeships. Labor is responsible for conducting formal reviews of
the 27 states and the District of Columbia that established
apprenticeship councils to monitor their own apprenticeship programs;
but, according to directors in these states, the reviews have been
infrequent and not necessarily useful. While Labor collects only
aggregate data on apprentices from most of these states, we identified
10 states with large numbers of apprentices that were willing and
capable of providing GAO data on apprentices by occupation as well as
some information on completion rates, completion times, and wages.
Data in Labor's apprenticeship database and from council-monitored
states show that completion rates and wages for construction
apprentices in programs sponsored jointly by employers and unions were
higher than those for programs sponsored by employers alone. We found
that completion rates for apprentices in programs jointly sponsored by
unions and employers were 47 percent on average compared with 30
percent in programs sponsored solely by employers. Completion rates
declined under both types of sponsorship for the period we examined,
but Labor, as part of its oversight, does not track reasons for
noncompletion, making it difficult to determine what lies behind this
trend.
Construction Apprentices at Work:
[See PDF for image]
[End of figure]
What GAO Recommends:
GAO recommends that Labor (1) better utilize its database for
oversight, particularly for apprenticeship programs in occupations
with expected future labor shortages; (2) develop a cost-effective
strategy for collecting data from council-monitored states for
selected occupations; and (3) conduct regular reviews of
apprenticeship activities in states that regulate their own programs,
to ensure that state activities accord with the requirements set forth
by federal law, and offer substantive feedback on those reviews. Labor
concurred with these recommendations and said it has taken initial
steps to implement them.
www.gao.gov/cgi-bin/getrpt?GAO-05-886.
To view the full product, including the scope and methodology, click on
the link above. For more information, contact Sigurd Nilsen at (202)
512-7215 or nilsens@gao.gov.
[End of section]
Contents:
Letter:
Results in Brief:
Background:
Labor's Monitoring of Registered Apprenticeship Programs Is Limited:
Labor Has Reviewed Council-Monitored States Infrequently, Provided
Little Feedback, and Not Collected Data That Would Allow for a National
Picture of Apprenticeships:
Construction Apprenticeship Completion Rates and Wages Vary by Program
Sponsor:
Conclusions:
Recommendations:
Agency Comments:
Appendix I: Scope and Methodology:
Appendix II: Completion Rates, Time Taken to Complete, and Wages for
Construction Apprentices in Council-Monitored States:
Appendix III: Responses to Survey of Directors of Apprenticeships in
Federally-Monitored States:
Appendix IV: Responses to Survey of Directors of Apprenticeships in
Council-Monitored States:
Appendix V: Comments from the Department of Labor:
Appendix VI: GAO Contact and Staff Acknowledgments:
Related GAO Products:
Tables:
Table 1: Differences between Actual and Expected Completion Time for
Apprentices in Joint and Non-joint Construction Programs in Weeks:
Table 2: Survey Numbers and Response Rates:
Table 3: Percentages of Apprentices Completing Joint and Non-joint
Construction Programs as Reported by Selected Council-monitored States
for Fiscal Years 1997-2004:
Table 4: Average Number of Weeks Spent to Complete Joint and Non-joint
Construction Apprenticeship Programs as Reported by Selected Council-
monitored States:
Table 5: Mean Hourly Wage Rates for Beginning Apprentices in Joint and
Non-joint Construction Programs as Reported by Selected Council-
monitored States, Fiscal Year 2004:
Table 6: Mean Hourly Wage Rates for Apprentices Achieving Journey
Status in Joint and Non-joint Construction Programs as Reported by
Selected Council-monitored States, Fiscal Year 2004:
Figures:
Figure 1: States with Apprenticeship Programs Overseen by Federal
Officials and State Apprenticeship Councils:
Figure 2: Frequency of Quality and Equal Employment Opportunity Reviews
of Apprenticeship Programs in Federally- and Council-monitored States
during Fiscal Year 2004:
Figure 3: Frequency of Federal Reviews of Council-monitored States:
Figure 4: Council-monitored States' Rankings of the Usefulness of
Federal Quality Reviews:
Figure 5: Council-monitored States' Rankings of the Usefulness of EEO
Reviews:
Figure 6: Completion Rates after 6 Years for Apprentices Entering
Construction Programs in FY 1994 through 1998:
Figure 7: Completion Rates after 6 Years by Occupation for Apprentices
Who Began Joint and Non-joint Construction Programs between FY 1994 and
1998:
Figure 8: Enrollment for Apprentices in Joint and Non-joint
Construction Programs, FY 1994 through 1998:
Figure 9: Trends in Completion Rates after 6 Years for Apprentices in
Joint and Non-joint Construction Programs Entering Programs in FY 1994
through 1998:
Figure 10: Average Wages for Apprentices in Joint and Non-Joint
Construction Programs in FY 2004:
Abbreviations:
EEO: Equal Employment Opportunity:
ETA: Employment and Training Administration:
OATELS: Office of Apprenticeship Training, Employer and Labor Services:
RAIS: Registered Apprenticeship Information System:
SAC: State Apprenticeship Council:
WIA: Workforce Investment Act:
United States Government Accountability Office:
Washington, DC 20548:
August 29, 2005:
The Honorable Edward M. Kennedy:
Ranking Member:
Committee on Health, Education, Labor and Pensions:
United States Senate:
The Honorable Patty Murray:
Ranking Member:
Subcommittee on Employment, Safety and Training:
Committee on Health, Education, Labor and Pensions:
United States Senate:
Between 2002 and 2012 an estimated 850,000 jobs will open in the
construction industry, but experts predict that there will not be
enough skilled workers to fill them. The National Registered
Apprenticeships System, administered by the Office of Apprenticeship
Training, Employer and Labor Services (OATELS) within the Department of
Labor, has an important role in the development of this skilled
workforce. With a budget of $21 million, OATELS promulgates standards
to safeguard the welfare of apprentices and registers apprenticeship
programs that meet those standards, which include requirements for
related instruction, on-the-job training, and equal employment
opportunity for apprentices. OATELS also oversees apprenticeship
programs to ensure that they provide quality training for apprentices,
as many as 480,000 of whom may be enrolled at any one time. Labor,
through OATELS, directly registers and oversees programs in 23 states.
It has granted 27 other states, the District of Columbia, and 3
territories authority to register and oversee their own programs, and
ensures programs comply with federal standards and meet additional
state standards. In these states, referred to here as council-monitored
states, OATELS reviews the activities of the apprenticeship councils
that are responsible for ensuring programs in their state comply with
federal labor standards and equal opportunity protections. While Labor
and apprenticeship councils provide oversight, recent studies have
shown that a significant number of construction apprentices are not
completing their programs and that construction programs sponsored by
employers without union participation have lower completion rates and
wages for apprentices. In addition, some have raised concerns that the
failure to complete programs could be indicative of poor program
quality. The anticipated shortage of skilled construction workers has
heightened concerns about the relationship between program outcomes and
program quality, the prospect for expanding the supply of skilled
workers through apprenticeships, and Labor's oversight of these
programs.
In view of these concerns, you asked us to review the extent of federal
oversight of apprenticeship programs in general and compare
apprenticeship outcomes in the construction industry by type of program
sponsorship. Specifically, we assessed (1) the extent to which the U.S.
Department of Labor monitors the operations and outcomes of registered
apprenticeship programs in the states where it has direct oversight,
(2) its oversight activities for council-monitored states, and (3) the
outcomes for construction apprentices in programs sponsored jointly by
employers and unions in relation to those sponsored by employers alone.
To obtain national information on Labor's monitoring and oversight
activities, we surveyed all state directors of apprenticeship training
programs through electronic questionnaires posted on the World Wide
Web. We excluded the three territories--Guam, Puerto Rico, and the
Virgin Islands--from our analyses because the few programs they had
were atypical. We also visited four states, one federally monitored
(Texas) and three council-monitored (California, New York, and
Washington), that had large numbers of construction apprentices
(ranging from about 6,500 to 52,000).
In these states, we talked to knowledgeable officials, private-sector
experts and stakeholders, including employer and labor union sponsors
of apprenticeship programs. In some cases, we visited apprentice
training facilities. To determine completion rates, times to
completion, and wage rates for apprentices, we analyzed data in Labor's
apprenticeship database for the fiscal years 1994 through 2004. In
calculating completion rates, we constructed five cohorts based on when
apprentices enrolled in a program--1994, 1995, 1996, 1997, or 1998--and
established their completion status 6 years after they enrolled. These
analyses included data on programs in 23 states where Labor has direct
oversight and programs in 8 council-monitored states.[Footnote 1] In
addition, we obtained comparable data on construction programs from 10
council-monitored states that have large numbers of construction
apprentices and were able to provide this information. The 41 states
for which we had some type of data accounted for an estimated 92
percent of all construction apprentices. We also interviewed Labor
officials and other knowledgeable parties. (See app. I.) We conducted
our work between August 2004 and July 2005 in accordance with
generally accepted government auditing standards.
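The cohort calculation described above can be illustrated with a short
sketch. The record layout and field names below are simplified
assumptions for illustration only; they do not reflect the actual
structure of Labor's apprenticeship database.

```python
from datetime import date

def cohort_completion_rates(records, cohort_years, window_years=6):
    """Compute completion rates for enrollment cohorts.

    records: iterable of (enroll_date, completion_date_or_None) pairs.
    An apprentice counts as a completer only if he or she finished
    within window_years of enrolling, mirroring the report's method of
    establishing completion status 6 years after enrollment.
    """
    rates = {}
    for year in cohort_years:
        cohort = [(e, c) for e, c in records if e.year == year]
        if not cohort:
            rates[year] = None  # no enrollees for this cohort year
            continue
        completed = sum(
            1 for e, c in cohort
            if c is not None and (c - e).days <= window_years * 365.25
        )
        rates[year] = completed / len(cohort)
    return rates

# Illustrative data: three 1994 enrollees -- one completing within
# 6 years, one completing late, one cancelling out of the program.
records = [
    (date(1994, 3, 1), date(1998, 6, 1)),   # completed on time
    (date(1994, 3, 1), date(2002, 1, 1)),   # completed after 6 years
    (date(1994, 3, 1), None),               # cancelled
]
print(cohort_completion_rates(records, [1994]))  # one of three counts
```

Under this method, the late completer is excluded, so the 1994 cohort
rate is one in three.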
Results in Brief:
Labor's monitoring of the programs it directly oversees has been
limited, due in part to staffing levels and in part to its reluctance
to use data to target oversight. In 2004, Labor reviewed only 4 percent of
programs in the 23 states where it has direct oversight, in part,
because of limited staffing, according to federal program directors in
those states. Currently, each reviewer is responsible for about
2,000 apprentices, entering apprentice and program
information in Labor's database, in addition to reviewing program
quality and equal employment opportunities, and overseeing program
progress. Labor officials also said that in recent years their
resources have been more focused on developing new programs and
recruiting apprentices, particularly for new fields. Although Labor
collects much data about the programs it oversees, its Apprenticeship
Office has not employed its database to generate information on program
performance. Federal program directors for the states told us, for
example, that they do not systematically use outcome data from the
database, such as completion rates or apprentices' reasons for dropping
out, to determine which programs to review. This limited use of data
may stem, in part, from challenges in accessing it. These program
directors reported that they were not able to generate customized
reports and that data fields had been changing frequently. Recently,
however, the Apprenticeship Office has begun piloting new software that
agency officials say will make it possible for federal program
directors to effectively extract information from the database and
generate customized reports. Still, federal program directors in two
states who were already using the software said they were still unable
to look at programs by occupation at the state level, a level of
analysis that most state program directors--17 of 23 we surveyed--said
they wanted. In addition, we found little evidence that the
Apprenticeship Office had systematically sought input from federal
program directors regarding their reporting needs or problems they
might face in using the new software. Nor could Labor officials provide
a plan with explicit steps for its implementation.
Labor does not regularly review the activities of the state
apprenticeship councils or collect data from them that would allow for
a national picture of apprenticeships. Labor's reviews have been
infrequent, according to directors of apprenticeship systems in most of
the 27 council-monitored states. Moreover, some directors reported not
having had reviews in the last 9 to 12 years, and our examination of
apprenticeship office documents indicated the department had conducted
only three reviews for 2002 and 2003, and none for 2004. In addition,
many directors reported the reviews were of limited use in helping them
assess programs or make informed decisions about their administration,
in part because of the limited feedback they received. While neither
statute nor regulations specify the frequency of these reviews,
according to Labor officials they are important for ensuring that
states are fulfilling federal requirements for recognition and
oversight of apprenticeship programs. Labor has collected only
aggregate counts of apprentices from most of these states, and to date
has been unable to develop a composite national picture of apprentices.
We nevertheless found 10 states with large numbers of apprentices that
readily provided apprentice data to us by industry, sponsor, and
occupation, as well as some information on completions, on-time
completions, and wages--information that Labor could use to build a
national portrait of apprentices in key programs.
Data in Labor's apprenticeship database and from council-monitored
states show that completion rates and wages for construction
apprentices in programs sponsored jointly by employers and unions were
higher than those for programs sponsored by employers alone. Of
apprentices beginning training between 1994 and 1998 (and completing by
2004), on average, 47 percent of those in programs sponsored jointly
with unions completed compared with 30 percent in programs sponsored
solely by employers, a 17 percentage point difference. Officials said
joint programs had higher completion rates because they were more
established and more likely to provide mentoring and job placement
services. Despite growth in construction program enrollments,
completion rates consistently declined for both types of program
sponsorship for the time period we examined. Specifically, while 59
percent of the apprentices who enrolled in construction programs in
1994 graduated within 6 years, only 37 percent of 1998 enrollees did.
Given that Labor, as part of its oversight, does not track the reasons
for noncompletions, it is difficult to determine what lies behind this
trend or what might account for differences in completion rates by type
of sponsorship. Those apprentices who did complete programs within 6
years tended to finish earlier than expected. Construction wages were
generally higher for apprentices in joint programs than for those in
non-joint programs: more than $2.00 per hour higher on average at the
start of training and $6.00 per hour higher on average at completion
in 2004, the first full year for which Labor collected wage
data. Factors that may explain such differences in wages include the
presence of a collective bargaining agreement.
In this report we recommend that the Secretary of Labor take steps to
use the data Labor has to better target its oversight activities,
develop a cost-effective strategy for collecting data from council-
monitored states, and conduct regular reviews with feedback for those
states. In its written comments on a draft of this report the
Department of Labor concurred with these recommendations and said it is
taking initial steps to implement them.
Background:
Although apprenticeship programs in the United States are largely
private systems paid for mainly by program sponsors, the
National Apprenticeship Act of 1937 authorizes and directs the
Secretary of Labor to formulate and promote labor standards that
safeguard the welfare of apprentices. The responsibility for
formulating and promoting these standards resides with OATELS. OATELS
had a staff of about 176 full-time equivalents and an annual
appropriation of about $21 million in 2004. Because of
budgetary constraints, OATELS officials do not expect resources to
increase. At the national level, OATELS can register and deregister
apprenticeship programs (i.e., give or take away federal recognition),
issue nationally recognized, portable certificates to individuals who
have completed registered programs, plan appropriate outreach
activities targeted to attract women and minorities, and promote new
apprenticeship programs to meet workforce needs. In addition to this
national role, OATELS directly oversees individual apprenticeship
programs in 23 states. In these states, the director for the state's
apprenticeship system and other program staff are federal employees who
monitor individual apprenticeship programs for quality and their
provision of equal opportunity.
Labor can give authority to states to oversee their own apprenticeship
programs if the state meets certain requirements. Labor has given this
authority to 27 states, the District of Columbia, and three
territories. In these states, which we refer to as council-monitored,
the federal government is not responsible for monitoring individual
apprenticeship programs; instead, the state is. It does so through
state apprenticeship councils. OATELS does, however, conduct two types
of reviews to determine how well the state fulfills its
responsibilities. Quality reviews determine, in part, conformance with
prescribed federal requirements concerning state apprenticeship laws,
state council composition, and program registration, cancellation and
deregistration provisions. Equal Employment Opportunity (EEO) reviews
assess the conformity of state EEO plans, affirmative action
activities, record-keeping procedures, and other activities with
federal EEO regulations. In addition to these reviews, OATELS may also
provide state agencies with federal staff to assist in day-to-day
operations.
The number and type of construction apprenticeship programs are
distributed differently across federally- and council-monitored states.
Council-monitored states not only have more programs, but these
programs are more likely to be jointly sponsored by employers and
unions than sponsored by employers alone. On average, a construction
apprenticeship program in federally-monitored states trains about 17
apprentices and in council-monitored states about 20. Programs vary
greatly around these averages, however, with some having over 400
participants and others only 1 or 2. Figure 1 identifies states where
programs are federally- and council-monitored.
Figure 1: States with Apprenticeship Programs Overseen by Federal
Officials and State Apprenticeship Councils:
[See PDF for image]
[End of figure]
Both the federal and council-monitored states collect data on the
individual programs they oversee. Labor maintains a large database
called the Registered Apprenticeship Information System (RAIS) and
collects information about individual programs, apprentices, and
sponsors for apprenticeships in the 23 states where it has direct
oversight and in 8 council-monitored states that have chosen to report
into this system. The other council-monitored states, 20 in total,
maintain their own data and collect various pieces of information on
apprenticeship systems. Labor does collect aggregate data on
apprentices and programs from these states.
In all states, individuals can enter the construction trades without
completing formal apprenticeship programs, but many construction
workers, particularly those working in highly skilled occupations that
require extensive training, such as the electrical, carpentry, and
plumbing trades, receive their training through registered
apprenticeship programs. To complete their programs, apprentices must
fulfill on-the-job training and classroom instruction requirements
that meet the minimum standards for the trade as recognized by
Labor or the state apprenticeship council. Programs in some trades, for
example, commercial electricity, may take 5 years to complete, but
programs to train laborers may only take a year. Beginning apprentices'
wages generally start at about 40 percent of the wage of someone
certified in a particular trade and rise to about 90 percent of that
wage near completion. Apprentices' contracts with their program
sponsors specify a schedule of wage increases.
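As a rough numeric illustration of such a contract schedule (the
number of wage steps and the journey-level wage used here are
hypothetical, not drawn from any actual program standards):

```python
def wage_schedule(journey_wage, steps=6, start_pct=0.40, end_pct=0.90):
    """Return a hypothetical stepped hourly wage schedule for an
    apprentice, rising in equal increments from start_pct to end_pct
    of the journey-level wage, as described in the report's background
    discussion (about 40 percent at entry, about 90 percent near
    completion)."""
    increment = (end_pct - start_pct) / (steps - 1)
    return [round(journey_wage * (start_pct + i * increment), 2)
            for i in range(steps)]

# For a journey-level wage of $30.00 per hour, the apprentice would
# start at about $12.00 and reach $27.00 in the final wage period.
print(wage_schedule(30.00))  # [12.0, 15.0, 18.0, 21.0, 24.0, 27.0]
```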
Labor's Monitoring of Registered Apprenticeship Programs Is Limited:
Although OATELS is responsible for overseeing thousands of
apprenticeship programs in the states where it has direct oversight, it
reviews few of these programs each year. Also, while its apprenticeship
database collects much information about individual participants and
programs, Labor has not used these data to systematically generate
program performance indicators such as completion rates. As a result,
it lacks information that would allow it to identify poorly performing
programs and adjust its oversight accordingly. Furthermore, despite
many technical upgrades, Labor's database has not provided information
that meets the needs of federal apprenticeship directors or the needs
of other stakeholders.
Few Federal Staff Are Engaged in Monitoring the Programs That Labor
Directly Oversees:
OATELS has reviewed very few of the apprenticeship programs in the
states where it has direct oversight. Federal apprenticeship directors
in these states reported they conducted 379 quality reviews in 2004,
covering only about 4 percent of the programs under their watch. These
reviews are done to determine, for example, whether sponsors have
provided related instruction and on-the-job training hours in
accordance with the standards for the program and whether wages
reflected actual time in the program. The number of reviews conducted
varied across states. On average, 22 quality reviews per state were
conducted, but one director reported conducting as many as 67 reviews
while another reported conducting no reviews at all. In addition,
programs in council-monitored states were almost twice as likely as
programs in federally-monitored states to have been reviewed within 3
years. (See fig. 2.) Several federal officials said over the past
several years they had placed primary emphasis on registering new
programs and recruiting more apprentices, particularly in
nontraditional areas such as childcare and health. In addition, they
told us it was not possible to do more reviews in part because of
limited staff.
Figure 2: Frequency of Quality and Equal Employment Opportunity Reviews
of Apprenticeship Programs in Federally- and Council-monitored States
during Fiscal Year 2004:
[See PDF for image]
[End of figure]
In addition to having fewer reviews, apprenticeships in federally-
monitored states had fewer staff dedicated to monitoring activities
than did council-monitored states. In 2004, each staff person in a
federally monitored state was responsible, on average, for about 2,000
apprentices, according to federal program directors; to put this in
context, case loads of monitors in federally-monitored states were
almost twice as large as those in council-monitored states. In
federally-monitored states, on average there were about 2.5 staff to
monitor programs, less than one-third the average in council-monitored
states. Labor's practice of assigning federal staff to monitor programs
in 18 of the council-monitored states rather than to programs in
federally-monitored states compounded differences in staff resources.
Directors in council-monitored states reported that at least two
federal employees, on average, monitored programs in their
jurisdiction. As important as the number of staff is how they spent
their time. About half of the staff in federally-monitored states
spent 40 percent or more of their time in the field performing
monitoring and oversight and providing related technical assistance,
according to federal program directors, whereas half of the staff in
council-monitored states spent about 70 percent or more of their time
in the field.
While Labor Collects Much Information about Apprenticeship Programs, It
Does Not Systematically Use Data to Focus Its Oversight:
Although Labor collects information to compute completion rates and
track participants who do not complete programs in the time expected,
it does not use these data to focus its oversight efforts on programs
with poor performance. During a site visit in a federally-monitored
state, a monitor showed us how she computed cancellation rates by hand
for programs that she suspected were not adequately training
apprentices, in order to test her hypotheses. In
the absence of performance information, directors and staff in
federally-monitored states reported that a variety of factors dictated
which programs to review. These included program size, newness,
location, date of the last review, the sponsor's cooperativeness, and
the location of staff resources.
In addition to not using program data to target reviews, Labor has not consistently collected and entered into its database information about why apprentices cancel out of programs, although its database was
designed to include such information and having it could help target
reviews. Officials told us that voluntary cancellation or transfers to
another program were at times associated with program quality, while
other nonvoluntary reasons, such as illness or military service, were
not. Currently, the database field for recording the reason for an apprentice's cancellation is optional. We found that no reason was recorded for 60 percent of the cancellations, and the entries for the remaining 40 percent did not effectively capture the reasons for leaving. Of the 18
reasons entered, the most common reasons were "Unknown," "Voluntarily
Quit," "Unsatisfactory Performance," "Discharged/Released," and
"Cancelled with the Occupation," some of which did not provide useful
information to target reviews. Also, other entries were close
duplicates of one another, such as "left for related employment" and
"left for other employment."
Labor also treats as optional the data entries for its equal employment opportunity (EEO) reviews, including the date of the last review, compliance status,
status, status of corrective actions, and other information that would
improve the efficiency of managing reviews. As a result, such data were
available for about 5 percent of programs in Labor's database in fiscal
year 2004. Without this information, it is more difficult to determine
when programs had their last EEO review and to readily identify
programs with known problems.
Labor's Database Does Not Meet the Needs of Apprenticeship Directors
and Other Stakeholders:
Despite many technical upgrades, Labor's database has not provided information that meets the needs of its federal directors or of other stakeholders. While acknowledging that the database has been updated and improved, 22 of the 23 directors of apprenticeship programs and their monitoring staff expressed dissatisfaction with
the present system. One complained of "daily" changes to the data
fields without being informed of "why or when they will change."
Expressing the desire to select and sort data on any field and generate
unique reports in context with all available data, another concluded,
"In short, we need a lot of flexibility with database reports that we
don't have at this time." Many federal apprenticeship directors made
recommendations for improving the database. In general, what state directors wanted most was a system that was stable and user-friendly and that would allow them to produce customized reports to better oversee
the apprenticeship programs in their states. The list below shows those
potential improvements endorsed by more than half of the state
apprenticeship directors:
* Increase the timeliness of notifications to state and regional
offices for changes to RAIS (e.g., provide for more frequent
communication) (22 of 23 surveyed states).
* Simplify instructions and procedures for producing reports (18 of 23
surveyed states).
* Allow production of customized state and regional reports by type of
industry (18 of 23 surveyed states).
* Allow production of customized state and regional reports by sponsor
type (17 of 23) and occupational type (17 of 23 surveyed states).
* Improve the frequency of RAIS training (17 of 23 surveyed states).
* Improve the quality of RAIS training (16 of 23 surveyed states).
* Simplify instructions and procedures for inputting and updating data
(16 of 23 surveyed states).
* Increase available coding options to explain why apprentices leave
the program (14 of 23 surveyed states).
* Allow production of customized state and regional reports by sex of
apprentice and race of apprentice (14 of 23 surveyed states).
OATELS has recently purchased software that enables users to extract
data from Labor's databases in order to produce customized reports.
The software was originally purchased for the Secretary of Labor's use, but Labor Information Technology and OATELS officials said they foresaw its utility for many programs and therefore decided to purchase licenses for apprenticeship field staff. However, OATELS has not taken the steps needed to ensure that field staff will be able to make optimal use of the software. About half the directors in federally-
monitored states did not know the software was available or what it
was. Although the software was demonstrated at a directors' meeting in
2004, several could not recall the demonstration and others were not in
attendance. Moreover, two of the directors lacked basic hardware, such
as a high-speed cable needed to support the software. In fact, one
director told us he was working from his home because his office did not have such basics as a cable hookup for his computer. Even if such obstacles are surmounted, the new system may not meet the staff's data
needs. Two directors who were already attempting to use the software
reported to us that it did not allow them to select information using
factors that would be most useful to them, such as state-level data on
apprenticeship programs. In addition, Labor could not or would not supply us with formal documentation describing its plans to implement the
software or its vision of how the software would be used by its staff.
Labor also reported that, because of budget constraints and the new software's ease of use, it had no plans to provide training. Without such
plans, Labor's commitment to the full implementation and future
financing of the program is questionable.
Labor Has Reviewed Council-Monitored States Infrequently, Provided
Little Feedback, and Not Collected Data That Would Allow for a National
Picture of Apprenticeships:
Labor has infrequently reviewed states to which it has delegated
oversight responsibility. This includes both quality reviews and EEO
reviews to assure that these states are in compliance with federal
rules for overseeing apprenticeship programs and are adhering to equal employment opportunity requirements. Moreover, states that have been reviewed in recent years reported that the reviews had little utility for helping them manage their programs, in part because of the limited feedback they received. In terms of providing information to Congress
and others, Labor does not collect from these states information that
is readily available on apprenticeships by occupation or industry, even
for occupations where shortages of skilled workers are anticipated.
Labor Has Reviewed Council-Monitored States Infrequently in Recent
Years:
Agency records indicate that Labor conducted only three quality and EEO
reviews of council-monitored states in calendar years 2002 and 2003,
and none in 2004, but it has scheduled seven for 2005. State apprenticeship
directors confirmed that reviews are infrequent. Twelve of the 27
directors in council-monitored states reported that OATELS had
conducted reviews of their programs less frequently than once every 3
years and several responded that reviews had not taken place in the
last 9 to 12 years. An additional five directors reported their states
had never been reviewed or that they were unaware if such reviews had
taken place. The remaining 10 reported reviews took place in their
states at least once every 3 years. (See fig. 3.) While neither statute
nor regulation specifies the frequency with which OATELS should conduct
such reviews, they constitute an important mechanism for ensuring that
state laws conform to requirements necessary for Labor's recognition of
a state's registered apprenticeship program.
Figure 3: Frequency of Federal Reviews of Council-monitored States:
[See PDF for image]
[End of figure]
Officials in Most Council-Monitored States Reported Reviews Were Not
Very Useful, in Part Because of Limited Feedback:
State directors reported that the quality reviews and the EEO reviews
had limited utility for helping them manage their programs. For
example, only about half of them reported that the quality reviews were
at least moderately useful for helping them determine their compliance
with federal regulation. (See fig. 4.) Results were similar for the EEO
reviews. (See fig. 5.) For example, slightly less than half of state
directors reported that EEO reviews were at least moderately useful in
helping them determine their compliance with federal EEO regulations.
Some directors said reviews would be more useful if they focused on program-related activities in the state. Eight of the
directors suggested that Labor focus more on state and local conditions
and the performance of apprenticeship programs instead of focusing only
on whether council-monitored states comply with federal standards. For
example, one director reported the feedback he received on EEO
activities was unrelated to the racial composition of the state. Also,
some suggested reviews could provide opportunities for federal
officials to provide assistance and share knowledge about strategies
that other states have found useful.
Figure 4: Council-monitored States' Rankings of the Usefulness of
Federal Quality Reviews:
[See PDF for image]
[End of figure]
Figure 5: Council-monitored States' Rankings of the Usefulness of EEO
Reviews:
[See PDF for image]
[End of figure]
While directors had a number of ideas for improving the usefulness of
quality and EEO reviews, many noted that Labor provided limited or no
feedback as part of the review process. For example, one said his state
agency received a brief letter from Labor stating only that the state
was in compliance with federal regulations. Two others said their
agencies received no documentation that a review had in fact been
conducted, even though in one of these cases the state had made
requests for the review findings. Officials in one state said feedback from their last review was positive and indicated no problems, but a few years later, OATELS took steps to derecognize their state apprenticeship council without prior notice or a subsequent review.
Labor Has Not Collected Data That Would Allow for a National Picture of
Apprenticeships:
Labor collects aggregate counts of apprentices for most council-
monitored states and has not developed strategies to collect more
detailed information that would allow for a description of apprenticeships at the national level, even for occupations where shortages of skilled workers are anticipated. Of the 28 council-monitored states,
20 have their own data system and do not report data to Labor's
apprenticeship database. These 20 states represent about 68 percent of
the nation's apprentices. Labor and council-monitored states have
differing opinions about why there are separate data systems. Labor
officials told us that, as they were developing their database, they
conducted outreach to council-monitored states. Officials from these
states said otherwise. They also said that participating in Labor's
database would be an onerous process or that Labor's system did not
meet their state's information needs and, therefore, they had invested
the time and money to develop their own systems. Because many of these
systems are not compatible with Labor's, the agency collects only total
counts of apprentices and programs from these 20 states, which it uses
for its official reports.
While incompatible data systems may suggest that it would be difficult
or costly to obtain more than aggregate counts, in collecting data for
this report, we found many of the council-monitored states--including
10 with large numbers of apprentices--were both willing and capable of
providing us data on apprentices by industry and by occupation as well
as information on completion rates, completion times, and some wage
data for occupations that we had specified. In fact, one state reported
that it had designed its apprenticeship database to collect all
information required by Labor's database and had offered to report
these data to Labor electronically--but Labor had not taken steps to
accept this offer. Nevertheless, as one director pointed out, having a
unified data picture is central to OATELS' oversight as well as to its promotional activities, and, as many agree, such a system would promote the health of the registered apprenticeship system.
Construction Apprenticeship Completion Rates and Wages Vary by Program
Sponsor:
Construction apprentices in programs sponsored jointly by employers and
unions (joint programs) generally completed at a higher rate and in
greater numbers than those enrolled in programs sponsored by employers
alone (non-joint programs). More importantly, despite growth in
construction program enrollment, there has been a decline over time in
completion rates for both types of programs. Completion rates declined
from 59 percent for apprentices enrolling in 1994 to 37 percent for
apprentices enrolling in 1998. It is difficult to know what factors
underlie this trend because, as noted earlier, Labor does not
systematically record information about why apprentices leave programs.
Apprentices who completed programs within 6 years tended to finish
earlier than expected. In addition, wages for joint apprentices were
generally higher at the start and upon completion of their programs.
Data received from 10 council-monitored states that do not report to
Labor's database generally mirrored these findings.
Nearly Half of Apprentices in Joint Programs Completed Their
Apprenticeships Compared with about a Third in Non-joint Programs:
Completion rates were generally higher for apprentices in joint
programs than for those in non-joint programs. Of the apprentices who
entered programs between 1994 and 1998, about 47 percent of apprentices
in joint programs and 30 percent of apprentices in non-joint programs
completed their apprenticeships by 2004. For five consecutive classes (1994-1998) of apprentices in Labor's database, completion rates calculated after 6 years were higher for joint programs, as shown in figure 6.[Footnote 2] The data we received from 10 additional states
that do not report into Labor's database showed similar trends, with
joint apprentices having higher completion rates. For complete data
that we received from these 10 states, see appendix II.
Figure 6: Completion Rates after 6 Years for Apprentices Entering
Construction Programs in FY 1994 through 1998:
[See PDF for image]
[End of figure]
For the programs in Labor's database, this higher completion rate for
joint apprenticeship programs was true for all but 1 of the 15 largest individual trades, which collectively account for 93 percent of active apprentices in construction. (See fig. 7.) It should be noted that among the trades themselves, there were substantial variations in completion rates, often due to the nature of the work environment and other constraints, according to federal and state officials. For example,
roofing programs, which have low completion rates, face unpredictable
weather and seasonal work flows.
Figure 7: Completion Rates after 6 Years by Occupation for Apprentices
Who Began Joint and Non-joint Construction Programs between FY 1994 and
1998:
[See PDF for image]
[End of figure]
Officials said that joint programs have higher completion rates because
they are more established and better funded. For some joint programs,
these additional resources stem in part from union members paying a
small portion of their paychecks into a general training fund that is
used to help defray some of the training costs for apprentices. In
addition, they suggested that, because unions tend to have a network of
affiliates spread across an area, they are more likely to find work for
participating apprentices in other areas when work is slow in a
particular area. Local union chapters often have portability agreements with one another, which help to facilitate such transfers.
Officials also said these programs provide mentoring and other social
supports.
While Enrollments Increased, Completion Rates Declined in General for
the Period Examined:
Enrollments in construction apprenticeship programs more than doubled
from 1994 to 1998, increasing from 20,670 construction apprentices to
47,487.[Footnote 3] (See fig. 8.) Meanwhile, completion rates declined
from 59 percent for the class of 1994 to 37 percent for the class of
1998.[Footnote 4] This decline held for both joint
and non-joint programs. (See fig. 9.) Completion rates dropped from nearly 63 percent to 42 percent for joint apprentices and from 46 percent to 26 percent for non-joint apprentices. This trend was
consistent across different occupations as well, with most experiencing
declines.
Figure 8: Enrollment for Apprentices in Joint and Non-joint
Construction Programs, FY 1994 through 1998:
[See PDF for image]
[End of figure]
Figure 9: Trends in Completion Rates after 6 Years for Apprentices in
Joint and Non-joint Construction Programs Entering Programs in FY 1994
through 1998:
[See PDF for image]
[End of figure]
Because Labor does not systematically record the explanations that
apprentices offer for canceling out of programs, it is difficult to
determine what may lie behind this downward trend. Labor suggested that
some apprentices may choose to acquire just enough training to make
them marketable in the construction industry in lieu of completing a
program and achieving journey status. While we cannot confirm this
hypothesis, we did find that apprentices who cancelled typically did so after receiving over a year of training. Joint apprentices
cancelled after 92 weeks on average and non-joint apprentices cancelled
after 85 weeks on average. Other reasons offered included a decline in
work ethic, the emphasis placed by high schools on preparing students
for college and the corresponding under-emphasis on preparation for the
trades, and a lack of work in the construction industry. We cannot verify the extent to which unemployment played a role in influencing outcomes, but, according to the Bureau of Labor Statistics, the unemployment rate for construction increased overall from 6.2 percent to 8.4 percent between 2000 and 2004, despite the predictions of future
worker shortages in construction.
Apprentices in Both Joint and Non-joint Construction Programs Tended to
Complete Their Programs Early:
Those apprentices who completed construction programs within 6 years tended to finish earlier than expected, with apprentices in non-joint programs finishing sooner, relative to expectations, than their joint counterparts. On average, joint apprentices completed their programs 12 weeks early and non-joint apprentices 35 weeks early. This
trend was similar across the largest trades in terms of enrollment as
shown in table 1 below. This may be due to the willingness of program
sponsors to grant apprentices credit for previous work or classroom
experience that was directly related to their apprenticeship
requirements.
Table 1: Differences between Actual and Expected Completion Time for
Apprentices in Joint and Non-joint Construction Programs in Weeks:
Electrician;
Joint apprentices: Actual weeks to complete: 237;
Joint apprentices: Expected weeks to complete: 240;
Joint apprentices: Difference: 3 weeks early;
Non-joint apprentices: Actual weeks to complete: 179;
Non-joint apprentices: Expected weeks to complete: 211;
Non-joint apprentices: Difference: 32 weeks early.
Carpenter;
Joint apprentices: Actual weeks to complete: 188;
Joint apprentices: Expected weeks to complete: 210;
Joint apprentices: Difference: 22 weeks early;
Non-joint apprentices: Actual weeks to complete: 184;
Non-joint apprentices: Expected weeks to complete: 208;
Non-joint apprentices: Difference: 24 weeks early.
Plumber;
Joint apprentices: Actual weeks to complete: 238;
Joint apprentices: Expected weeks to complete: 252;
Joint apprentices: Difference: 14 weeks early;
Non-joint apprentices: Actual weeks to complete: 171;
Non-joint apprentices: Expected weeks to complete: 220;
Non-joint apprentices: Difference: 49 weeks early.
Source: GAO analysis of RAIS database.
[End of table]
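The "difference" entries in table 1 are simply expected weeks minus actual weeks, with a positive result meaning apprentices finished early; the arithmetic can be checked directly:

```python
# Actual and expected weeks to complete, copied from table 1:
# trade -> (joint actual, joint expected, non-joint actual, non-joint expected)
table1 = {
    "Electrician": (237, 240, 179, 211),
    "Carpenter": (188, 210, 184, 208),
    "Plumber": (238, 252, 171, 220),
}

for trade, (ja, je, na, ne) in table1.items():
    # Expected minus actual: positive values mean early completion.
    print(f"{trade}: joint {je - ja} weeks early, non-joint {ne - na} weeks early")
```

This reproduces the table's differences of 3, 22, and 14 weeks early for joint apprentices and 32, 24, and 49 weeks early for non-joint apprentices.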
Starting Wages and Wages upon Completion in Joint Construction Programs
Were Higher on Average than Those for Apprentices in Non-joint
Construction Programs:
Apprentices in joint construction programs were paid higher wages at
the start of their apprenticeships and were scheduled to receive higher
wages upon completion of their programs. In 2004, the first year in
which Labor collected information on starting wages, apprentices in
joint programs earned $12.28 per hour at the start of their apprenticeships, while non-joint apprentices earned $9.90 per hour. These differences
in wages were more pronounced at the journey level, that is, upon
completion, with apprentices in joint programs scheduled to earn
journey-level wages of $24.19 as compared with $17.85 for those in non-
joint programs. As shown in figure 10, joint apprentices generally
earned higher wages across the 15 trades with the largest numbers of
construction apprentices. There were three trades--carpenter,
structural steel worker, and cement mason--for which starting wages
were higher for non-joint apprentices. For journey-level wages there
was only one trade for which wages were higher for non-joint
apprentices--that of millwright. Officials we spoke with commonly
attributed this distinction in wages to the bargaining process
associated with joint programs. Data from the 10 additional states
(outside Labor's database) whose data we examined showed a similar
pattern--with joint apprentices earning higher wages. (See app. II.)
Figure 10: Average Wages for Apprentices in Joint and Non-Joint
Construction Programs in FY 2004:
[See PDF for image]
[End of figure]
Conclusions:
As a small office with finite resources tasked with an important mission, Labor's Apprenticeship Office must leverage the tools at its disposal to carry out its oversight, all the more so during a period of tight budgets. Labor's responsibility for assuring
that registered apprenticeship programs meet appropriate standards is
no small charge, given the thousands of programs in operation today. In
terms of the programs it directly monitors, Labor has not made optimal
use of the information it collects to target resources. The failure to
do so limits the agency's ability to target its oversight activities to
address and remedy areas where there may be significant need,
particularly the construction trades, where completion rates are declining. Underscoring this point is the fact that apprenticeship
directors in federally-monitored states cannot get easy access to the
data in the form of customized reports. Irrespective of distinctions
between apprentice outcomes for joint and non-joint programs, without
better use of its data, Labor is still not in a position to assess
programs on their individual merits. Given the relatively limited
number of staff available for field visits, by not using the program
data it has, Labor misses opportunities to more efficiently use its
staff.
With regard to states with council-monitored apprenticeship programs,
Labor's oversight practices do not necessarily ensure that those
states' activities comply with federal standards for oversight because
the Apprenticeship Office has only sporadically assessed their
operations. Moreover, to the extent that the federal office does not
provide useful feedback to the states when it does conduct reviews,
states may lose opportunities to improve programs under their
jurisdiction. Finally, because Labor does not seek much information
beyond aggregate numbers from a majority of council-monitored states,
policymakers lose an opportunity to gain perspective and insight for
aligning workforce training with national needs, specifically for key
occupations within construction that are likely to be faced with
shortages of skilled workers in the near future.
Recommendations:
We recommend that the Secretary of Labor take steps to (1) better
utilize information in Labor's database, such as indicators of program
performance, for management oversight, particularly for apprenticeship
programs in occupations with expected future labor shortages; (2)
develop a cost-effective strategy for collecting data from council-
monitored states; (3) conduct Labor's reviews of apprenticeship
activities in states that regulate their own programs on a regular
basis to ensure that state activities are in accord with Labor's
requirements for recognition of apprenticeship programs; and (4) offer
substantive feedback to states from its reviews.
Agency Comments:
We provided a draft of this report to the Department of Labor for
review and comment. Labor provided written comments on the draft report
that are reproduced in appendix V. Labor concurred with our
recommendations and has already taken steps to obtain data on
apprenticeships from some council-monitored states and to regularly
review activities in these states. Further, Labor stated it plans to
use the data to better target the performance of the apprenticeship
programs that OATELS directly registers and oversees, and to provide
improved feedback to states that register and oversee their own
apprenticeship programs.
Unless you publicly announce its contents earlier, we plan no further
distribution of this report until 14 days after the date of this
letter. At that time, we will send copies of this report to the
Secretary of Labor and other interested parties. We will also make
copies available to others upon request. In addition, the report will
be available at no charge on GAO's Web site at http://www.gao.gov.
Please contact me on 512-7215 or nilsens@gao.gov if you or your staff
have any questions about this report. Contact points for our Offices of
Congressional Relations and Public Affairs may be found on the last
page of this report. GAO staff who made major contributions to this
report are listed in appendix VI.
Signed by:
Sigurd R. Nilsen:
Director, Education, Workforce, and Income Security Issues:
[End of section]
Appendix I: Scope and Methodology:
Our objectives were to determine (1) the extent to which the U.S.
Department of Labor monitors the operations and outcomes of registered
apprenticeship programs in the states where it has direct oversight,
(2) its oversight activities for council-monitored states, and (3)
outcomes for construction apprentices in programs sponsored jointly by
employers and unions in relation to those sponsored by employers alone.
To carry out these objectives, we surveyed OATELS officials in charge
of apprenticeship programs in 23 federally-monitored states and state
apprenticeship directors in 28 states, including the District of
Columbia, where state apprenticeship councils oversee programs. We used
two surveys--one for federally-monitored states and one for council-
monitored states--to obtain national information on OATELS' monitoring
and oversight activities. We focused only on apprentices in the
civilian sector of the economy and did not include military or prison-
based programs. We asked questions designed to determine the amount of
resources devoted to oversight, the frequency of oversight activities,
and the outcomes from these activities. The surveys were conducted
using self-administered electronic questionnaires posted on the World
Wide Web. We pretested our surveys with a total of five federally-
monitored and council-monitored state officials to determine if the
surveys were understandable and if the information was feasible to
collect. We then refined the questionnaire as appropriate. We sent e-
mail notifications to all federally-monitored and council-monitored
state officials on January 5, 2005. We then sent each potential
respondent a unique password and username by e-mail on January 13,
2005, to ensure that only members of the target population could
participate in the appropriate survey. To encourage respondents to
complete the surveys, we sent e-mail messages to prompt each
nonrespondent approximately 1½ weeks after the initial e-mail message
and a final e-mail reminder on February 7, 2005. We also called
nonrespondents to encourage them to complete the survey. We closed the
surveys on March 18, 2005. We received responses from all 23 federally-monitored state officials and from 27 of the 28 council-monitored state officials, including the District of Columbia. (See table 2.) Copies of the surveys are provided
in appendices III and IV.
Table 2: Survey Numbers and Response Rates:
Respondents: Federally-monitored states;
Surveys conducted: 23;
Surveys received: 23.
Respondents: Council-monitored states and the District of Columbia;
Surveys conducted: 28;
Surveys received: 27.
Source: GAO.
[End of table]
To examine the outcomes for apprentices in the construction industry,
we analyzed data from Labor's RAIS database. In calculating completion
rates, we constructed five cohorts based on when apprentices enrolled in their programs; we had cohorts for fiscal years 1994, 1995, 1996, 1997, and
1998. We then considered the status of these cohorts 6 years after they
enrolled to determine if they had completed, cancelled, or remained in
training. Our analysis of wage data focused on data collected in fiscal year 2004, the first full year in which Labor collected such information. We assessed the reliability of the RAIS database by
reviewing relevant information on the database, interviewing relevant
OATELS officials, and conducting our own testing of the database. This
testing included examining the completeness of the data, performing
data reliability checks, and assessing the internal controls of the
data. Based on this information and our analysis, we determined that
these data were sufficiently reliable for the purposes of our report.
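The cohort construction described above can be sketched as follows; the field names are illustrative assumptions, not the actual RAIS layout:

```python
from collections import Counter

def cohort_outcomes(apprentices, cohort_fy, window_years=6):
    """Share of a fiscal-year enrollment cohort that completed,
    cancelled, or remained in training within a fixed window.

    Each apprentice is a dict with hypothetical keys 'enroll_fy',
    'outcome_fy', and 'outcome' ('completed' or 'cancelled'); an
    apprentice with no outcome inside the window is counted as
    still in training.
    """
    counts = Counter()
    for a in apprentices:
        if a["enroll_fy"] != cohort_fy:
            continue  # not part of this enrollment cohort
        if a["outcome_fy"] is not None and a["outcome_fy"] <= cohort_fy + window_years:
            counts[a["outcome"]] += 1
        else:
            counts["in training"] += 1
    total = sum(counts.values())
    return {status: n / total for status, n in counts.items()}
```

Running this once per cohort, for fiscal years 1994 through 1998, would yield 6-year completion rates of the kind reported in this analysis.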
Because Labor's RAIS database does not contain data from all states, we
supplemented these data with data from 10 council-monitored states that
do not report to this database. We selected these states based on the
number of apprentices they had and whether their data were in an
electronic format that would facilitate extracting and sending these
data to us. We submitted to these states a data request asking for selected information on enrollment, completion, and wages for the 10 largest apprenticeship occupations and received data from all of them. We determined that these data were reliable for our purposes.
We did not combine these data with those from RAIS; we used them as a
means of comparison.
To learn more about the oversight of apprenticeship programs and their
outcomes, we conducted site visits to four states--New York,
California, Texas, and Washington. These states represented both
federal and council-monitored states and had large numbers (from a high
of about 52,000 to a low of 6,500) of construction apprentices. On
these site visits, we interviewed relevant federal and state officials
along with joint and non-joint program sponsors. We also toured
facilities in two states where certain apprentices are trained.
Throughout the engagement, we interviewed relevant Labor officials and experts who have researched apprenticeship programs and reviewed
relevant past reports and evaluations of these programs. We conducted
our review from August 2004 through July 2005 in accordance with
generally accepted government auditing standards.
[End of section]
Appendix II: Completion Rates, Time Taken to Complete, and Wages for
Construction Apprentices in Council-Monitored States:
Table 3: Percentages of Apprentices Completing Joint and Non-joint
Construction Programs as Reported by Selected Council-monitored States
for Fiscal Years 1997-2004:
Electrician;
California[A]: Joint: 48;
California[A]: Non-joint: 32;
Kentucky[B]: Joint: 60;
Kentucky[B]: Non-joint: 38;
Maryland[C]: Joint: 52;
Maryland[C]: Non-joint: 35;
Massachusetts[D]: Joint: 63;
Massachusetts[D]: Non-joint: 38;
Minnesota[D]: Joint: 77;
Minnesota[D]: Non-Joint: 25;
New York: Joint: 68;
New York: Non-joint: 12;
Oregon[E]: Joint: 65;
Oregon[E]: Non-joint: --;
Virginia[F]: Joint: 42;
Virginia[F]: Non-joint: 20;
Washington[G]: Joint: 62;
Washington[G]: Non-joint: 38;
Wisconsin[H]: Joint: 86;
Wisconsin[H]: Non-joint: 90.
Carpenter;
California[A]: Joint: 22;
California[A]: Non-joint: 12;
Kentucky[B]: Joint: 20;
Kentucky[B]: Non-joint: 0;
Maryland[C]: Joint: 26;
Maryland[C]: Non-joint: 27;
Massachusetts[D]: Joint: 48;
Massachusetts[D]: Non-joint: 0;
Minnesota[D]: Joint: 20;
Minnesota[D]: Non-joint: 0;
New York: Joint: 36;
New York: Non-joint: 28;
Oregon[E]: Joint: 20;
Oregon[E]: Non-joint: --;
Virginia[F]: Joint: 47;
Virginia[F]: Non-joint: 16;
Washington[G]: Joint: 32;
Washington[G]: Non-joint: 21;
Wisconsin[H]: Joint: 69;
Wisconsin[H]: Non-joint: 55.
Plumber;
California[A]: Joint: 46;
California[A]: Non-joint: 25;
Kentucky[B]: Joint: 65;
Kentucky[B]: Non-joint: 67;
Maryland[C]: Joint: 45;
Maryland[C]: Non-joint: 10;
Massachusetts[D]: Joint: 77;
Massachusetts[D]: Non-joint: 25;
Minnesota[D]: Joint: 59;
Minnesota[D]: Non-joint: 20;
New York: Joint: 53;
New York: Non-joint: 15;
Oregon[E]: Joint: 49;
Oregon[E]: Non-joint: --;
Virginia[F]: Joint: 67;
Virginia[F]: Non-joint: 25;
Washington[G]: Joint: 94;
Washington[G]: Non-joint: 22;
Wisconsin[H]: Joint: 75;
Wisconsin[H]: Non-joint: 56.
Pipe fitter;
California[A]: Joint: 43;
California[A]: Non-joint: 20;
Kentucky[B]: Joint: 89;
Kentucky[B]: Non-joint: 0;
Maryland[C]: Joint: 66;
Maryland[C]: Non-joint: 0;
Massachusetts[D]: Joint: --;
Massachusetts[D]: Non-joint: --;
Minnesota[D]: Joint: --;
Minnesota[D]: Non-joint: --;
New York: Joint: 90;
New York: Non-joint: 13;
Oregon[E]: Joint: 46;
Oregon[E]: Non-joint: --;
Virginia[F]: Joint: 58;
Virginia[F]: Non-joint: 22;
Washington[G]: Joint: 70;
Washington[G]: Non-joint: --;
Wisconsin[H]: Joint: 82;
Wisconsin[H]: Non-joint: 33.
Sheet metal worker;
California[A]: Joint: 55;
California[A]: Non-joint: 19;
Kentucky[B]: Joint: 58;
Kentucky[B]: Non-joint: 0;
Maryland[C]: Joint: 50;
Maryland[C]: Non-joint: 56;
Massachusetts[D]: Joint: --;
Massachusetts[D]: Non-joint: --;
Minnesota[D]: Joint: --;
Minnesota[D]: Non-joint: --;
New York: Joint: 70;
New York: Non-joint: 0;
Oregon[E]: Joint: 41;
Oregon[E]: Non-joint: --;
Virginia[F]: Joint: 10;
Virginia[F]: Non-joint: 27;
Washington[G]: Joint: 37;
Washington[G]: Non-joint: 0;
Wisconsin[H]: Joint: 63;
Wisconsin[H]: Non-joint: 45.
Structural steel worker;
California[A]: Joint: 35;
California[A]: Non-joint: --;
Kentucky[B]: Joint: 26;
Kentucky[B]: Non-joint: --;
Maryland[C]: Joint: 33;
Maryland[C]: Non-joint: --;
Massachusetts[D]: Joint: --;
Massachusetts[D]: Non-joint: --;
Minnesota[D]: Joint: --;
Minnesota[D]: Non-joint: --;
New York: Joint: 61;
New York: Non-joint: 0;
Oregon[E]: Joint: 41;
Oregon[E]: Non-joint: --;
Virginia[F]: Joint: 50;
Virginia[F]: Non-joint: 7;
Washington[G]: Joint: 41;
Washington[G]: Non-joint: --;
Wisconsin[H]: Joint: --;
Wisconsin[H]: Non-joint: --.
Bricklayer;
California[A]: Joint: 28;
California[A]: Non-joint: 0;
Kentucky[B]: Joint: --;
Kentucky[B]: Non-joint: 56;
Maryland[C]: Joint: 37;
Maryland[C]: Non-joint: 8;
Massachusetts[D]: Joint: --;
Massachusetts[D]: Non-joint: --;
Minnesota[D]: Joint: --;
Minnesota[D]: Non-joint: --;
New York: Joint: 37;
New York: Non-joint: 11;
Oregon[E]: Joint: 44;
Oregon[E]: Non-joint: --;
Virginia[F]: Joint: 16;
Virginia[F]: Non-joint: 25;
Washington[G]: Joint: 33;
Washington[G]: Non-joint: --;
Wisconsin[H]: Joint: 51;
Wisconsin[H]: Non-joint: 67.
Roofer;
California[A]: Joint: 7;
California[A]: Non-joint: 8;
Kentucky[B]: Joint: 35;
Kentucky[B]: Non-joint: --;
Maryland[C]: Joint: 0;
Maryland[C]: Non-joint: --;
Massachusetts[D]: Joint: --;
Massachusetts[D]: Non-joint: --;
Minnesota[D]: Joint: --;
Minnesota[D]: Non-joint: --;
New York: Joint: 21;
New York: Non-joint: 8;
Oregon[E]: Joint: 5;
Oregon[E]: Non-joint: --;
Virginia[F]: Joint: --;
Virginia[F]: Non-joint: --;
Washington[G]: Joint: 6;
Washington[G]: Non-joint: --;
Wisconsin[H]: Joint: 43;
Wisconsin[H]: Non-joint: 0.
Painter;
California[A]: Joint: 27;
California[A]: Non-joint: 15;
Kentucky[B]: Joint: 33;
Kentucky[B]: Non-joint: --;
Maryland[C]: Joint: 18;
Maryland[C]: Non-joint: --;
Massachusetts[D]: Joint: --;
Massachusetts[D]: Non-joint: --;
Minnesota[D]: Joint: --;
Minnesota[D]: Non-joint: --;
New York: Joint: 25;
New York: Non-joint: 0;
Oregon[E]: Joint: 25;
Oregon[E]: Non-joint: --;
Virginia[F]: Joint: --;
Virginia[F]: Non-joint: 0;
Washington[G]: Joint: 11;
Washington[G]: Non-joint: --;
Wisconsin[H]: Joint: 47;
Wisconsin[H]: Non-joint: 50.
Operating engineer;
California[A]: Joint: 53;
California[A]: Non-joint: 0;
Kentucky[B]: Joint: 50;
Kentucky[B]: Non-joint: --;
Maryland[C]: Joint: 47;
Maryland[C]: Non-joint: --;
Massachusetts[D]: Joint: --;
Massachusetts[D]: Non-joint: --;
Minnesota[D]: Joint: --;
Minnesota[D]: Non-joint: --;
New York: Joint: 65;
New York: Non-joint: 0;
Oregon[E]: Joint: 29;
Oregon[E]: Non-joint: --;
Virginia[F]: Joint: 60;
Virginia[F]: Non-joint: --;
Washington[G]: Joint: 52;
Washington[G]: Non-joint: 7;
Wisconsin[H]: Joint: 81;
Wisconsin[H]: Non-joint: --.
Source: Data were provided by selected council-monitored states.
Note: Data include apprentices entering programs from October 1, 1997,
through September 30, 1998, and completing before October 1, 2004.
[A] California reported no structural steel worker non-joint programs.
[B] Kentucky reported that no apprentices entered bricklayer joint
programs or carpenter, structural steel worker, roofer, painter, and
operating engineer non-joint programs from October 1, 1997, through
September 30, 1998.
[C] Maryland reported that no apprentices entered structural steel
worker, roofer, painter, and operating engineer non-joint programs from
October 1, 1997, through September 30, 1998.
[D] Massachusetts and Minnesota reported data for electrician,
carpenter, and plumber programs only. We told state directors they
could do this in order to save resources and because these three fields
represent over half of all apprentices in the construction trades.
[E] Oregon reported that no non-joint apprenticeship programs are
registered in the state.
[F] Virginia reported that no apprentices entered roofer and painter
joint programs, and roofer and operating engineer non-joint programs
from October 1, 1997, through September 30, 1998.
[G] Washington reported no pipe fitter, structural steel worker,
bricklayer, roofer, and painter non-joint programs.
[H] Wisconsin reported no structural steel worker joint or non-joint
programs and no operating engineer non-joint programs.
[End of table]
Table 4: Average Number of Weeks Spent to Complete Joint and Non-joint
Construction Apprenticeship Programs as Reported by Selected Council-
monitored States:
Electrician;
California[A]: Joint: 225;
California[A]: Non-joint: 218;
Kentucky[B]: Joint: 253;
Kentucky[B]: Non-joint: 177;
Maryland[C]: Joint: --;
Maryland[C]: Non-joint: --;
Massachusetts[C]: Joint: --;
Massachusetts[C]: Non-joint: --;
Minnesota[D]: Joint: 208;
Minnesota[D]: Non-joint: 60;
New York[E]: Joint: 290;
New York[E]: Non-joint: 219;
Oregon[F]: Joint: 205;
Oregon[F]: Non-joint: --;
Virginia[G]: Joint: 240;
Virginia[G]: Non-joint: 166;
Washington[H]: Joint: 233;
Washington[H]: Non-joint: 209;
Wisconsin[I]: Joint: 256;
Wisconsin[I]: Non-joint: 264.
Carpenter;
California[A]: Joint: 188;
California[A]: Non-joint: 140;
Kentucky[B]: Joint: 219;
Kentucky[B]: Non-joint: --;
Maryland[C]: Joint: --;
Maryland[C]: Non-joint: --;
Massachusetts[C]: Joint: --;
Massachusetts[C]: Non-joint: --;
Minnesota[D]: Joint: 191;
Minnesota[D]: Non-joint: --;
New York[E]: Joint: 165;
New York[E]: Non-joint: 213;
Oregon[F]: Joint: 176;
Oregon[F]: Non-joint: --;
Virginia[G]: Joint: 169;
Virginia[G]: Non-joint: 234;
Washington[H]: Joint: 201;
Washington[H]: Non-joint: 184;
Wisconsin[I]: Joint: 207;
Wisconsin[I]: Non-joint: 204.
Plumber;
California[A]: Joint: 232;
California[A]: Non-joint: 203;
Kentucky[B]: Joint: 247;
Kentucky[B]: Non-joint: 151;
Maryland[C]: Joint: --;
Maryland[C]: Non-joint: --;
Massachusetts[C]: Joint: --;
Massachusetts[C]: Non-joint: --;
Minnesota[D]: Joint: 213;
Minnesota[D]: Non-joint: 85;
New York[E]: Joint: 262;
New York[E]: Non-joint: 247;
Oregon[F]: Joint: 211;
Oregon[F]: Non-joint: --;
Virginia[G]: Joint: 254;
Virginia[G]: Non-joint: 168;
Washington[H]: Joint: 234;
Washington[H]: Non-joint: 161;
Wisconsin[I]: Joint: 274;
Wisconsin[I]: Non-joint: 280.
Pipe fitter;
California[A]: Joint: 231;
California[A]: Non-joint: 191;
Kentucky[B]: Joint: 234;
Kentucky[B]: Non-joint: --;
Maryland[C]: Joint: --;
Maryland[C]: Non-joint: --;
Massachusetts[C]: Joint: --;
Massachusetts[C]: Non-joint: --;
Minnesota[D]: Joint: --;
Minnesota[D]: Non-joint: --;
New York[E]: Joint: 209;
New York[E]: Non-joint: --;
Oregon[F]: Joint: 198;
Oregon[F]: Non-joint: --;
Virginia[G]: Joint: 214;
Virginia[G]: Non-joint: 201;
Washington[H]: Joint: 247;
Washington[H]: Non-joint: --;
Wisconsin[I]: Joint: 259;
Wisconsin[I]: Non-joint: 216.
Sheet metal worker;
California[A]: Joint: 224;
California[A]: Non-joint: 217;
Kentucky[B]: Joint: 226;
Kentucky[B]: Non-joint: --;
Maryland[C]: Joint: --;
Maryland[C]: Non-joint: --;
Massachusetts[C]: Joint: --;
Massachusetts[C]: Non-joint: --;
Minnesota[D]: Joint: --;
Minnesota[D]: Non-joint: --;
New York[E]: Joint: 217;
New York[E]: Non-joint: 52;
Oregon[F]: Joint: 217;
Oregon[F]: Non-joint: --;
Virginia[G]: Joint: 214;
Virginia[G]: Non-joint: 104;
Washington[H]: Joint: 219;
Washington[H]: Non-joint: --;
Wisconsin[I]: Joint: 264;
Wisconsin[I]: Non-joint: 244.
Structural steel worker;
California[A]: Joint: 167;
California[A]: Non-joint: --;
Kentucky[B]: Joint: 156;
Kentucky[B]: Non-joint: --;
Maryland[C]: Joint: --;
Maryland[C]: Non-joint: --;
Massachusetts[C]: Joint: --;
Massachusetts[C]: Non-joint: --;
Minnesota[D]: Joint: --;
Minnesota[D]: Non-joint: --;
New York[E]: Joint: 162;
New York[E]: Non-joint: --;
Oregon[F]: Joint: 188;
Oregon[F]: Non-joint: --;
Virginia[G]: Joint: 154;
Virginia[G]: Non-joint: 196;
Washington[H]: Joint: 149;
Washington[H]: Non-joint: --;
Wisconsin[I]: Joint: --;
Wisconsin[I]: Non-joint: --.
Bricklayer;
California[A]: Joint: 140;
California[A]: Non-joint: --;
Kentucky[B]: Joint: --;
Kentucky[B]: Non-joint: 149;
Maryland[C]: Joint: --;
Maryland[C]: Non-joint: --;
Massachusetts[C]: Joint: --;
Massachusetts[C]: Non-joint: --;
Minnesota[D]: Joint: --;
Minnesota[D]: Non-joint: --;
New York[E]: Joint: 155;
New York[E]: Non-joint: 174;
Oregon[F]: Joint: 171;
Oregon[F]: Non-joint: --;
Virginia[G]: Joint: 159;
Virginia[G]: Non-joint: 215;
Washington[H]: Joint: 161;
Washington[H]: Non-joint: --;
Wisconsin[I]: Joint: 173;
Wisconsin[I]: Non-joint: 139.
Roofer;
California[A]: Joint: 192;
California[A]: Non-joint: 188;
Kentucky[B]: Joint: 184;
Kentucky[B]: Non-joint: --;
Maryland[C]: Joint: --;
Maryland[C]: Non-joint: --;
Massachusetts[C]: Joint: --;
Massachusetts[C]: Non-joint: --;
Minnesota[D]: Joint: --;
Minnesota[D]: Non-joint: --;
New York[E]: Joint: 197;
New York[E]: Non-joint: 174;
Oregon[F]: Joint: 146;
Oregon[F]: Non-joint: --;
Virginia[G]: Joint: --;
Virginia[G]: Non-joint: --;
Washington[H]: Joint: 121;
Washington[H]: Non-joint: --;
Wisconsin[I]: Joint: 165;
Wisconsin[I]: Non-joint: --.
Painter;
California[A]: Joint: 152;
California[A]: Non-joint: 119;
Kentucky[B]: Joint: 234;
Kentucky[B]: Non-joint: --;
Maryland[C]: Joint: --;
Maryland[C]: Non-joint: --;
Massachusetts[C]: Joint: --;
Massachusetts[C]: Non-joint: --;
Minnesota[D]: Joint: --;
Minnesota[D]: Non-joint: --;
New York[E]: Joint: 166;
New York[E]: Non-joint: --;
Oregon[F]: Joint: 164;
Oregon[F]: Non-joint: --;
Virginia[G]: Joint: --;
Virginia[G]: Non-joint: --;
Washington[H]: Joint: 115;
Washington[H]: Non-joint: --;
Wisconsin[I]: Joint: 208;
Wisconsin[I]: Non-joint: 157.
Operating engineer;
California[A]: Joint: 183;
California[A]: Non-joint: --;
Kentucky[B]: Joint: 150;
Kentucky[B]: Non-joint: --;
Maryland[C]: Joint: --;
Maryland[C]: Non-joint: --;
Massachusetts[C]: Joint: --;
Massachusetts[C]: Non-joint: --;
Minnesota[D]: Joint: --;
Minnesota[D]: Non-joint: --;
New York[E]: Joint: 194;
New York[E]: Non-joint: --;
Oregon[F]: Joint: 261;
Oregon[F]: Non-joint: --;
Virginia[G]: Joint: 149;
Virginia[G]: Non-joint: --;
Washington[H]: Joint: 198;
Washington[H]: Non-joint: --;
Wisconsin[I]: Joint: 140;
Wisconsin[I]: Non-joint: --.
Source: Data were provided by selected council-monitored states.
Note: Data include apprentices entering programs from October 1, 1997,
through September 30, 1998, and completing before October 1, 2004.
[A] California reported no structural steel worker non-joint programs
and no completers from bricklayer and operating engineer non-joint
programs.
[B] Kentucky reported that no apprentices entered bricklayer joint
programs and carpenter, pipe fitter, structural steel worker, sheet metal
worker, roofer, painter, and operating engineer non-joint programs from
October 1, 1997, through September 30, 1998.
[C] Maryland and Massachusetts do not track these data.
[D] Minnesota reported data for electrician, carpenter, and plumber
programs only and reported no completions for carpenters in non-joint
programs. We told state directors they could report only for these
three fields in order to save resources and because these three fields
represent over half of all apprentices in the construction trades.
[E] New York reported no completers for pipe fitter, structural steel
worker, painter, and operating engineer non-joint programs.
[F] Oregon reported no non-joint apprenticeship programs are registered
in the state.
[G] Virginia reported no apprentices entered roofer and painter joint
programs and roofer, painter and operating engineer non-joint programs
from October 1, 1997, through September 30, 1998.
[H] Washington reported no pipe fitter, structural steel worker,
bricklayer, operating engineer and roofer non-joint programs. Also, no
apprentices completed sheet metal worker and painter non-joint
programs.
[I] Wisconsin reported no structural steel worker programs and no
roofer and operating engineer non-joint programs.
[End of table]
Table 5: Mean Hourly Wage Rates for Beginning Apprentices in Joint and
Non-joint Construction Programs as Reported by Selected Council-
monitored States, Fiscal Year 2004:
Electrician;
California[A]: Joint: $13.50;
California[A]: Non-joint: $12.28;
Kentucky: Joint: $9.31;
Kentucky: Non-joint: $6.41;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: $11.81;
Minnesota[C]: Non-joint: $11.06;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $9.81;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $9.50;
Virginia[E]: Non-joint: $8.08;
Washington[F]: Joint: $11.64;
Washington[F]: Non-joint: $11.38;
Wisconsin[B]: Joint: --;
Wisconsin[B]: Non-joint: --.
Carpenter;
California[A]: Joint: $15.16;
California[A]: Non-joint: $14.11;
Kentucky: Joint: $8.05;
Kentucky: Non-joint: $8.31;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: $13.46;
Minnesota[C]: Non-joint: $9.63;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $11.03;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $8.22;
Virginia[E]: Non-joint: $9.68;
Washington[F]: Joint: $14.67;
Washington[F]: Non-joint: $12.67;
Wisconsin[B]: Joint: --;
Wisconsin[B]: Non-joint: --.
Plumber;
California[A]: Joint: $13.82;
California[A]: Non-joint: $12.85;
Kentucky: Joint: $12.14;
Kentucky: Non-joint: $7.54;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: $14.69;
Minnesota[C]: Non-joint: $14.36;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $9.68;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $8.70;
Virginia[E]: Non-joint: $8.59;
Washington[F]: Joint: $12.67;
Washington[F]: Non-joint: $10.63;
Wisconsin[B]: Joint: --;
Wisconsin[B]: Non-joint: --.
Pipe fitter;
California[A]: Joint: $11.80;
California[A]: Non-joint: $13.10;
Kentucky: Joint: $12.14;
Kentucky: Non-joint: $7.08;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: --;
Minnesota[C]: Non-joint: --;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $11.03;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $9.75;
Virginia[E]: Non-joint: $9.36;
Washington[F]: Joint: $13.64;
Washington[F]: Non-joint: --;
Wisconsin[B]: Joint: --;
Wisconsin[B]: Non-joint: --.
Sheet metal worker;
California[A]: Joint: $12.64;
California[A]: Non-joint: $10.85;
Kentucky: Joint: $11.79;
Kentucky: Non-joint: $7.08;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: --;
Minnesota[C]: Non-joint: --;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $8.83;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $9.43;
Virginia[E]: Non-joint: $8.05;
Washington[F]: Joint: $13.11;
Washington[F]: Non-joint: $7.61;
Wisconsin[B]: Joint: --;
Wisconsin[B]: Non-joint: --.
Structural steel worker;
California[A]: Joint: $17.24;
California[A]: Non-joint: --;
Kentucky: Joint: $13.56;
Kentucky: Non-joint: $7.08;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: --;
Minnesota[C]: Non-joint: --;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $18.51;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $9.49;
Virginia[E]: Non-joint: --;
Washington[F]: Joint: $17.74;
Washington[F]: Non-joint: --;
Wisconsin[B]: Joint: --;
Wisconsin[B]: Non-joint: --.
Bricklayer;
California[A]: Joint: $11.22;
California[A]: Non-joint: $11.40;
Kentucky: Joint: $10.59;
Kentucky: Non-joint: $9.82;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: --;
Minnesota[C]: Non-joint: --;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $13.35;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $8.02;
Virginia[E]: Non-joint: $9.47;
Washington[F]: Joint: $12.62;
Washington[F]: Non-joint: --;
Wisconsin[B]: Joint: --;
Wisconsin[B]: Non-joint: --.
Roofer;
California[A]: Joint: $11.90;
California[A]: Non-joint: $10.96;
Kentucky: Joint: $10.12;
Kentucky: Non-joint: $7.08;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: --;
Minnesota[C]: Non-joint: --;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $10.03;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: --;
Virginia[E]: Non-joint: $7.44;
Washington[F]: Joint: $13.28;
Washington[F]: Non-joint: --;
Wisconsin[B]: Joint: --;
Wisconsin[B]: Non-joint: --.
Painter;
California[A]: Joint: $11.31;
California[A]: Non-joint: $10.63;
Kentucky: Joint: $9.86;
Kentucky: Non-joint: $8.00;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: --;
Minnesota[C]: Non-joint: --;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $11.26;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: --;
Virginia[E]: Non-joint: $10.95;
Washington[F]: Joint: $11.50;
Washington[F]: Non-joint: $8.33;
Wisconsin[B]: Joint: --;
Wisconsin[B]: Non-joint: --.
Operating engineer;
California[A]: Joint: $20.30;
California[A]: Non-joint: $18.42;
Kentucky: Joint: $12.50;
Kentucky: Non-joint: $7.08;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: --;
Minnesota[C]: Non-joint: --;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $17.43;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $10.99;
Virginia[E]: Non-joint: $8.08;
Washington[F]: Joint: $15.95;
Washington[F]: Non-joint: $15.37;
Wisconsin[B]: Joint: --;
Wisconsin[B]: Non-joint: --.
Source: Data were provided by selected council-monitored states.
Note: Data include wages for apprentices who began programs from
October 1, 2003, through September 30, 2004.
[A] California reported no structural steel worker non-joint programs.
[B] Maryland, Massachusetts, New York, and Wisconsin do not collect
wage data.
[C] Minnesota reported data for electrician, carpenter, and plumber
programs only. We told state directors they could do this in order to
save resources and because these three fields represent over half of
all apprentices in the construction trades.
[D] Oregon reported no non-joint apprenticeship programs are registered
in the state.
[E] Virginia reported no roofer joint programs and no operating
engineer non-joint programs. Also, no apprentices entered painter joint
programs and structural steel worker non-joint programs that year.
[F] Washington reported no pipe fitter, structural steel worker,
bricklayer, painter, and roofer non-joint programs as of September 30,
2004.
[End of table]
Table 6: Mean Hourly Wage Rates for Apprentices Achieving Journey
Status in Joint and Non-joint Construction Programs as Reported by
Selected Council-monitored States, Fiscal Year 2004:
Electrician;
California[A]: Joint: $34.98;
California[A]: Non-joint: $30.35;
Kentucky: Joint: $20.74;
Kentucky: Non-joint: $12.81;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: $26.62;
Minnesota[C]: Non-joint: $25.31;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $23.57;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $16.54;
Virginia[E]: Non-joint: $13.34;
Washington[F]: Joint: $24.38;
Washington[F]: Non-joint: $28.46;
Wisconsin: Joint: --;
Wisconsin: Non-joint: --.
Carpenter;
California[A]: Joint: $32.45;
California[A]: Non-joint: $32.48;
Kentucky: Joint: $16.57;
Kentucky: Non-joint: $15.00;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: $23.20;
Minnesota[C]: Non-joint: $29.38;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $20.06;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $11.39;
Virginia[E]: Non-joint: $10.73;
Washington[F]: Joint: $24.46;
Washington[F]: Non-joint: $22.04;
Wisconsin: Joint: --;
Wisconsin: Non-joint: --.
Plumber;
California[A]: Joint: $31.80;
California[A]: Non-joint: $30.91;
Kentucky: Joint: $24.28;
Kentucky: Non-joint: $14.00;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: $32.65;
Minnesota[C]: Non-joint: $30.56;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $23.38;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $17.15;
Virginia[E]: Non-joint: $14.47;
Washington[F]: Joint: $26.68;
Washington[F]: Non-joint: $26.52;
Wisconsin: Joint: --;
Wisconsin: Non-joint: --.
Pipe fitter;
California[A]: Joint: $32.40;
California[A]: Non-joint: --;
Kentucky: Joint: $24.28;
Kentucky: Non-joint: $12.00;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: --;
Minnesota[C]: Non-joint: --;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $25.46;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $17.38;
Virginia[E]: Non-joint: $13.53;
Washington[F]: Joint: $27.07;
Washington[F]: Non-joint: --;
Wisconsin: Joint: --;
Wisconsin: Non-joint: --.
Sheet metal worker;
California[A]: Joint: $32.50;
California[A]: Non-joint: $31.98;
Kentucky: Joint: $23.58;
Kentucky: Non-joint: $12.00;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: --;
Minnesota[C]: Non-joint: --;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $19.74;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $14.85;
Virginia[E]: Non-joint: $11.15;
Washington[F]: Joint: $25.58;
Washington[F]: Non-joint: $18.12;
Wisconsin: Joint: --;
Wisconsin: Non-joint: --.
Structural steel worker;
California[A]: Joint: $31.35;
California[A]: Non-joint: --;
Kentucky: Joint: $22.68;
Kentucky: Non-joint: $12.00;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: --;
Minnesota[C]: Non-joint: --;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $28.47;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $13.87;
Virginia[E]: Non-joint: --;
Washington[F]: Joint: $27.30;
Washington[F]: Non-joint: --;
Wisconsin: Joint: --;
Wisconsin: Non-joint: --.
Bricklayer;
California[A]: Joint: $29.97;
California[A]: Non-joint: $30.28;
Kentucky: Joint: $21.17;
Kentucky: Non-joint: $16.98;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: --;
Minnesota[C]: Non-joint: --;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $26.70;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $14.75;
Virginia[E]: Non-joint: $18.62;
Washington[F]: Joint: $25.23;
Washington[F]: Non-joint: --;
Wisconsin: Joint: --;
Wisconsin: Non-joint: --.
Roofer;
California[A]: Joint: $25.92;
California[A]: Non-joint: $24.89;
Kentucky: Joint: $18.40;
Kentucky: Non-joint: $12.00;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: --;
Minnesota[C]: Non-joint: --;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $19.42;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: --;
Virginia[E]: Non-joint: --;
Washington[F]: Joint: $22.14;
Washington[F]: Non-joint: --;
Wisconsin: Joint: --;
Wisconsin: Non-joint: --.
Painter;
California[A]: Joint: $30.98;
California[A]: Non-joint: $29.08;
Kentucky: Joint: $17.20;
Kentucky: Non-joint: $16.00;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: --;
Minnesota[C]: Non-joint: --;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $18.77;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: --;
Virginia[E]: Non-joint: --;
Washington[F]: Joint: $20.44;
Washington[F]: Non-joint: $18.51;
Wisconsin: Joint: --;
Wisconsin: Non-joint: --.
Operating engineer;
California[A]: Joint: $34.34;
California[A]: Non-joint: --;
Kentucky: Joint: $21.03;
Kentucky: Non-joint: $12.00;
Maryland[B]: Joint: --;
Maryland[B]: Non-joint: --;
Massachusetts[B]: Joint: --;
Massachusetts[B]: Non-joint: --;
Minnesota[C]: Joint: --;
Minnesota[C]: Non-joint: --;
New York[B]: Joint: --;
New York[B]: Non-joint: --;
Oregon[D]: Joint: $22.67;
Oregon[D]: Non-joint: --;
Virginia[E]: Joint: $18.73;
Virginia[E]: Non-joint: --;
Washington[F]: Joint: $25.52;
Washington[F]: Non-joint: $25.63;
Wisconsin: Joint: --;
Wisconsin: Non-joint: --.
Source: Data were provided by selected council-monitored states.
Note: Data include wages for apprentices who achieved journey status
that year.
[A] California reported no structural steel worker non-joint programs.
Also, no apprentices completed pipe fitter and operating engineer non-
joint programs that year.
[B] Maryland, Massachusetts, New York, and Wisconsin do not collect
wage data.
[C] Minnesota reported data for electrician, carpenter, and plumber
programs only. We told state directors they could do this in order to
save resources and because these three fields represent over half of
all apprentices in the construction trades.
[D] Oregon reported no non-joint apprenticeship programs are registered
in the state.
[E] Virginia reported no roofer and painter joint programs, and no
roofer, painter, and operating engineer non-joint programs, and no
apprentices completed joint painter and non-joint structural steel
worker programs as of September 30, 2004.
[F] Washington reported no pipe fitter, structural steel worker,
bricklayer, and roofer non-joint programs as of September 30, 2004.
[End of table]
[End of section]
Appendix III: Responses to Survey of Directors of Apprenticeships in
Federally-Monitored States:
Q1. At the close of federal fiscal year (FFY) 2004 (September 30,
2004), what was the total number of registered apprentices in your
state?
Mean: 4,792;
Median: 3,437;
Minimum: 271;
Maximum: 20,496;
Number of respondents: 23.
[End of table]
Q1a. At the close of FFY 2004, what was the total number of registered
apprentices in construction trades in your state?
Mean: 3,057;
Median: 2,226;
Minimum: 207;
Maximum: 10,396;
Number of respondents: 22.
[End of table]
Q1b. At the close of FFY 2004, what was the total number of approved
apprenticeship programs in construction trades in your state?
Mean: 169;
Median: 84;
Minimum: 22;
Maximum: 844;
Number of respondents: 23.
[End of table]
Q2. During FFY 2004, how many full-time equivalent (FTE)
apprenticeship training staff were employed by OATELS to monitor and
oversee apprenticeship programs in your state?
Mean: 3;
Median: 2;
Minimum: 0;
Maximum: 6;
Number of respondents: 22.
[End of table]
Q3. Of the FTE apprenticeship training staff reported above,
approximately what percentage of their time was spent in the field
monitoring and overseeing apprenticeship programs or providing
technical assistance related to monitoring and oversight during FFY
2004?
0 - 19%: 5;
20 - 39%: 6;
40 - 59%: 5;
60 - 79%: 4;
80 - 100%: 2;
Don't know: 1;
Number of respondents: 23.
[End of table]
Q4. During FFY 2004, how many full-time equivalent (FTE)
apprenticeship training representative, field, and other
nonadministrative staff were employed by the state to monitor and
oversee apprenticeship programs in your state?
Mean: 0;
Median: 0;
Minimum: 0;
Maximum: 2;
Number of respondents: 22.
[End of table]
Q6. In your opinion, would the following updates or modifications
improve the Registered Apprenticeship Information System's (RAIS)
usefulness to your state?
a. Increasing timeliness of notifying state and regional offices of
changes to RAIS;
Yes: 22;
No: 1;
Don't know: 0;
Number of respondents: 23.
b. Increasing available coding options to explain why apprentices leave
the programs;
Yes: 14;
No: 7;
Don't know: 2;
Number of respondents: 23.
c. Allowing production of customized state or regional reports by
sponsorship type;
Yes: 17;
No: 2;
Don't know: 4;
Number of respondents: 23.
d. Allowing production of customized state or regional reports by
industry type;
Yes: 18;
No: 1;
Don't know: 4;
Number of respondents: 23.
e. Allowing production of customized state or regional reports by
occupational type;
Yes: 17;
No: 2;
Don't know: 3;
Number of respondents: 22.
f. Allowing production of customized state or regional reports by sex
of apprentices;
Yes: 14;
No: 1;
Don't know: 8;
Number of respondents: 23.
g. Allowing production of customized state or regional reports by race
of apprentices;
Yes: 14;
No: 1;
Don't know: 8;
Number of respondents: 23.
h. Simplifying instructions and procedures for inputting and updating
data;
Yes: 16;
No: 6;
Don't know: 1;
Number of respondents: 23.
i. Simplifying procedures required to produce reports;
Yes: 18;
No: 3;
Don't know: 2;
Number of respondents: 23.
j. Increasing frequency of RAIS training;
Yes: 17;
No: 1;
Don't know: 5;
Number of respondents: 23.
k. Improving quality of RAIS training;
Yes: 16;
No: 3;
Don't know: 4;
Number of respondents: 23.
l. Other;
Yes: 9;
No: 0;
Don't know: 2;
Number of respondents: 11.
[End of table]
Q8. Did your state use WIA Governor's 15% State Set-Aside funds to
support new and/or established apprenticeship programs in FFY 2004?
Yes: 3;
No: 15;
Don't know: 5;
Number of respondents: 23.
[End of table]
Q9. Were WIA State Set-Aside funds used to support new and/or
established apprenticeship programs in your state in FFY 2004 to do any
of the following?
a. To provide related instruction or other education that satisfied
specific apprenticeship requirements;
Yes, new apprenticeship programs: 1;
Yes, established apprenticeship programs: 2;
No: 0;
Don't know: 0;
Number of respondents: 3.
b. To provide on-the-job training;
Yes, new apprenticeship programs: 0;
Yes, established apprenticeship programs: 0;
No: 1;
Don't know: 0;
Number of respondents: 1.
c. To disseminate information about apprenticeship programs;
Yes, new apprenticeship programs: 0;
Yes, established apprenticeship programs: 1;
No: 0;
Don't know: 0;
Number of respondents: 1.
d. To encourage entities to sponsor and register additional or new
programs;
Yes, new apprenticeship programs: 1;
Yes, established apprenticeship programs: 0;
No: 1;
Don't know: 0;
Number of respondents: 2.
e. Other;
Yes, new apprenticeship programs: 1;
Yes, established apprenticeship programs: 0;
No: 0;
Don't know: 0;
Number of respondents: 1.
[End of table]
Q11. For which of the following reasons did your state not use WIA Set-
Aside Funds to support apprenticeship programs in FFY 2004?
a. Decision-makers gave priority to other programs;
Yes: 11;
No: 0;
Don't know: 6;
Number of respondents: 17.
b. Decision-makers did not believe funds could be used to support new
apprenticeship programs;
Yes: 7;
No: 2;
Don't know: 8;
Number of respondents: 17.
c. Decision-makers did not believe funds could be used to support
established apprenticeship programs;
Yes: 7;
No: 2;
Don't know: 7;
Number of respondents: 16.
d. Decision-makers did not believe funds could be used to provide
related instruction or other education that satisfied specific
apprenticeship requirements;
Yes: 6;
No: 2;
Don't know: 9;
Number of respondents: 17.
e. Decision-makers did not believe funds could be used to provide on-
the-job training;
Yes: 6;
No: 3;
Don't know: 8;
Number of respondents: 17.
f. Decision-makers did not believe funds could be used to disseminate
information about apprenticeship programs;
Yes: 5;
No: 2;
Don't know: 10;
Number of respondents: 17.
g. Decision-makers did not believe funds could be used to encourage the
recruitment of entities to sponsor and register new programs;
Yes: 6;
No: 1;
Don't know: 10;
Number of respondents: 17.
h. Decision-makers did not establish linkages between the state
apprenticeship unit and unit(s) responsible for WIA;
Yes: 11;
No: 2;
Don't know: 4;
Number of respondents: 17.
i. Other;
Yes: 8;
No: 0;
Don't know: 3;
Number of respondents: 11.
[End of table]
Q13. Were WIA funding sources other than State Set-Aside Funds used in
your state to support new and/or established apprenticeship programs in
FFY 2004?
Yes: 4;
No: 12;
Don't know: 6;
Number of respondents: 22.
[End of table]
Q14. Other than State Set-Aside Funds, which of the following WIA
funding sources were used to support new and/or established
apprenticeship programs in FFY 2004?
a. Adult Funds;
Yes, new programs: 1;
Yes, established programs: 1;
No: 1;
Don't know: 1;
Number of respondents: 4.
b. Dislocated Worker Funds;
Yes, new programs: 0;
Yes, established programs: 1;
No: 1;
Don't know: 2;
Number of respondents: 4.
c. Youth Funds;
Yes, new programs: 0;
Yes, established programs: 0;
No: 2;
Don't know: 2;
Number of respondents: 4.
d. Other;
Yes, new programs: 0;
Yes, established programs: 2;
No: 0;
Don't know: 1;
Number of respondents: 3.
[End of table]
Q16. Did your state establish linkages between the WIA state unit and
the state apprenticeship unit in FFY 2004 for any of the following purposes?
a. Shared decision making;
Yes: 0;
No: 18;
Don't know: 4;
Number of respondents: 22.
b. Shared information gathering;
Yes: 4;
No: 14;
Don't know: 4;
Number of respondents: 22.
c. Shared information dissemination, including presentations;
Yes: 7;
No: 11;
Don't know: 4;
Number of respondents: 22.
d. Shared use of educational programs that satisfy specific
apprenticeship requirements;
Yes: 2;
No: 15;
Don't know: 5;
Number of respondents: 22.
e. Shared grant development activities;
Yes: 4;
No: 13;
Don't know: 6;
Number of respondents: 23.
f. Other;
Yes: 7;
No: 2;
Don't know: 4;
Number of respondents: 13.
[End of table]
Q19. How often does your unit conduct formalized Quality Reviews of
individual apprenticeship programs that address on-the-job training,
related instruction, and/or program operations in your state?
Less frequently than every three years: 7;
Once every three years: 2;
Once every two years: 6;
Once a year: 3;
Twice a year: 0;
More than twice a year: 3;
Don't know: 1;
Number of respondents: 22.
[End of table]
Q21. Approximately how many Quality Reviews did your unit conduct in
FFY 2004? (Click in the box and then enter up to a 4-digit whole number
only.)
Mean: 17;
Median: 10;
Minimum: 0;
Maximum: 67;
Number of respondents: 22.
[End of table]
Q22. To what extent, if at all, did your state find the FFY 2004
Quality Reviews useful for the following purposes?
a. Making informed decisions about the administration and operation of
apprenticeship programs;
Very great extent: 2;
Great extent: 8;
Moderate extent: 5;
Some extent: 5;
Little or no extent (Please specify in Question 24): 1;
Don't know: 1;
Number of respondents: 22.
b. Evaluating the strengths and weaknesses of apprenticeship programs
in your state;
Very great extent: 4;
Great extent: 8;
Moderate extent: 4;
Some extent: 3;
Little or no extent (Please specify in Question 24): 2;
Don't know: 1;
Number of respondents: 22.
c. Assessing how well the programs comply with federal regulations;
Very great extent: 3;
Great extent: 10;
Moderate extent: 3;
Some extent: 2;
Little or no extent (Please specify in Question 24): 3;
Don't know: 1;
Number of respondents: 22.
d. Completing reports about your state's apprenticeship program;
Very great extent: 1;
Great extent: 7;
Moderate extent: 5;
Some extent: 3;
Little or no extent (Please specify in Question 24): 4;
Don't know: 2;
Number of respondents: 22.
e. Other;
Very great extent: 1;
Great extent: 3;
Moderate extent: 0;
Some extent: 2;
Little or no extent (Please specify in Question 24): 3;
Don't know: 4;
Number of respondents: 13.
[End of table]
Q26. How often does your unit conduct formalized Equal Employment
Opportunity (EEO) Reviews of individual apprenticeship programs?
Less frequently than every three years: 8;
Once every three years: 3;
Once every two years: 6;
Once a year: 2;
Twice a year: 0;
More than twice a year: 3;
Don't know: 1;
Number of respondents: 23.
[End of table]
Q28. Approximately how many EEO Reviews did your unit conduct in FFY
2004? (Click in the box and then enter up to a 4-digit whole number
only.)
Mean: 10;
Median: 8;
Minimum: 0;
Maximum: 35;
Number of respondents: 23.
[End of table]
Q29. To what extent, if at all, did your state find the FFY 2004 EEO
Reviews useful for the following purposes?
a. Making informed decisions about the administration and operation of
apprenticeship programs in your state;
Very great extent: 4;
Great extent: 9;
Moderate extent: 3;
Some extent: 3;
Little or no extent (Please specify in Question 31): 2;
Don't know: 1;
Number of respondents: 22.
b. Evaluating the strengths and weaknesses of apprenticeship programs
in your state;
Very great extent: 6;
Great extent: 10;
Moderate extent: 2;
Some extent: 2;
Little or no extent (Please specify in Question 31): 1;
Don't know: 1;
Number of respondents: 22.
c. Assessing how well the programs comply with federal regulations;
Very great extent: 7;
Great extent: 8;
Moderate extent: 4;
Some extent: 2;
Little or no extent (Please specify in Question 31): 0;
Don't know: 1;
Number of respondents: 22.
d. Completing reports about the state's apprenticeship programs;
Very great extent: 1;
Great extent: 8;
Moderate extent: 7;
Some extent: 2;
Little or no extent (Please specify in Question 31): 2;
Don't know: 2;
Number of respondents: 22.
e. Other;
Very great extent: 3;
Great extent: 0;
Moderate extent: 0;
Some extent: 1;
Little or no extent (Please specify in Question 31): 2;
Don't know: 5;
Number of respondents: 11.
[End of table]
Q33. Did your state have procedures or policies for recording
complaints filed in FFY 2004 that were elevated to the level of the
state or regional OATELS office?
Yes: 18;
No: 3;
Don't know: 1;
Number of respondents: 22.
[End of table]
Q34a1. In your state, how many total complaints were referred to state
officials in FFY 2004?
Mean: 2;
Median: 1;
Minimum: 0;
Maximum: 10;
Number of respondents: 18.
[End of table]
Q34a2. Check if actual, estimate, or do not know or cannot estimate:
Check here if actual;
Count: 12;
Number of respondents: 19.
Check here if estimate;
Count: 7;
Number of respondents: 19.
Check here if do not know or cannot estimate;
Count: 0;
Number of respondents: 19.
[End of table]
Q34b1. How many complaints concerned termination in FFY 2004?
Mean: 1;
Median: 0;
Minimum: 0;
Maximum: 8;
Number of respondents: 18.
[End of table]
Q34b2. Check if actual, estimate, or do not know or cannot estimate:
Check here if actual;
Count: 12;
Number of respondents: 18.
Check here if estimate;
Count: 6;
Number of respondents: 18.
Check here if do not know or cannot estimate;
Count: 0;
Number of respondents: 18.
[End of table]
Q34c1. How many complaints concerned discrimination in FFY 2004?
Mean: 0;
Median: 0;
Minimum: 0;
Maximum: 5;
Number of respondents: 18.
[End of table]
Q34c2. Check if actual, estimate, or do not know or cannot estimate:
Check here if actual;
Count: 13;
Number of respondents: 18.
Check here if estimate;
Count: 5;
Number of respondents: 18.
Check here if do not know or cannot estimate;
Count: 0;
Number of respondents: 18.
[End of table]
Q34d1. How many complaints concerned wages in FFY 2004?
Mean: 0;
Median: 0;
Minimum: 0;
Maximum: 2;
Number of respondents: 18.
[End of table]
Q34d2. Check if actual, estimate, or do not know or cannot estimate:
Check here if actual;
Count: 13;
Number of respondents: 18.
Check here if estimate;
Count: 5;
Number of respondents: 18.
Check here if do not know or cannot estimate;
Count: 0;
Number of respondents: 18.
[End of table]
Q34e1. How many complaints concerned related instruction in FFY 2004?
Mean: 0;
Median: 0;
Minimum: 0;
Maximum: 3;
Number of respondents: 18.
[End of table]
Q34e2. Check if actual, estimate, or do not know or cannot estimate:
Check here if actual;
Count: 13;
Number of respondents: 18.
Check here if estimate;
Count: 5;
Number of respondents: 18.
Check here if do not know or cannot estimate;
Count: 0;
Number of respondents: 18.
[End of table]
Q34f1. How many complaints concerned on-the-job training in FFY 2004?
Mean: 0;
Median: 0;
Minimum: 0;
Maximum: 4;
Number of respondents: 20.
[End of table]
Q34f2. Check if actual, estimate, or do not know or cannot estimate:
Check here if actual;
Count: 12;
Number of respondents: 16.
Check here if estimate;
Count: 4;
Number of respondents: 16.
Check here if do not know or cannot estimate;
Count: 0;
Number of respondents: 16.
[End of table]
Q34g1. How many complaints concerned other issues in FFY 2004?
Mean: 1;
Median: 0;
Minimum: 0;
Maximum: 5;
Number of respondents: 18.
[End of table]
Q34g2. Check if actual, estimate, or do not know or cannot estimate:
Check here if actual;
Count: 13;
Number of respondents: 18.
Check here if estimate;
Count: 5;
Number of respondents: 18.
Check here if do not know or cannot estimate;
Count: 0;
Number of respondents: 18.
[End of table]
Q36. Which of the following were sources of data used to answer the
prior questions about complaints regarding apprenticeship programs in
FFY 2004?
Electronic statewide system;
Yes: 3;
No: 12;
Don't know: 1;
Number of respondents: 16.
Centralized listing, log, or other paper compilation;
Yes: 6;
No: 10;
Don't know: 1;
Number of respondents: 17.
Manual search of files;
Yes: 7;
No: 8;
Don't know: 0;
Number of respondents: 15.
Other;
Yes: 3;
No: 2;
Don't know: 1;
Number of respondents: 6.
[End of table]
[End of section]
Appendix IV: Responses to Survey of Directors of Apprenticeships in
Council-Monitored States:
Q2. At the close of your state's FY 2004, what was the total number of
registered apprentices in your state?
Mean: 8,949;
Median: 4,748;
Minimum: 689;
Maximum: 72,920;
Number of respondents: 27.
[End of table]
Q2a. At the close of your state's FY 2004, what was the total number of
registered apprentices in construction trades in your state?
Mean: 6,287;
Median: 4,052;
Minimum: 323;
Maximum: 52,277;
Number of respondents: 26.
[End of table]
Q2b. At the close of your state's FY 2004, what was the total number of
approved apprenticeship programs in construction trades in your state?
Mean: 308;
Median: 225;
Minimum: 28;
Maximum: 1,320;
Number of respondents: 26.
[End of table]
Q3. During state FY 2004, how many full-time equivalency (FTE)
apprenticeship training staff were employed by your apprentice unit to
monitor and oversee apprenticeship programs in your state?
Mean: 7;
Median: 4;
Minimum: 0;
Maximum: 32;
Number of respondents: 27.
[End of table]
Q4. Of the FTE apprenticeship training staff reported above,
approximately what percentage of their time was spent in the field
monitoring and overseeing apprenticeship programs or providing
technical assistance related to monitoring or oversight during state FY 2004?
0 - 19%: 4;
20 - 39%: 5;
40 - 59%: 7;
60 - 79%: 6;
80 - 100%: 5;
Don't know: 0;
Number of respondents: 27.
[End of table]
Q5. Do you have a BAT agency in your state?
Yes: 20;
No: 7;
Don't know: 0;
Number of respondents: 27.
[End of table]
Q6. During state FY 2004, how many full-time equivalency (FTE)
apprenticeship training staff were employed by the BAT agency in your
state to monitor and oversee apprenticeship programs in your state?
Mean: 2;
Median: 1;
Minimum: 0;
Maximum: 10;
Number of respondents: 19.
[End of table]
Q8. How often does your OATELS conduct SAC 29/29 Review (Review of
Labor Standards for Registration of Apprenticeship Programs) in your
state?
Less frequently than every three years: 12;
Once every three years: 5;
Once every two years: 3;
Once a year: 2;
Twice a year: 0;
More than twice a year: 0;
Don't know: 5;
Number of respondents: 27.
[End of table]
Q10. To what extent did your state find OATELS' most recent SAC 29/29
Review (Review of Labor Standards for Registration of Apprenticeship
Programs) useful for the following purposes in your state?
a. Making informed decisions about the administration and operation of
apprenticeship programs;
Very great extent: 2;
Great extent: 3;
Moderate extent: 5;
Some extent: 4;
Little or no extent: 6;
Don't know: 7;
Number of respondents: 27.
b. Evaluating the strengths and weaknesses of apprenticeship programs
in your state;
Very great extent: 2;
Great extent: 4;
Moderate extent: 2;
Some extent: 5;
Little or no extent: 7;
Don't know: 7;
Number of respondents: 27.
c. Assessing how well the programs comply with federal regulations;
Very great extent: 3;
Great extent: 6;
Moderate extent: 2;
Some extent: 3;
Little or no extent: 6;
Don't know: 7;
Number of respondents: 27.
d. Completing reports about your state's apprenticeship program;
Very great extent: 1;
Great extent: 1;
Moderate extent: 3;
Some extent: 6;
Little or no extent: 8;
Don't know: 8;
Number of respondents: 27.
e. Other;
Very great extent: 2;
Great extent: 0;
Moderate extent: 1;
Some extent: 1;
Little or no extent: 1;
Don't know: 9;
Number of respondents: 14.
[End of table]
Q15. How often does OATELS conduct SAC 29/30 Review (Review of Equal
Employment Opportunity in Apprenticeship and Training) in your state?
Less frequently than every three years: 12;
Once every three years: 5;
Once every two years: 3;
Once a year: 2;
Twice a year: 0;
More than twice a year: 0;
Don't know: 5;
Number of respondents: 27.
[End of table]
Q17. To what extent, if at all, did your state find OATELS' most recent
SAC 29/30 Review (Equal Employment Opportunity in Apprenticeship and
Training) useful for the following purposes?
a. Making informed decisions about the administration and operation of
apprenticeship programs;
Very great extent: 2;
Great extent: 2;
Moderate extent: 6;
Some extent: 3;
Little or no extent: 5;
Don't know: 9;
Number of respondents: 27.
b. Evaluating the strengths and weaknesses of apprenticeship programs
in your state;
Very great extent: 1;
Great extent: 3;
Moderate extent: 5;
Some extent: 4;
Little or no extent: 5;
Don't know: 9;
Number of respondents: 27.
c. Assessing how well the programs comply with federal regulations;
Very great extent: 2;
Great extent: 6;
Moderate extent: 4;
Some extent: 2;
Little or no extent: 4;
Don't know: 9;
Number of respondents: 27.
d. Completing reports about your state's apprenticeship program;
Very great extent: 1;
Great extent: 0;
Moderate extent: 5;
Some extent: 6;
Little or no extent: 5;
Don't know: 10;
Number of respondents: 27.
e. Other;
Very great extent: 1;
Great extent: 0;
Moderate extent: 0;
Some extent: 1;
Little or no extent: 2;
Don't know: 4;
Number of respondents: 8.
[End of table]
Q21. Does your state presently use OATELS' Registered Apprenticeship
Information System (RAIS) to register apprentices and to track
apprentice and program information?
Yes: 6;
No: 21;
Number of respondents: 27.
[End of table]
Q23. Does your state plan or intend to use RAIS to register apprentices
and track apprenticeship and program information in the future?
Yes: 7;
No: 12;
Don't know: 5;
Number of respondents: 24.
[End of table]
Q26. Did your state use the WIA Governor's 15% State Set-Aside funds to
support new and/or established apprenticeship programs in state FY 2004?
Yes: 7;
No: 20;
Don't know: 0;
Number of respondents: 27.
[End of table]
Q27. Were WIA State Set-Aside funds used to support new and/or
established apprenticeship programs in your state in state FY 2004 to
do any of the following?
a. To provide related instruction or other education that satisfied
specific apprenticeship requirements;
Yes, new apprenticeship programs: 4;
Yes, established apprenticeship programs: 1;
No: 2;
Don't know: 0;
Number of respondents: 7.
b. To provide on-the-job training;
Yes, new apprenticeship programs: 2;
Yes, established apprenticeship programs: 1;
No: 3;
Don't know: 0;
Number of respondents: 6.
c. To disseminate information about apprenticeship programs;
Yes, new apprenticeship programs: 4;
Yes, established apprenticeship programs: 0;
No: 3;
Don't know: 0;
Number of respondents: 7.
d. To encourage entities to sponsor and register additional or new
programs;
Yes, new apprenticeship programs: 2;
Yes, established apprenticeship programs: 1;
No: 3;
Don't know: 0;
Number of respondents: 6.
e. Other;
Yes, new apprenticeship programs: 2;
Yes, established apprenticeship programs: 2;
No: 1;
Don't know: 0;
Number of respondents: 5.
[End of table]
Q29. For which of the following reasons did your state not use WIA Set-
Aside Funds to support apprenticeship programs in state FY 2004?
a. Decision-makers gave priority to other programs;
Yes: 10;
No: 5;
Don't know: 7;
Number of respondents: 22.
b. Decision-makers did not believe funds could be used to support new
apprenticeship programs;
Yes: 8;
No: 4;
Don't know: 10;
Number of respondents: 22.
c. Decision-makers did not believe funds could be used to support
established apprenticeship programs;
Yes: 8;
No: 4;
Don't know: 10;
Number of respondents: 22.
d. Decision-makers did not believe funds could be used to provide
related instruction or other education that satisfied specific
apprenticeship requirements;
Yes: 5;
No: 5;
Don't know: 12;
Number of respondents: 22.
e. Decision-makers did not believe funds could be used to provide on-
the-job training;
Yes: 4;
No: 7;
Don't know: 11;
Number of respondents: 22.
f. Decision-makers did not believe funds could be used to disseminate
information about apprenticeship programs;
Yes: 3;
No: 6;
Don't know: 13;
Number of respondents: 22.
g. Decision-makers did not believe funds could be used to encourage the
recruitment of entities to sponsor and register new programs;
Yes: 5;
No: 6;
Don't know: 11;
Number of respondents: 22.
h. Decision-makers did not establish linkages between the state
apprenticeship unit and unit(s) responsible for WIA;
Yes: 9;
No: 9;
Don't know: 5;
Number of respondents: 23.
i. Other;
Yes: 2;
No: 2;
Don't know: 3;
Number of respondents: 7.
[End of table]
Q31. Were WIA funding sources other than State Set-Aside Funds used in
your state to support new and/or established apprenticeship programs in
state FY 2004?
Yes: 7;
No: 17;
Don't know: 3;
Number of respondents: 27.
[End of table]
Q32. Other than State Set-Aside Funds, which of the following WIA
funding sources were used to support new and/or established
apprenticeship programs in state FY 2004?
a. Adult Funds;
Yes, new programs: 1;
Yes, established programs: 4;
No: 0;
Don't know: 1;
Number of respondents: 6.
b. Dislocated Worker Funds;
Yes, new programs: 0;
Yes, established programs: 5;
No: 1;
Don't know: 1;
Number of respondents: 7.
c. Youth Funds;
Yes, new programs: 2;
Yes, established programs: 1;
No: 2;
Don't know: 1;
Number of respondents: 6.
d. Other;
Yes, new programs: 0;
Yes, established programs: 1;
No: 1;
Don't know: 1;
Number of respondents: 3.
[End of table]
Q34. Did your state establish linkages between WIA and the state
apprenticeship unit in state FY 2004 for any of the following purposes?
a. Shared decision making;
Yes: 7;
No: 17;
Don't know: 2;
Number of respondents: 26.
b. Shared information gathering;
Yes: 15;
No: 10;
Don't know: 1;
Number of respondents: 26.
c. Shared information dissemination, including presentations;
Yes: 14;
No: 13;
Don't know: 0;
Number of respondents: 27.
d. Shared use of educational programs that satisfy specific
apprenticeship requirements;
Yes: 6;
No: 17;
Don't know: 3;
Number of respondents: 26.
e. Shared grant development activities;
Yes: 5;
No: 18;
Don't know: 3;
Number of respondents: 26.
f. Other;
Yes: 1;
No: 4;
Don't know: 1;
Number of respondents: 6.
[End of table]
Q37. Did your state have a mechanism for conducting formalized reviews
of apprenticeship programs that address on-the-job training, related
instruction, and/or program operations in state FY 2004?
Yes: 25;
No: 0;
Don't know: 0;
Number of respondents: 25.
[End of table]
Q38. Which of the following components (on-the-job training, related
instruction, and/or program operations) were included in these reviews?
a. Currency of on-the-job training with acceptable industry practice;
Yes: 22;
No: 3;
Don't know: 0;
Number of respondents: 25.
b. Relative continuity of employment for on-the-job training;
Yes: 25;
No: 0;
Don't know: 0;
Number of respondents: 25.
c. Provision of on-the-job training in all aspects of trades;
Yes: 25;
No: 0;
Don't know: 0;
Number of respondents: 25.
d. Consistency with standards for related instruction;
Yes: 25;
No: 0;
Don't know: 0;
Number of respondents: 25.
e. Currency of related instruction with acceptable industry practice;
Yes: 21;
No: 3;
Don't know: 1;
Number of respondents: 25.
f. Appropriateness of wages to actual hours of related instruction and
on-the-job training;
Yes: 24;
No: 1;
Don't know: 0;
Number of respondents: 25.
g. Establishment of criteria or guidelines for instructors;
Yes: 14;
No: 11;
Don't know: 0;
Number of respondents: 25.
h. Completion rates;
Yes: 23;
No: 2;
Don't know: 0;
Number of respondents: 25.
i. Cancellation rates;
Yes: 22;
No: 2;
Don't know: 1;
Number of respondents: 25.
j. Amount of time taken by apprentices to complete programs relative to
the time required for the program;
Yes: 18;
No: 7;
Don't know: 0;
Number of respondents: 25.
k. Maintenance of required records;
Yes: 25;
No: 0;
Don't know: 0;
Number of respondents: 25.
l. Other;
Yes: 1;
No: 1;
Don't know: 1;
Number of respondents: 3.
[End of table]
Q40. How often does your state conduct formalized reviews of individual
apprenticeship programs that address on-the-job training, related
instruction, and/or program operations?
Less frequently than every three years: 4;
Once every three years: 1;
Once every two years: 10;
Once a year: 7;
Twice a year: 2;
More than twice a year: 3;
Don't know: 0;
Number of respondents: 27.
[End of table]
Q42. Does your state have a mechanism for conducting formalized Equal
Employment Opportunity (EEO) reviews of individual apprenticeship
programs?
Yes: 24;
No: 3;
Don't know: 0;
Number of respondents: 27.
[End of table]
Q43. How often does your state conduct formalized Equal Employment
Opportunity (EEO) reviews of individual apprenticeship programs?
Less frequently than every three years: 2;
Once every three years: 3;
Once every two years: 8;
Once a year: 11;
Twice a year: 0;
More than twice a year: 0;
Don't know: 0;
Number of respondents: 24.
[End of table]
Q45. Did your state have procedures or policies for recording
complaints filed in state FY 2004 that were elevated to the level of
state apprenticeship agencies?
Yes: 22;
No: 3;
Don't know: 1;
Number of respondents: 26.
[End of table]
Q46a1. In your state, how many total complaints were referred to state
officials in state FY 2004?
Mean: 46;
Median: 1;
Minimum: 0;
Maximum: 699;
Number of respondents: 20.
[End of table]
Q46a2. Check if actual, estimate, or do not know or cannot estimate:
Check here if actual;
Count: 15;
Number of respondents: 22.
Check here if estimate;
Count: 5;
Number of respondents: 22.
Check here if do not know or cannot estimate;
Count: 2;
Number of respondents: 22.
[End of table]
Q46b1. How many complaints concerned termination in state FY 2004?
Mean: 10;
Median: 1;
Minimum: 0;
Maximum: 100;
Number of respondents: 20.
[End of table]
Q46b2. Check if actual, estimate, or do not know or cannot estimate:
Check here if actual;
Count: 16;
Number of respondents: 22.
Check here if estimate;
Count: 3;
Number of respondents: 22.
Check here if do not know or cannot estimate;
Count: 3;
Number of respondents: 22.
[End of table]
Q46c1. How many complaints concerned discrimination in state FY 2004?
Mean: 0;
Median: 0;
Minimum: 0;
Maximum: 2;
Number of respondents: 20.
[End of table]
Q46c2. Check if actual, estimate, or do not know or cannot estimate:
Check here if actual;
Count: 17;
Number of respondents: 21.
Check here if estimate;
Count: 1;
Number of respondents: 21.
Check here if do not know or cannot estimate;
Count: 3;
Number of respondents: 21.
[End of table]
Q46d1. How many complaints concerned wages in state FY 2004?
Mean: 2;
Median: 0;
Minimum: 0;
Maximum: 25;
Number of respondents: 20.
[End of table]
Q46d2. Check if actual, estimate, or do not know or cannot estimate:
Check here if actual;
Count: 15;
Number of respondents: 22.
Check here if estimate;
Count: 4;
Number of respondents: 22.
Check here if do not know or cannot estimate;
Count: 3;
Number of respondents: 22.
[End of table]
Q46e1. How many complaints concerned related instruction in state FY
2004?
Mean: 1;
Median: 0;
Minimum: 0;
Maximum: 5;
Number of respondents: 21.
[End of table]
Q46e2. Check if actual, estimate, or do not know or cannot estimate:
Check here if actual;
Count: 16;
Number of respondents: 21.
Check here if estimate;
Count: 4;
Number of respondents: 21.
Check here if do not know or cannot estimate;
Count: 1;
Number of respondents: 21.
[End of table]
Q46f1. How many complaints concerned on-the-job training in state FY
2004?
Mean: 1;
Median: 0;
Minimum: 0;
Maximum: 9;
Number of respondents: 23.
[End of table]
Q46f2. Check if actual, estimate, or do not know or cannot estimate:
Check here if actual;
Count: 17;
Number of respondents: 22.
Check here if estimate;
Count: 4;
Number of respondents: 22.
Check here if do not know or cannot estimate;
Count: 1;
Number of respondents: 22.
[End of table]
Q46g1. How many complaints concerned other issues in state FY 2004?
Mean: 37;
Median: 0;
Minimum: 0;
Maximum: 664;
Number of respondents: 19.
[End of table]
Q46g2. Check if actual, estimate, or do not know or cannot estimate:
Check here if actual;
Count: 16;
Number of respondents: 20.
Check here if estimate;
Count: 2;
Number of respondents: 20.
Check here if do not know or cannot estimate;
Count: 2;
Number of respondents: 20.
[End of table]
Q48. Which of the following were sources of data used to answer the
prior questions about complaints regarding apprenticeship programs in
the construction trade in state FY 2004?
a. Electronic statewide system;
Yes: 5;
No: 6;
Don't know: 1;
Number of respondents: 12.
b. Centralized listing, log, or other paper compilation;
Yes: 8;
No: 5;
Don't know: 1;
Number of respondents: 14.
c. Manual search of files;
Yes: 11;
No: 4;
Don't know: 1;
Number of respondents: 16.
d. Other;
Yes: 3;
No: 1;
Don't know: 0;
Number of respondents: 4.
[End of table]
[End of section]
Appendix V: Comments from the Department of Labor:
U.S. Department of Labor:
Assistant Secretary for Employment and Training:
Washington, D.C. 20210:
AUG 12 2005:
Mr. Sigurd R. Nilsen:
Director:
Education, Workforce, and Income Security Issues:
U.S. Government Accountability Office:
441 G Street, N.W.:
Washington, D.C. 20548:
Dear Mr. Nilsen:
The Employment and Training Administration (ETA) is in receipt of the
draft Government Accountability Office (GAO) report entitled, "Labor
Can Better Use Data to Target Oversight" (GAO-05-886).
ETA's goal of providing businesses and workers with skills for the 21st
century will be enhanced by strategically using data to improve
apprenticeship programs. We support the report's recommendations and
are providing an overview of efforts that we will undertake or have
already instituted to address these recommendations.
1) GAO Recommendation:
Better utilize information in DOL's database, such as indicators of
program performance, for management oversight, particularly for
apprenticeship programs in occupations with expected future labor
shortages.
ETA Response:
The Department concurs with this recommendation. ETA will use its
existing data resources as well as data from the Bureau of Labor
Statistics (BLS) to identify occupations with skill shortages to help
us better target our program performance and oversight activities. We
will seek input from DOL's Advisory Committee on Apprenticeship
regarding industry labor shortages. ETA will expand its use of the
Registered Apprenticeship Information System (RAIS) indicators along
with implementation of WebCEO, a data mining tool.
2) GAO Recommendation:
Develop a cost-effective strategy for collecting data from council-
monitored states.
ETA Response:
The Department agrees with this recommendation. It is our desire to
have the most complete national apprenticeship data possible. Efforts
underway have resulted in two additional State Apprenticeship Agency
(SAA) states agreeing to participate in RAIS. Kentucky started using
RAIS in June of this year and North Carolina is in the process of
converting to RAIS. We are hopeful that these early successes will
culminate in other states joining the system in the near future.
Targeted resources are being utilized to make this a priority for the
Department.
ETA instituted an Apprentice Electronic Registration (AER) process for
RAIS in October 2004. This new feature is offered to SAA states as a
cost-effective measure to improve data integrity and efficiency of
apprenticeship data collection because the sponsor will enter the data.
ETA has been in negotiations with five SAA states since this process
went on-line.
3) GAO Recommendation:
Conduct reviews of apprenticeship activities in states that regulate
their own programs on a regular basis to ensure that state activities
are in accord with Labor's requirements for recognition of
apprenticeship programs.
ETA Response:
The Department agrees with this recommendation. During Fiscal Year
2005, ETA staff conducted 10 SAA state reviews; 13 additional reviews,
including the District of Columbia, will be completed by September 30,
2005. The Department's goal for Fiscal Year 2006 is to complete the
remainder of the reviews in SAA states.
ETA's strategic plan for future reviews is to complete one-third of the
SAA states each year. This three-year cycle will provide the necessary
oversight to ensure that SAA states continue to meet the Department's
requirements to maintain recognition for federal purposes.
4) GAO Recommendation:
Offer substantive feedback to states after reviews.
ETA Response:
The Department agrees with this recommendation. Final reports of the
SAA reviews will provide additional feedback and technical assistance.
In addition, the Department will institute an improved follow-up
process to ensure recommendations are implemented.
Please let us know if we can be of further assistance.
Sincerely,
Signed by:
Emily Stover DeRocco:
[End of section]
Appendix VI: GAO Contact and Staff Acknowledgments:
GAO Contact:
Sigurd R. Nilsen (202) 512-7215:
Staff Acknowledgments:
Patrick DiBattista, Assistant Director, Scott Heacock, Linda W. Stokes,
and Kathleen D. White managed all aspects of the assignment. The
following individuals made significant contributions to this report:
Susan Bernstein, Jessica Botsford, Richard Burkard, Cathy Hurley, and
Jean McSween.
[End of section]
Related GAO Products:
Workforce Investment Act: Substantial Funds Are Used for Training, but
Little Is Known Nationally about Training Outcomes. GAO-05-650.
Washington, D.C.: June 2005.
Public Community Colleges and Technical Schools: Most Schools Use Both
Credit and Noncredit Programs for Workforce Development. GAO-05-4.
Washington, D.C.: October 2004.
Registered Apprenticeships: Labor Could Do More to Expand to Other
Occupations. GAO-01-940. Washington, D.C.: September 2001.
Youth Training. PEMD-94-32R. Washington, D.C.: September 1994.
Apprenticeship Training: Administration, Use, and Equal Opportunity.
HRD-92-43. Washington, D.C.: March 1992.
FOOTNOTES
[1] Labor's database also includes data from some federally registered
programs in council-monitored states.
[2] Most apprenticeship programs in construction require 4 years to
complete. In our analysis, we allowed for 6 years, to account for slow
work periods and other delays.
[3] Enrollment increased through fiscal year 2000, reaching a total of
59,625. Enrollment then declined from 2001 through 2004, with 36,325
apprentices enrolling in fiscal year 2004.
[4] Apprentices entering programs during 1994 to 1998 would be expected
to complete these programs by 2000 to 2004 unless they dropped out.
GAO's Mission:
The Government Accountability Office, the investigative arm of
Congress, exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability
of the federal government for the American people. GAO examines the use
of public funds; evaluates federal programs and policies; and provides
analyses, recommendations, and other assistance to help Congress make
informed oversight, policy, and funding decisions. GAO's commitment to
good government is reflected in its core values of accountability,
integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through the Internet. GAO's Web site (www.gao.gov) contains
abstracts and full-text files of current reports and testimony and an
expanding archive of older products. The Web site features a search
engine to help you locate documents using key words and phrases. You
can print these documents in their entirety, including charts and other
graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document
files. To have GAO e-mail this list to you every afternoon, go to
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order
GAO Products" heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. Government Accountability Office
441 G Street NW, Room LM
Washington, D.C. 20548:
To order by Phone:
Voice: (202) 512-6000:
TDD: (202) 512-2537:
Fax: (202) 512-6061:
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470:
Public Affairs:
Jeff Nelligan, managing director,
NelliganJ@gao.gov
(202) 512-4800
U.S. Government Accountability Office,
441 G Street NW, Room 7149
Washington, D.C. 20548: