
BLS WORKING PAPERS
U.S. DEPARTMENT OF LABOR
Bureau of Labor Statistics
OFFICE OF RESEARCH
AND EVALUATION
Informal Training: A Review of Existing Data and Some New Evidence
Mark A. Loewenstein and James R. Spletzer
Working Paper 254
June 1994
The authors would like to thank Marilyn Manser, Michael Pergamit, and Jonathan Veum for helpful comments. The views
expressed here are those of the authors and do not necessarily reflect the views of the U.S. Department of Labor or the Bureau of
Labor Statistics.
Formal and Informal Training:
Evidence from the NLSY
May 11, 1998 (revised from June 1994 version)
Forthcoming: Research in Labor Economics
Mark A. Loewenstein
Bureau of Labor Statistics
2 Massachusetts Ave NE, Room 4130
Washington D.C. 20212
Phone: 202-606-7385
E-Mail: [email protected]
James R. Spletzer
Bureau of Labor Statistics
2 Massachusetts Ave NE, Room 4945
Washington D.C. 20212
Phone: 202-606-7393
E-Mail: [email protected]
The views expressed here are those of the authors and do not necessarily reflect the views of the
U.S. Department of Labor or the Bureau of Labor Statistics. We thank John Bishop, George
Jakubson, Duane Leigh, an anonymous referee, and participants at the November 1996 ILR-Cornell Institute for Labor Market Policies Conference “New Empirical Research on Employer
Training: Who Pays? Who Benefits?” for helpful comments on a previous draft of this paper.
ABSTRACT
Although economists have long been aware of the importance of a worker's on-the-job human
capital investments, current knowledge about the quantity of on-the-job training and its returns is
still relatively scant. This paper analyzes the formal and the informal training information from
four commonly used surveys, paying particular attention to the 1993 and 1994 data from the
National Longitudinal Surveys of Youth. After accounting for differences across surveys with
regard to their sample population and the reference period over which training is measured, the
formal training responses appear quite consistent across data sets. The same cannot be said for
informal training, and the variation in the incidence of informal training across surveys appears to
be due to unsatisfactory routing patterns in the household survey questionnaires. Nevertheless, the
new 1993 and 1994 NLSY informal training data appear to be capturing quite a bit of human
capital accumulation that is missed by the formal training questions. When used together, the
formal and informal training measures can explain much, but not all, of an individual's within job
wage growth.
I. Introduction
As suggested some time ago by Becker (1962) and Mincer (1962), on-the-job training
investments are likely responsible for a significant part of the wage growth that occurs in the early
years of tenure. While economists have long been aware of the importance of on-the-job training,
current knowledge about its quantity and its returns is still relatively scant. This is in large part
due to the fact that much on-the-job training is informal.
The National Longitudinal Surveys of Youth (NLSY) have been obtaining information on
formal training for some time. Used in conjunction with the wealth of information that the NLSY
contains on individual demographic characteristics, employment history, schooling, and ability,
these data offer a great advantage for the study of the acquisition of and the returns to training.
The NLSY included questions about informal training for the first time in 1993. These questions
were then repeated in the 1994 survey. In this paper, we analyze the formal and the informal
training information from the 1993 and 1994 NLSY data.
We begin with a general critique of the training information in four commonly used surveys:
the Current Population Survey (CPS), the National Longitudinal Survey of the High School Class
of 1972 (NLS-72), the Employer Opportunity Pilot Project (EOPP), and the National Longitudinal
Surveys of Youth (NLSY). We highlight how differences in training incidence and duration across
surveys can be explained by sample differences, different reference periods, and differences in
question wording. We conclude that the formal training responses appear to be quite consistent
across all four surveys, but that the CPS and NLS-72 are missing most informal training (while the
pre 1993 National Longitudinal Surveys of Youth make no attempt to measure informal training).
With this comparison as background, we analyze the new informal training data in the 1993 and
1994 NLSY. We find that the routing pattern in the survey limits the usefulness of the new NLSY
informal training data for estimating the incidence and duration of informal training. Nevertheless,
the NLSY informal training data do capture a sizable number of episodes of skill upgrading that
the formal training questions miss.
We then turn to an analysis of the relationship between wages and training. Our intent here is
twofold: first, to estimate the separate effects of formal and informal training; and second, to
determine the effect of omitting such a potentially important variable as informal training, which is
unavailable in some data sets. We use both the NLSY and the EOPP datasets in our wage growth
analysis. Interestingly, when used together, the formal and informal training measures explain
much, but certainly not all, of the wage growth experienced by individuals in our data. We then
conclude our analysis with a discussion of how individuals learn and become productive at their
jobs, and we offer our opinion of how surveys might best measure this human capital
accumulation.
II. A Critical Review of What is Currently Known about Training
As mentioned in the introduction, much of our lack of knowledge regarding on-the-job training
is attributable to the fact that much of this training is informal. As Brown (1989) notes:
While there are difficulties in measuring formal training, what we would like to measure is
relatively well-defined: an individual is either in a training program or not, formal training has an
identifiable start and end, and one should in principle be able to determine either how many hours
the worker spent or how many dollars the employer spent on any particular training program.
In contrast, informal training is produced jointly with the primary output of the worker, and is
therefore more elusive. Workers learn from watching other workers, may share easier ways to do
the work either while working or during breaks, and are indirectly instructed whenever a supervisor
constructively criticizes their work. Knowing whether informal training is happening in any given
week is difficult to determine; one hopes that for most workers it never ends. The dollar cost is
elusive not only because the time spent by supervisors and other workers is not logged, but also
because the worker's productivity is also likely to be reduced while in training...
Several existing surveys have attempted to obtain explicit information about whether workers
have participated in various training activities, and if so, how much time is devoted to these
training spells.1 The questions used to measure training in several of the most widely used surveys
are summarized in Table 1.
The Current Population Survey (CPS) is one source of information on both formal and informal
training.2 This information comes from individuals' responses to two questions. First, individuals
were asked, "Since you obtained your present job did you take any training to improve your
skills?" Analysis of the data reveals that forty-four percent of workers in the CPS indicated that
they had received training. These individuals were then asked, "Did you take the training in school,
a formal company training program, informal on the job, [and/or] other?" Further analysis of the
data indicates that the incidence of informal training is sixteen percent.
The National Longitudinal Survey of the High School Class of 1972 (NLS-72) is a second data
set with information on formal and informal training.3 Individuals who held a full-time job
between October 1979 and February 1986 were asked the question, "Considering the most recent
full-time job you have held, did you receive or participate in any type of employer-provided
training benefits or training programs?" Forty-six percent of individuals indicated that they had
participated in some type of training program. These individuals were then asked to indicate the
number of hours per week and the total number of weeks that they spent in the following types of
programs: "formal registered apprenticeship, employer-provided job training during hours on employer premises, informal on-the-job training (e.g., assigned to work with someone for instruction or guidance, etc.), employer-provided education or training during working hours away from employer premises, or other." Further analysis of the data reveals that the incidence of informal training is twenty percent.

Footnote 1: The existing training measures can be divided into two categories. Some surveys attempt to measure the explicit costs of training by inquiring about the incidence and duration of various training activities. Other surveys ask how long it takes workers to become fully productive in their jobs as a means of measuring the implicit costs of training via the time devoted to "learning by doing." This paper focuses on explicit measures of training. We will return to a discussion of the learning by doing approach in the conclusion.

Footnote 2: The CPS is a monthly survey of approximately 60,000 households that provides information for the Bureau of Labor Statistics' monthly report on the nation's employment and unemployment situation. In January 1983 and January 1991, the survey obtained supplementary information about individuals' training. These training data have been used by Bowers and Swaim (1994), Constantine and Neumark (1994), Lillard and Tan (1992), Loewenstein and Spletzer (1997a), and Pergamit and Shack-Marquez (1987). The sample we use in this paper is that used by Loewenstein and Spletzer (1997a).

Footnote 3: The NLS-72 is a Department of Education survey of 22,652 people who were high school seniors during the 1971-72 academic year. 12,841 of the individuals in the initial survey were re-surveyed in 1986, and training information was obtained in this 1986 follow-up survey. See Altonji and Spletzer (1991) for further details and a complete analysis of the incidence and returns to training. The sample we use in this paper is that used by Altonji and Spletzer (1991).
A third public use data set with information on training is an employer survey that was carried
out in conjunction with the Employment Opportunity Pilot Project (EOPP).4 Unlike the CPS
supplement and the NLS-72 survey, which did not attempt to obtain very detailed information
about training activities in general and informal training in particular, the EOPP survey asked
employers about several specific types of formal and informal training. Each employer was asked
“Is there formal training, such as self-paced learning programs or training done by specially trained
personnel ...” In addition, referring to the first three months of employment, the employer was
asked about the total number of hours a) “the average new employee spends in training activities
in which he or she is watching other people do the job rather than doing it himself," b)
“management and line supervisors spent away from other activities giving informal individualized
training or extra supervision to (the) typical worker,” and c) “co-workers who are not supervisors
spent away from their normal work giving informal individualized training or extra supervision to
(the) typical worker.” The incidence of formal training is thirteen percent [question 2 in table 1],
and the probability that a new worker spends some time watching others, receiving informal
training from management and line supervisors, or receiving informal training from co-workers is ninety-six percent [questions 1, 4, and 5 in table 1].

Footnote 4: The training questions are asked in the second of a two-wave longitudinal survey; the sample in the second wave survey consists of 2,625 employers. See Barron, Black, and Loewenstein (1987, 1989, 1993) and Bishop (1988, 1991) for further details of the EOPP and an analysis of training and wages. The sample we use in this paper is that used by Loewenstein and Spletzer (1997b). The training questions in Barron, Berger, and Black’s (1997) recent surveys for the Small Business Administration and for the Upjohn Institute are very similar to those in the EOPP survey and (when comparable) yield similar estimates of incidence and duration.
The National Longitudinal Survey of Youth (NLSY) is a fourth data set with information on
training.5 Individuals who have worked during the last year are asked: “Since [date of the last
interview] did you attend any training program or any on-the-job training program designed to help
people find a job, improve job skills, or learn a new job?” An analysis of the 1993 and 1994 data
indicates that the average annual incidence of formal training is seventeen percent. The informal
training data from the 1993 and 1994 NLSY surveys will be described in the next section of this
paper.
We have summarized the incidences of formal and informal training from the various surveys in
the top panel of Table 2. One sees that the incidence of formal training varies from approximately
fifteen percent in the EOPP and the NLSY to approximately forty-five percent in the CPS and the
NLS-72. The variation in informal training incidence across data sets is even greater than the
variation of formal training incidence, ranging from sixteen to twenty percent in the CPS and the
NLS-72 to ninety-six percent in the EOPP. The large variation in training incidence across
surveys is somewhat disconcerting at first glance, but upon further reflection is not so surprising
since the surveys differ in their sample population, in the reference period over which training is
measured, in the concepts covered by the use of the word training, in whether the survey respondent is an employee or employer, and in the routing patterns in their questionnaires.

Footnote 5: The NLSY is a dataset of 12,686 individuals who were aged 14 to 21 in 1979. The sample size was reduced to 11,607 in 1985 when interviewing of the full military sample ceased, and in 1991, the sample was further reduced to 9,964 persons when the economically disadvantaged white supplemental sample was eliminated. These youths have been interviewed annually since 1979, and the response rate has been 90 percent or greater in each year. In each year between 1988 and 1994, the NLSY has collected detailed information on formal training. Between 1979 and 1986, information was only obtained on formal training spells that lasted longer than one month. Analysis of the 1993 data indicates that 81.6 percent of training spells completed during the year are less than or equal to four weeks in duration. There were no training questions in the 1987 survey. The NLSY data have been used extensively to examine the relationship between training, tenure, wages, and mobility: see Lynch (1991a, 1991b, 1992) and Royalty (1996) for an analysis of the early years of the NLSY data, and see Loewenstein and Spletzer (1996, 1997a, 1997b, 1998) and Veum (1993, 1995) for an analysis of the more recent years.
Some of the variation in training incidence can likely be explained by the fact that the various
surveys have different sample populations. While the CPS sample is a representative cross section
of the employed, the workers in the NLSY sample used here are aged 27-38, and the workers in the
NLS-72 are aged 31 and are all high school graduates.6 And the EOPP survey asks about the
training received by the most recently hired worker. A sample of workers whom employers most
recently hired will tend to have a disproportionate number of younger and higher turnover workers.
While the relationship between age and training is uncertain, one would certainly expect higher
turnover workers to be less likely to receive training (for example, see Loewenstein and Spletzer
(1996, 1997a) for a theoretical discussion of this point). In addition, the EOPP survey deliberately
oversampled employers with a relatively high proportion of low wage workers. This last
consideration would also lead one to expect a lower training incidence in the EOPP data, as there is
pretty strong evidence across data sets that workers in higher training positions receive higher
wages.
While differences in sample composition may explain some of the difference in training
incidence among the surveys, they surely cannot explain most of it. One way to see this is to
take advantage of the large sample size of the CPS survey and attempt to “replicate” the
sample populations of the other surveys. The results of this exercise can be seen in the
bottom panel of table 2. Note that restricting the age group in the CPS sample has little effect on
training incidence. In contrast, restricting the sample on the basis of tenure has a marked effect on
training incidence. For example, when one restricts the sample to have tenure of three months or
less, the incidence of formal training falls by over half from 44 percent to 20.5 percent, a figure that is much closer to the EOPP formal training incidence of 13 percent. However, this reduction in training incidence is not so much due to a sample composition effect as it is to a training reference period effect.

Footnote 6: Our computations of incidence use unweighted data, and therefore do not adjust for survey nonresponse. The mere fact that workers in the NLS-72 went through the trouble of filling out a long and detailed mail survey means that the NLS-72 sample used here is undoubtedly unrepresentative of the population it is meant to portray.
As can be seen from table 2, the reference period in the training questions varies across surveys.
While the CPS and the NLS-72 ask about all past training in the current job, the NLSY asks about
training in the past year and the EOPP asks about training in the first three months of the job.
Loewenstein and Spletzer (1997a) use the longitudinal aspect of the NLSY survey to transform the
NLSY training question from a “last year” concept to a “current job” concept. They find that
while the annual incidence of formal training for persons in their third or fourth year of tenure is 18
percent, 35 percent of persons in their third year of tenure have received formal training in their
current job and 43 percent of persons in their fourth year of tenure have received formal training in
their current job. These constructed statistics from the NLSY are remarkably similar to those from
the CPS -- see Loewenstein and Spletzer (1997a).7 Thus, a sizable part (if not all) of the higher
formal training incidence in the CPS and the NLS-72 can clearly be traced to the fact that their reference period is much longer.

Footnote 7: Individuals with more than one year of tenure are not a random sample of workers who started the job. In cross-sectional data, this matching process between workers and employers could generate the observed positive relationship between tenure and the probability of ever having received training even if all training occurs at the start of the job. Loewenstein and Spletzer (1997a) show that delayed formal on-the-job training is the norm rather than the exception, which implies that the positive cross-sectional relationship between tenure and the probability of ever having received training is not entirely due to sample composition.
The analysis in Loewenstein and Spletzer (1997a) also provides some empirical evidence relating to recall bias. The effects of recall bias on the probability of correctly reporting the incidence of training are likely to be more serious for workers with greater tenure because the greater the period of time between a worker’s last training spell and the survey date, the more likely the worker is to have forgotten that he received training. The finding that the probability of ever having received training increases with tenure indicates that delayed training dominates any potential recall bias. Furthermore, the problem of recall bias is mitigated by the findings of Mincer (1988), Altonji and Spletzer (1991), Lynch (1992), and Loewenstein and Spletzer (1996), who find that training is positively correlated over time, implying that workers who are trained early in their job tenure are likely to be retrained periodically over time.
The formal training questions in the various surveys differ in another important way. The
questions in the NLS-72 and, especially the CPS, are broader in that they allow formal schooling
to be included as training. Questions about formal schooling are asked in another part of the
survey and are not explicitly part of the training data collected in the NLSY. Similarly, the
training question in the EOPP also appears to exclude formal schooling. Our analysis of the CPS
and the NLS-72 indicates that if schooling (as measured by CPS question 2a in table 1 and NLS-72 question 2e in table 1) is excluded as a training category, the incidence of training (as measured
by CPS question 1 in table 1 and NLS-72 question 1 in table 1) falls by twenty percent in the CPS
(from 44.1 percent to 35.5 percent) and by six percent in the NLS-72 (from 45.7 percent to 42.8
percent). This difference in concepts regarding what constitutes training further helps to reconcile
the difference between the CPS and the NLS-72 relative to the EOPP and the NLSY formal
training measures.8
To summarize, when one takes into account the differences in sample populations, differences
in reference periods, and differences in concepts underlying the definition of formal training, the
formal training responses appear quite consistent across data sets. The evidence from the various
surveys suggests that approximately 45 percent of workers have received formal training while on
their current job, and the annual incidence of formal training appears to be about 17 percent. What can we conclude about the incidence of informal training?

Footnote 8: Note that the CPS training question in table 1, “Since you obtained your present job did you take any training to improve your skills?” does not explicitly ask about “formal training”. However, the wording, “take any training to improve your skills” certainly implies formal training. Rather than treating affirmative responses to question 1 as an indicator of formal training in the CPS, one might argue that we should take affirmative responses to the follow up question 2b, “Did you take the training in a formal company training program?” as an indicator of formal training. In this case, the concept of the CPS formal training question would be narrower than the formal training concepts implied in question 2 of table 1 for the EOPP survey. After adjusting for the longer reference period in the CPS, formal training in the CPS as measured by this alternative construct would be lower than in the EOPP. Similar comments apply to the NLS-72 and the NLSY formal training measures.
While the incidence of informal training is sixteen percent in the CPS and twenty percent in the
NLS-72, it is ninety-six percent in the EOPP. Thus, in spite of their longer reference period and
the notable difference in sample populations (higher tenure and higher wage samples), the incidence
of informal training is markedly lower in the CPS and the NLS-72 than in the EOPP. One possible
explanation might lie in the fact that the EOPP is an employer survey, while the CPS and the NLS-72 are individual surveys. After all, there may well be some ambiguity as to whether the time that
a supervisor spends with a new worker constitutes training or merely supervision and monitoring.
The low incidence of reported informal training in the employee surveys might lead one to suspect
that workers simply underestimate the amount of informal training that they receive from
supervisors and co-workers. Interestingly, however, in a survey funded by the Upjohn Foundation,
Barron, Berger, and Black (1997) ask the EOPP type training questions of both new workers and
their employers. Although conclusions from this survey should be tempered by its small sample
size, it is noteworthy that while workers' responses are only imperfectly correlated with those of
their employers, employers' training estimates are not appreciably higher than those of their
workers. Indeed, there is an alternative explanation for the low incidence of informal training in
the CPS and the NLS-72.
We believe quite strongly that the peculiar routing patterns in the CPS and the NLS-72 surveys
can explain their relatively low incidences of informal training. For example, in the CPS, only if
an individual answered that he had taken training to improve his skills was he routed into a follow-up question that allowed him to indicate that he received informal training. It is likely that an
individual who received informal training, but who did not actively take formal training, would not
respond yes to the initial incidence question. The same point applies to the routing pattern in the
NLS-72 survey.
Besides their routing patterns, the actual wording of the informal training questions may also
play a role in the CPS and the NLS-72’s low incidence of informal training. The CPS does not
give the respondent any indication as to what is meant by informal training and the NLS-72 only
lists one example. In contrast, the EOPP survey explicitly asks about different types of informal
training (e.g., time the new hire spends watching others).
Given the differences in survey construction, one might expect that the EOPP survey would be
likely to pick up short (and perhaps relatively insignificant) training spells that are missed by the
CPS and the NLS-72 surveys. Indeed, conditional on receiving informal training, the average
length of an informal training spell is 233 hours in the NLS-72 while the mean number of total
hours devoted to informal training (the sum of watching others, informal training by supervisors,
and informal training by managers) is only 143 in the EOPP.9 However, this comparison is
misleading. Recall that the EOPP and the NLS-72 have different reference periods. Because
EOPP only asks about training during the first three months of employment, training spells that
last longer than 12 weeks will be censored. In fact, evidence from Barron, Berger, and Black’s
(1997) SBA survey suggests that about one-third of all spells in the EOPP are censored.
Interestingly, when one performs the experiment of censoring spells in the NLS-72 at twelve
weeks, the mean number of hours of informal training falls to 116, which is roughly comparable to
that in the EOPP. In fact, as seen in figure 1, the distributions of the informal training hours are
nearly identical in the two data sets.10 For completeness, we present the formal training durations
(conditional on positive incidence with censoring at 12 weeks) in figure 2. The distributions of
formal training hours are remarkably similar across surveys.
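To make the censoring experiment just described concrete, the following is a minimal sketch of the computation, not the authors' code; the file name and column names (hours_per_week, weeks) are hypothetical stand-ins for an extract of NLS-72 informal training spells.

```python
import pandas as pd

# Hypothetical extract of NLS-72 informal training spells, one row per worker
# reporting informal training, with hours per week and total weeks.
spells = pd.read_csv("nls72_informal_spells.csv")

# Mean total hours conditional on receiving informal training (text reports 233).
spells["total_hours"] = spells["hours_per_week"] * spells["weeks"]
print(spells["total_hours"].mean())

# Mimic the EOPP three-month reference period by censoring spell length at 12 weeks
# (text reports a censored mean of 116 hours).
spells["total_hours_censored"] = spells["hours_per_week"] * spells["weeks"].clip(upper=12)
print(spells["total_hours_censored"].mean())
```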
In conclusion, one has to be extremely skeptical of the informal training information in the CPS
and the NLS-72. One strongly suspects that these surveys are simply missing most informal
training. Indeed, because it is hard to imagine a job that does not have some informal training,
especially at the start of the job, the EOPP incidence of 96 percent appears quite reasonable. Thus,
while the existing household surveys appear to be doing a reasonable job measuring formal
training, their information on informal training appears to be woefully inadequate.

Footnote 9: This EOPP statistic may well be an overstatement because of possible double counting across the individual training components. The first training question in Table 1 occurs in a different part of the survey than the last four questions. Interestingly, this possible problem does not occur in the SBA survey, and the mean duration of training in the SBA survey is very similar to that in the EOPP survey.

Footnote 10: The conclusion that the distributions of the informal training durations are similar in the EOPP and the NLS-72 is simplistic in that we have not attempted to control for differences across the surveys. For example, even after censoring the NLS-72 durations at 12 weeks, we would still expect the reference period to have an effect on the comparison. Since the NLS-72 is asking about all training that has accumulated while on the current job whereas EOPP is asking only about training in the first three months, one would expect the mean weeks to be longer in the NLS-72, and thus one would expect the probability of censoring at 12 weeks to be higher in the NLS-72 (however, only 23 percent of NLS-72 informal training durations are 12 weeks or longer). Another effect of the longer reference period in the NLS-72 is that respondents might recall the spell of informal training with the highest intensity as measured by hours per week.
III. The New NLSY Informal Training Data
IIIa. The Survey Questions
In an attempt to improve our knowledge about informal on-the-job training, the National
Longitudinal Survey of Youth (NLSY) began asking detailed questions about informal on-the-job
training in the 1993 survey, and then repeated these questions in the 1994 survey. The informal
training information is collected in two different parts of the survey. The first section is intended to
measure training at the start of the job, and is asked in the employer supplements. The second
section is intended to measure training that occurred in the previous 12 months, and is asked
immediately following the sequence of formal training questions.
Referring to the duties the worker currently does, the survey leads into the training questions at
the start of the job by asking, "When you started doing this kind of work for [employer name],
about what percentage of the duties you currently do were you able to perform adequately?" If an
individual responds that he was not able to perform all of his duties adequately, he is then asked
about how he learned to perform his job duties. He is asked whether he participated in classes or
seminars, whether his supervisor and/or coworkers showed him how to do his job, or whether he
used self-study materials. An individual who responds that he received a particular type of training
is then asked how many weeks and how many hours per week he participated in the training
activity. In the rest of this paper, we will refer to this training as "Start Job Training." The start
job training questions are presented in the left-hand column of table 3.11

Footnote 11: To avoid cluttering table 3, we have taken some liberties when presenting the NLSY informal training questions: we do not present certain intervening questions nor do we present the questions for “other” training.
As shown in table 3, if the individual responds that he was able to perform 100 percent of his
duties adequately when he started his current work, he is not asked the sequence of "Start Job
Training" questions. This routing pattern is specific to the 1993 survey only, and was changed in
1994 so that all individuals were asked the start job training questions. The effects of this change
will be examined in the empirical work below.
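The 1993 and 1994 routing rules just described can be summarized schematically. The sketch below is only an illustration of the skip pattern; the function and variable names are invented and do not correspond to the actual survey instrument.

```python
def asked_start_job_training(survey_year: int, pct_duties_adequate: float) -> bool:
    """Return True if a respondent in a new job is routed into the Start Job
    Training questions.

    In 1993, respondents who reported being able to perform 100 percent of their
    current duties adequately were skipped past these questions; in 1994, all
    respondents with a new job were asked them.
    """
    if survey_year == 1993:
        return pct_duties_adequate < 100
    return True
```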
After completing the formal training questions, an individual in the 1993 and 1994 NLSY is
also asked whether he had to learn new job skills in the past 12 months because of any of the
following changes at work:
(His) employer introduced a new product or service
(His) employer introduced new equipment and/or repair procedures
(His) employer needed to upgrade employees' basic skills such as math, reading, or writing
(His) employer needed employees to acquire or upgrade their computer skills
Work teams were created or changed
(His) work site was reorganized in other ways
Changes have occurred in (his) employer's policies such as compensation, benefits,
pensions, and safety
New government regulations went into effect
Changes have occurred in work rules for reasons other than new government regulations.
If the respondent answers that there has been a change at work that made it necessary to learn new
job skills, then he is asked whether he learned these skills by participating in classes or seminars,
whether his supervisor and/or co-workers showed him how the changes at work would affect his
job, or whether he made use of self-study materials. The individual who responds that he received
training is then asked how many weeks and how many hours per week he participated in the
training activity. In the rest of this paper, we will frequently find it convenient to refer to this
training as "New Skills Training." The questions about "New Skills Training" are listed on the
right hand side of table 3. Note that they parallel the questions about "Start Job Training."
We will often use the label "informal training" to distinguish the two new sequences of training
questions from the sequence of formal training questions.12 This is partly motivated by the fact
that the “New Skills Training” questions in particular are designed to measure the more informal
types of training that were not already recorded in the preceding sequence of formal training
questions. Note, however, that the categories "Start Job Training" and "New Skills Training" are
both quite broad and that the individual training components are of varying degrees of informality.
While training where supervisors or co-workers show an individual how to do his job corresponds
quite closely to what is typically meant by "informal training," classes and seminars and possibly
self-study might perhaps be thought of as "formal training."
The new NLSY informal training questions have some features in common with the training
questions from the EOPP survey and some features in common with the CPS and the NLS-72
training questions. Similar to the EOPP survey, the new NLSY questions ask explicitly about
various sources of training: classes and seminars, instruction provided by supervisors and/or coworkers, and self-study. In contrast, the CPS and the NLS-72 only ask about an individual's
"informal" training, leaving the definition of informal up to the respondent. The new NLSY training questions are similar to the NLS-72 questions in that both employ incidence screener questions when attempting to measure duration: individuals are asked to report the time spent being trained only if they responded that they had received a specific type of training. The EOPP survey, on the other hand, encompasses both duration and incidence (zero hours of duration) in a single question by asking about the total hours spent in a particular type of training.

Footnote 12: Although we label “Start Job Training” and “New Skills Training” as informal training, these new NLSY training questions are ideal for analyzing topics such as workers’ responses to technological change, government regulation, and workplace transformation. In particular, see Leigh and Gifford (1996).
The new NLSY informal training questions also have a much more serious variant of the
incidence screener. Specifically, the question about the percentage of duties that the individual was
able to perform adequately and the question on whether changes occurred at work can also be
interpreted as incidence screener questions. In the 1993 survey, individuals who were able to
perform all their duties adequately at the start of the job are not routed into the detailed start job
training questions. In both the 1993 and 1994 surveys, individuals who did not experience changes
at work within the past 12 months are not routed into the detailed training questions. The routing
patterns implicit in these new question sequences assume that these individuals have not received
training and almost certainly diminish their usefulness as a source of information regarding on-the-job skill acquisition.
The incidence screener is likely to be especially problematic in the case of Start Job Training.
Even workers who have the requisite skills to initially perform their duties adequately may receive
job orientation help from supervisors and co-workers. And training may make workers who can
perform their tasks adequately even more productive. In contrast, most New Skills Training is
likely to fit into one of the classifications specified in the changes at work question, although one
should be worried that this question may miss some episodes of skill upgrading not accompanied
by a major change in an individual’s job duties.
IIIb. Descriptive Statistics of the NLSY Informal Training Data
In Table 4a, we present descriptive statistics concerning the incidence and duration of Start Job
Training. The top panel is based on data from 1993, and the bottom panel is based on data from
1994. The key distinction between these two years is dictated by the routing pattern of the Start
Job Training questions. In 1993, only individuals who responded that they were able to perform less
than 100 percent of their duties adequately at their new job were asked the training questions,
whereas in 1994, all individuals with a new job answered the training questions.13
In 1993, a worker was only asked about his Start Job Training if he indicated that he was not
initially able to perform all of his current job duties. As presented in column 2 of table 4a, nearly
all workers (98.18 percent) in their first year of tenure who were not initially able to perform all of
their job duties received Start Job Training. This incidence rate appears to be closer to that from
the EOPP data than to that from the CPS or the NLS-72 data. Of course, the implicit
assumption in the routing pattern of the NLSY questionnaire appears to be that workers who were
able to perform all of their duties adequately when they started their job received no informal
training. Under this assumption, only 27.10 percent of those persons in their first year of tenure in
1993 received training at the start of their job (see column 1 of table 4a), an incidence rate that is
closer to that in the CPS and the NLS-72 than to that in the EOPP. Therefore, the new 1993
NLSY informal training questions do not help us resolve the existing confusion regarding how well
household surveys measure the incidence of informal training.
Footnote 13: In 1993, the question regarding what percentage of duties an individual performed adequately was asked of all persons. In many cases, this is a retrospective measure since only one quarter of working individuals in the 1993 survey indicated that they have been in their job less than one year. In 1994, this question was asked only of persons who had started a new job or started new duties within the previous year. In order to make the responses consistent across surveys, we have restricted the data in each year to those individuals who are in their first year of tenure.
As shown in the bottom panel of Table 4a, 79.89 percent of workers in 1994 received Start Job
Training. This percentage is calculated from all workers in their first year of tenure, regardless of
whether they could or could not initially perform all of their current duties adequately. Comparing
the top and bottom panels of table 4a leads to the conclusion that the implicit presumption in the
1993 routing pattern that only individuals who could not initially perform all of their duties
correctly receive Start Job Training is clearly incorrect. Many of the individuals who indicated
that they could initially perform all of their duties adequately did, in fact, receive some Start Job
Training in 1994. Thirty-two percent of individuals in 1994 were not fully comfortable at the start
of their job, and the incidence of Start Job Training for these persons is 96.51 percent. These
statistics are very similar to the ones for 1993. What we are able to compute from the 1994 data,
but not from the 1993 data, is that the incidence of Start Job Training is 72.15 percent for the 68
percent of individuals who were fully comfortable at the start of their job (the 79.89 percent figure
in the bottom panel of table 4a is a weighted average of 96.51 percent and 72.15 percent). It is
worth noting that, as one would expect, individuals who were not initially fully comfortable have a
higher incidence of informal training than those individuals who could initially perform all of their
duties adequately.
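As a quick check on the weighted average reported above, using the rounded shares of 32 percent and 68 percent given in the text, $0.32 \times 96.51 + 0.68 \times 72.15 \approx 30.9 + 49.1 \approx 79.9$, which matches the 79.89 percent figure in the bottom panel of table 4a up to rounding of the shares.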
Although the 80 percent incidence rate of informal training at the start of the job in the 1994
NLSY data is not quite as high as the 96 percent incidence rate in EOPP, it is much closer to the
EOPP figure than the 17 percent and 20 percent incidence rates in the CPS and NLS-72.
Apparently, workers are able to recognize when they are receiving informal training and will
provide this information when asked if the survey questions are phrased appropriately. This
conclusion is reinforced when one notes that most of the start job training that workers received is
of the more informal kind. In fact, only about 1 percent of workers in the first year of tenure
indicated that they received start job training in the form of classes and seminars. Instead, most of
their start job training was in the form of supervisors or coworkers showing them how to do their
work.
Conditional on a worker having received training at the start of his or her job, the quantities of
training received are also reported in table 4a. Looking at the bottom panel with the 1994 data, the
mean training received at the start of the job was 16 hours per week for 6 weeks, resulting in a
mean of 96 total hours. The standard deviation for total hours is 176.64, which reflects the fact
that the distribution of total hours has a long right hand tail. Comparing the top and bottom panels
of table 4a, it is noteworthy that even individuals who could initially perform all their job duties
adequately received a substantial amount of informal training at the start of the job. It is also
reassuring to note that individuals who could not initially perform all of their job duties adequately
receive slightly more training (in 1994, individuals who were not fully comfortable and received
positive training received an average of 124 hours, while those who were fully comfortable and received
positive training received an average of 78 hours).
The top panel of table 4b presents the incidences and durations of New Skills Training spells,
where "New Skills Training" refers to the training that was received in the previous twelve months
for the subsample that experienced changes at work requiring new skills. We see that 91.74
percent of those who reported changes at the workplace reported having received training.14
However, the routing pattern of both the 1993 and the 1994 NLSY questionnaires implicitly
assumes that workers who did not experience changes at the workplace requiring the learning of
new skills did not receive any new skills training in the past year. Under this assumption, one
would conclude that 40.43 percent of all workers received new skills training in the current year.
Unlike the Start Job Training questions, this routing pattern was not “fixed” in 1994 and thus we
have no test of how many individuals received New Skills informal training during the previous year.

Footnote 14: Recall that this training is distinct from the formal training captured in the regular section of the questionnaires.
The incidence and duration of formal training are presented in the bottom panel of table 4b (the
wording of the formal training questions is given in table 1). We see that 17.25 percent of workers
received formal training in the current year. This is less than half the incidence rate of New Skills
training over the same reference period. Furthermore, one may note that the mean completed
formal training spell is approximately as long as the mean new skills training spell (48.19 hours
versus 56.74 hours).15 This similarity of mean total hours serves to highlight the importance of the
new "informal" training questions for improving our understanding of human capital accumulation.
The “New Skills Training” questions appear to be capturing a sizable number of both short and
long episodes of skill upgrading that are missed by the formal training questions.

Footnote 15: We should note that we do not know whether or not a “New Skills Training” spell is completed by the date of the interview; if the training spell is still ongoing, then the number of weeks measure will be right censored, and thus the mean hours statistic will probably be downward biased. On the other hand, formal training durations are only asked for completed spells; durations of formal training spells ongoing at the date of the interview are not recorded, and thus the mean hours statistic will also be downward biased if the ongoing spells are on average longer duration spells. Furthermore, the statement in the text that the mean total hours of formal training is roughly comparable to the mean total hours of new skills training is quite sensitive to how one treats outliers in both the formal and the informal training duration distributions. We have omitted the roughly two percent of formal and informal training durations that are clearly outliers.
A common finding in the training literature is that training tends to be provided to more
educated workers having greater ability. Is the same thing true of informal training in the NLSY?
A second question concerning incidence has to do with whether formal and informal training tend
to be substitutes or complements. Table 4c presents estimates of training incidence equations,
where the reported probit coefficients (and the associated standard errors) are the estimated effects
of the explanatory variables on the probability of training at the sample means. Asterisks indicate
statistical significance at the five percent level. Our discussion begins by focusing on the incidence
of new skills training and formal training, the focus of our wage growth analysis in the next
section.
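For readers who wish to replicate this type of calculation, the following is a minimal sketch of a probit with marginal effects evaluated at the sample means, in the spirit of table 4c. It is not the authors' code; the data file and variable names are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical person-year extract with a 0-1 new skills training indicator and
# the explanatory variables discussed in the text.
df = pd.read_csv("nlsy_training_1993_1994.csv")

X = sm.add_constant(df[["formal_training", "afqt", "educ_beyond_hs",
                        "tenure", "tenure_sq", "estab_size", "multi_site"]])
probit = sm.Probit(df["new_skills_training"], X).fit()

# Marginal effects evaluated at the sample means, comparable in spirit to the
# coefficients reported in table 4c.
print(probit.get_margeff(at="mean").summary())
```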
The columns titled “New Skills Training” and “Formal Training” report results from equations
in which the dependent variable is a 0-1 variable indicating whether or not an individual receives
new skills training or formal training, respectively. We should remind the reader that we are forced
to assume that individuals who indicated that they did not experience a workplace change received
no new skills informal training. Stated differently, since 92 percent of those who experienced a
workplace change received new skills training, we are essentially estimating equations for
experiencing a workplace change and interpreting these as training equations.
The coefficients in the “New Skills Training” column of table 4c indicate that jobs that offer formal training are also likely to offer informal training. The estimated probability of receiving new skills training is .2153 higher for persons who also receive formal training in the same year, and this differential represents 53 percent of the mean of the informal training probability. As we will point out in the next section, this strong positive
correlation between formal and informal training in the current year has important implications for
our understanding of the effects of formal and informal training on wage growth.
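The 53 percent figure is simply the ratio of the reported differential to the mean incidence of new skills training among all workers reported in the discussion of table 4b: $0.2153 / 0.4043 \approx 0.53$.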
Not only does formal training have a positive and significant coefficient in the new skills
training equation and vice versa, but the various demographic and job characteristic variables have
similar effects in the two equations. The incidences of new skills training and formal training both increase with ability as measured by the Armed Forces Qualification Test (AFQT) score. Interestingly, the
theoretical prediction about the relationship between education and training is unclear. On the one
hand, if education and training impart similar skills and if there are diminishing returns to these
skills in production, then the return to training a more poorly educated person will be higher than
the return to training a more highly educated one; this effect will be reinforced by the fact that the
opportunity cost of a poorly educated person's time is lower than that of a more educated person.
20
On the other hand, if training and education are complementary (which often seems to be the case),
then there will be greater benefit from training more highly educated persons. In addition, education
may serve as an indicator of ability and (as suggested by the positive AFQT coefficient) training
and ability are likely to be complements in production. All previous studies have found that
training increases with education, although this relationship may become negative for those with
graduate school. We obtain the same result here: the probability of receiving both formal training
and new skills training is higher for persons with education beyond high school.
Loewenstein and Spletzer (1996, 1997a) have shown that formal training is not all concentrated
at the very beginning of an employment relationship, but persists throughout. We obtain the same
result in our current sample: the positive coefficient on tenure and the negative coefficient on
tenure squared indicate that the incidence of formal training actually increases with tenure,
although at a decreasing rate. Interestingly, the same is true of new skills training. And finally, it
is now well established in the literature that larger firms offer more training. This result is
confirmed here. The incidences of formal and new skills training are both positively related to
establishment size. The probability of training is also higher at a multisite firm. This latter effect
is particularly large in magnitude: the coefficients on multiple site firms are equal to roughly one
quarter of the means of the dependent variables.
Turning to informal training at the start of the job, recall that the 1993 NLSY implicitly
presumes that individuals who can initially perform all of their job duties adequately do not receive
Start Job Training. This routing pattern was changed in 1994, and we use the 1994 data to
estimate the probit equations in table 4d. In column 1, the 0-1 dependent variable is whether or not
the individual can perform all of his job duties adequately. This specification is intended to
replicate the routing pattern in the 1993 Start Job Training data, where individuals who were fully
comfortable are assumed to have not received training. Note that variables indicating whether an
individual receives formal training and new skills training have significantly positive coefficients in
this equation, as does the individual’s AFQT score. The effects of the other explanatory variables
are not significantly different from zero.
The probit equation in column 2 of Table 4d analyzes the determinants of informal training at
the start of the job, where the informal training questions are asked of all persons (as defined in the
bottom panel of table 4a). The probit equation in column 3 analyzes the determinants of not
performing all job duties adequately conditional on receiving informal training. Note in particular
that AFQT has no effect on receiving informal training during the first year of tenure, but has a
very large positive effect on the probability that an individual is not initially able to perform all of
his job duties. These results suggest quite strongly that informal training at the start of the job is not selectively given to certain individuals (quite a different conclusion than for new skills training and formal training, which occur at all years of tenure), but that, conditional on receiving informal training at the start of the job, more able individuals are routed into more demanding jobs where they might not be fully comfortable.
IV. Training and Wage Growth
The recent availability of data with explicit information on the incidence and duration of
training has led to the growth of a literature examining the relationship between training and
wages. The papers in this literature find that employer provided training is positively related to
wages and wage growth. However, as we noted earlier, most if not all public use individual based
datasets do not appear to accurately measure the totality of formal and informal training. Our goal
in this section is to use both the recent NLSY data (despite its flaws) and the EOPP data to
estimate how much of wage growth can be accounted for by existing measures of formal and
informal training.
To fix ideas, suppose the log real wage of a worker i who is in his tth year of tenure at employer
j is given by
(1)  $W_{ijt} = \gamma_t + \sum_{\tau=1}^{t} \beta_\tau T_{ij\tau} + \delta_T \sum_{k=1}^{j-1} \tilde{T}_{ik} + \sum_{\tau=1}^{t} \alpha_\tau I_{ij\tau} + \delta_I \sum_{k=1}^{j-1} \tilde{I}_{ik} + c_1 X_i + c_2 X_{ij} + c_3 X_{ijt} + u_i + v_{ij} + \varepsilon_{ijt}$
where $\gamma_t$ is a tenure specific effect on wages common to all individuals and all jobs, $T_{ij\tau}$ is a variable indicating whether individual i received any formal training in his τth year of tenure at his current job j, $\tilde{T}_{ik} \equiv \sum_\tau T_{ik\tau}$ is the total number of times that individual i received formal training at his kth employer, $I_{ij\tau}$ is a variable indicating whether individual i received any informal training in his τth year of tenure at his current job j, $\tilde{I}_{ik} \equiv \sum_\tau I_{ik\tau}$ is the total number of times that individual i received informal training at his kth employer, $X_i$ is a vector consisting of individual-specific variables (such as ability, race, or gender), $X_{ij}$ is a vector of match-specific variables (such as employer size), and $X_{ijt}$ is a vector of observable variables that can vary within a match over time (such as the individual's marital status). Using a standard decomposition of the error term, $u_i$ is an individual fixed effect that captures the net wage effect of unobserved person-specific variables, $v_{ij}$ is an individual-job match effect, and $\varepsilon_{ijt}$ is a transitory mean zero error component that is uncorrelated with both the explanatory variables and the fixed effects.
Note that in equation (1), we allow for the possibility that formal training and informal training
might have different effects on wages. These different effects could arise from possibly
unobserved differences in intensity, duration, or content of the different training programs.
Furthermore, following Loewenstein and Spletzer (1996), we index the β and the α coefficients by
tenure, thereby allowing the effects of formal and informal training on the wage to vary in different
years of tenure. And finally, following the discussion in Loewenstein and Spletzer (1998), we
allow for the possibility that training in previous jobs may have different wage effects than training
in the current job because training may be specific and/or past employers may have attempted to
extract some of the returns to general training.
As pointed out initially by Barron, Black and Loewenstein (1989) and more recently by
Loewenstein and Spletzer (1996), the generally established finding that more able workers are
more likely to receive training coupled with the fact that the controls for observable individual
heterogeneity (Xi) are imperfect measures for ability will cause OLS estimates of the training
coefficients in equation (1) to be biased. That is, a positive correlation between the individual fixed
effect ui and training will lead to upward biased training coefficients. Furthermore, it is likely that
there will be more training when the job match is better. In this case, the positive correlation
between the unobserved quality of the job match, vij, and training will be an additional source of
bias.
We can eliminate the individual fixed effect bias by estimating a wage growth equation.
Specifically, taking first differences of equation (1), we obtain
(2a)  $W_{ijt} - W_{ij,t-1} = (\gamma_t - \gamma_{t-1}) + \beta_t T_{ijt} + \alpha_t I_{ijt} + c_3 (X_{ijt} - X_{ij,t-1}) + (\varepsilon_{ijt} - \varepsilon_{ij,t-1})$
for individuals with tenure $t \geq 2$ in job j,

(2b)  $W_{ij1} - W_{i,j-1,\Im} = (\gamma_1 - \gamma_{\Im}) + \beta_1 T_{ij1} + \sum_{\tau=1}^{\Im} (\delta_T - \beta_\tau) T_{i,j-1,\tau} + \alpha_1 I_{ij1} + \sum_{\tau=1}^{\Im} (\delta_I - \alpha_\tau) I_{i,j-1,\tau} + c_2 (X_{ij} - X_{i,j-1}) + c_3 (X_{ij1} - X_{i,j-1,\Im}) + (v_{ij} - v_{i,j-1}) + (\varepsilon_{ij1} - \varepsilon_{i,j-1,\Im})$
for individuals with tenure $\Im$ in job j-1 and tenure 1 in job j.
Equation (2a) is the wage growth that occurs within a job, and equation (2b) is the wage growth
that occurs across jobs. Note that the individual specific Xi variables have been differenced out of
both wage growth equations (2a) and (2b), and furthermore, the match-specific variables Xij have
been differenced out of the within job wage growth equation (2a).
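As an illustration of how the within job wage growth equation (2a) might be taken to data for workers who stay with the same employer, consider the following sketch. It is not the authors' code; the panel file, the variable names, and the simple pooled specification (a single formal and informal training coefficient rather than tenure-specific $\beta_t$ and $\alpha_t$) are assumptions made purely for illustration.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical person-year panel of job stayers: log real wage, tenure year on the
# current job, current-year formal and new skills training indicators, and a
# time-varying control such as marital status.
panel = pd.read_csv("nlsy_stayers_panel.csv")
panel = panel.sort_values(["person_id", "job_id", "tenure"])

# First differences within a worker-employer match remove u_i and v_ij.
g = panel.groupby(["person_id", "job_id"])
panel["dlog_wage"] = g["log_wage"].diff()    # W_ijt - W_ij,t-1
panel["d_married"] = g["married"].diff()     # X_ijt - X_ij,t-1
growth = panel.dropna(subset=["dlog_wage"])  # keeps tenure years t >= 2

# Tenure dummies play the role of the Delta gamma_t intercepts; the training
# indicators refer to training received during the current tenure year.
model = smf.ols(
    "dlog_wage ~ C(tenure) + formal_training + new_skills_training + d_married",
    data=growth,
).fit()
print(model.summary())
```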
In the within job wage growth equation (2a), the positive correlation between Tijt and (ui+vij)
does not cause any problems for OLS estimation because the error components ui and vij are first
differenced away.16 However, only the individual fixed effect and not the match specific fixed
effect has been eliminated in the across job wage growth equation (2b). Also note that any
empirically identifiable effect of training -- both formal and informal -- that occurs in the first year
of tenure is differenced away in equation (2a) but remains in equation (2b); this is important if one
believes that informal training is more important at the start of the job rather than later in the job.17
We will analyze the wage growth of job stayers and job movers separately. From the
discussion above, estimating a wage growth equation for job stayers using only information on
training during the current year will provide consistent estimates of the effect of training on wages
in the later years of tenure. However, estimating a wage growth equation for job movers requires a
complete history of training in the previous job and will yield possibly biased effects of the
potentially more important training at the start of the job.
16
Loewenstein and Spletzer (1996) show that a substantial amount of training occurs well after the match
has started. It is possible that belated training may be correlated with the arrival of information indicating
a good worker-firm match, which could cause OLS training coefficients to be biased upward. However,
the results from Loewenstein and Spletzer’s instrumental variable estimation indicate that this is not a
serious problem.
17
We cannot measure job stayers’ returns to first-year training because we do not have a good measure of
an individual’s starting wage: although the NLSY inquires about the wage at the start of the job, our
analysis of these data suggests that this variable may contain more noise than information. The wages that
we use to calculate wage growth are wages on the primary job at the date of the survey interview. We
should point out that we are classifying a person as having one year of tenure if he has been on his current
job between 1 and 52 weeks at the time of the interview and having two years of tenure if he has been on
his current job between 53 and 104 weeks at the time of the interview. Since formal training is measured
over the previous year, our measure of second year formal training would include training from the 2nd
through the 53rd week of tenure for someone in his 53rd week of tenure at the interview date. For
someone in his 104th week of tenure at the interview date, second year formal training would include
training from the 53rd through the 104th week of tenure. The same point applies to new skills training.
IVa. Within Job Wage Growth
In this section, we shall concentrate on the within job wage growth equation (2a). The tenure
specific intercepts $\Delta\gamma_t \equiv \gamma_t - \gamma_{t-1}$ measure mean wage growth after controlling for formal and
informal training. According to the basic human capital model, the tenure specific intercepts
should be zero if all productivity enhancements are attributable to training. However, the tenure
specific intercepts will be positive if either the measures of formal training Tijt and informal
training Iijt that we observe in the data are imperfect measures for the training that actually occurs
in the job, or if training does not account for all wage growth within the job. This latter possibility
will occur if wages grow for reasons unrelated to productivity. As suggested by Lazear (1981),
one such reason may be an effort by employers to prevent shirking.18
Most (if not all) individual-based data sets such as the CPS, the NLS-72, and the early waves of the
NLSY do not have independent measures of both formal and informal training.19 As such, estimates
of the return to formal training are based on wage equations that omit informal training and thus
possibly suffer from omitted variable bias.
Anticipating the empirical work to come, assume that the training measures Tijt and Iijt in the
data are {0,1} incidence variables. Also, for purely expository convenience (all specifications in
the empirical work control for observable heterogeneity), assume that there are no explanatory
variables in the within job wage growth equation. Then one can show that the OLS estimates of $\beta_t$
and $\Delta\gamma_t$ from equation (2a) when informal training $I_{ijt}$ is omitted are
(3a)    $\hat{\beta}_t = \beta_t + \left[ \frac{\mathrm{cov}_t(I_{ijt}, T_{ijt})}{\mathrm{var}_t(T_{ijt})} \right] \alpha_t$

(3b)    $\Delta\hat{\gamma}_t = \Delta\gamma_t + \alpha_t\, \mathrm{mean}_t(I_{ijt}) - \mathrm{mean}_t(T_{ijt}) \left[ \frac{\mathrm{cov}_t(I_{ijt}, T_{ijt})}{\mathrm{var}_t(T_{ijt})} \right] \alpha_t$

where $\mathrm{mean}_t(\cdot)$, $\mathrm{cov}_t(\cdot,\cdot)$, and $\mathrm{var}_t(\cdot)$ are the mean, covariance, and variance measured “within” the
tth year of tenure.20
18
Indeed, Lazear and Moore (1984) went so far as to suggest that “most of the slope in age-earnings
profiles is accounted for by the desire to provide incentives, rather than by on-the-job training.”
19
As discussed in section II, the CPS and the NLS-72 do ask about informal training, but the routing
patterns of the questionnaire suggest that these measures are of dubious value.
Equation (3a) illustrates that the sign of the bias in $\hat{\beta}_t$, the estimated return to formal training,
depends on both the covariance between formal and informal training and $\alpha_t$, the effect of informal
training on wages. As we showed in the previous section, the measures of formal and “informal”
New Skills training in the NLSY are positively correlated within a given year of tenure. Therefore,
if $\alpha_t$ is positive, $\hat{\beta}_t$ is biased upward. Furthermore, equation (3b) illustrates that the OLS estimate
of the tenure specific intercept $\Delta\gamma_t$ will also be biased upward if $\alpha_t$ is positive.21
Equations (3a) and (3b) provide the motivation for our empirical work. Specifically, omitting
informal training in a within job wage growth equation will bias both the estimated return to formal
training and the tenure specific intercepts. A comparison of these coefficients from a specification
that excludes informal training with the coefficients from a specification that includes informal
training will empirically describe both the sign and the magnitude of this omitted variable bias.
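The following minimal simulation sketch (purely illustrative: the incidence rates, returns, and error variance are hypothetical) confirms numerically that omitting a binary informal training measure shifts the OLS formal training coefficient and intercept by exactly the amounts in equations (3a) and (3b).

# Minimal sketch: numerical check of the omitted variable bias formulas (3a) and (3b).
# All parameter values below are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n, dgamma, beta, alpha = 400_000, 0.02, 0.05, 0.03

T = rng.binomial(1, 0.17, n)                          # formal training incidence
I = rng.binomial(1, np.where(T == 1, 0.6, 0.3))       # informal training, positively correlated with T
dw = dgamma + beta * T + alpha * I + rng.normal(0.0, 0.20, n)

X = np.column_stack([np.ones(n), T])                  # wage growth regression omitting I
gamma_hat, beta_hat = np.linalg.lstsq(X, dw, rcond=None)[0]

ratio = np.cov(T, I)[0, 1] / T.var()                  # cov_t(I,T) / var_t(T)
print(beta_hat, beta + ratio * alpha)                                    # equation (3a)
print(gamma_hat, dgamma + alpha * I.mean() - T.mean() * ratio * alpha)   # equation (3b)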
20
That is, letting $N_t$ denote the number of observations where an individual is in the tth year of tenure,
$\mathrm{mean}_t(I_{ijt}) = \frac{1}{N_t} \sum_{i=1}^{N_t} I_{ijt}$, $\mathrm{mean}_t(T_{ijt}) = \frac{1}{N_t} \sum_{i=1}^{N_t} T_{ijt}$, $\mathrm{var}_t(T_{ijt}) = \mathrm{mean}_t(T_{ijt}) \left[ 1 - \mathrm{mean}_t(T_{ijt}) \right]$,
and $\mathrm{cov}_t(I_{ijt}, T_{ijt}) = \frac{1}{N_t} \sum_{i=1}^{N_t} T_{ijt} I_{ijt} - \mathrm{mean}_t(T_{ijt})\, \mathrm{mean}_t(I_{ijt})$.
21
This statement is obvious if we rewrite equation (3b) as
$\Delta\hat{\gamma}_t = \Delta\gamma_t + \left[ \frac{\frac{1}{N_t} \sum_{i=1}^{N_t} I_{ijt} (1 - T_{ijt})}{\frac{1}{N_t} \sum_{i=1}^{N_t} (1 - T_{ijt})} \right] \alpha_t .$
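The equivalence follows by substituting the binary-variable expressions from footnote 20 into equation (3b) (a short verification of the algebra); the terms involving $\mathrm{mean}_t(T_{ijt})\,\mathrm{mean}_t(I_{ijt})$ cancel:
$\alpha_t\, \mathrm{mean}_t(I_{ijt}) - \mathrm{mean}_t(T_{ijt}) \frac{\mathrm{mean}_t(T_{ijt} I_{ijt}) - \mathrm{mean}_t(T_{ijt})\, \mathrm{mean}_t(I_{ijt})}{\mathrm{mean}_t(T_{ijt}) \left[ 1 - \mathrm{mean}_t(T_{ijt}) \right]} \alpha_t = \frac{\mathrm{mean}_t(I_{ijt}) - \mathrm{mean}_t(T_{ijt} I_{ijt})}{1 - \mathrm{mean}_t(T_{ijt})} \alpha_t = \frac{\mathrm{mean}_t\left( I_{ijt} (1 - T_{ijt}) \right)}{\mathrm{mean}_t(1 - T_{ijt})} \alpha_t .$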
With this knowledge, we will be able to better interpret estimates of the return to formal training
that arise from specifications that omit informal training.
To estimate equation (2a) one needs measures of wage growth, formal training, and informal
training. Since the new NLSY informal training questions were asked in 1993 and 1994, we obtain
our sample for estimating the within job wage growth equation (2a) by pooling the training data
from 1993 and 1994 and the wage data from 1992 through 1994. Given that the tenure for this
within job wage growth sample ranges up to 20 years, we can conceivably estimate 19 tenure
specific intercepts, 19 formal training coefficients, and 19 informal training coefficients. Not only
would this be too many coefficients to report, but many of the coefficients would be estimated very
imprecisely. We have chosen to compromise and aggregate the sample into two distinct groups:
tenure=2 and tenure>2; this specification allows training earlier in the job to have a different wage
effect than training later in the job (recall that we cannot estimate the tenure=1 coefficients in the
within job equation). Investigation of the data with 19 different tenure effects does not lead one to
reject this specification.
The means for the explanatory variables in the within job sample are presented in Table 5. We
see that 15 percent of the individuals in the sample are in their second year of tenure, and the other
85 percent are beyond their second year of tenure. The incidence of formal training in the second
year of tenure is 16.7 percent (.0254/.1524), and the average annual incidence of formal training
beyond the second year of tenure is 18.8 percent (.1592/.8476). In light of our discussion in the
previous section, we will use the New Skills Training as our measure of informal training. As is
evident from the table, 38 percent of individuals in their second year of tenure receive new skills
training, while 44 percent of individuals who are beyond their second year of tenure have received
new skills training in the past year.22
From the first regression in table 5, we see that real wages grow by approximately four and a
half percent between the first and second year of tenure, and by approximately two and a half
percent per year beyond the second year of tenure. These averages include the wage growth of
both those who received training during the previous year and those who did not receive training.
In the second regression, we add controls for formal training. Real wages of those who do not
receive training in their second year of tenure grow by three and a half percent, whereas real wages
of those who receive training in their second year of tenure grow by approximately nine percent
(.0363 + .0554).
In column 3 of table 5, we add controls for the informal training as measured by the new skills
questions in the 1993 and 1994 NLSY. Persons who receive informal training clearly have wage
growth that is significantly greater than persons who have not received this training. The estimated
coefficient for informal training in the second year of tenure is over three percent, and is
statistically greater than zero. The estimated coefficient for informal training beyond the second
year of tenure is considerably smaller – a little more than one percent – but still statistically
different from zero.
A noteworthy feature of the estimation results presented in Table 5 is the relatively low return
to training beyond the second year of tenure. As mentioned above, the return to a spell of informal
training in the second year of tenure is nearly two percentage points higher than the return to
informal training later in the job. This differential is even greater for formal training. According to
column 3 of table 5, the estimated return to a spell of formal training in the second year is over
22
The fact that the average annual incidence of informal training is well below 100 percent is not in and
of itself bothersome since we are only looking at individuals who are in at least their second year of
tenure.
four and a half percent, but the return to a spell of formal training beyond the second year is
essentially zero; the same pattern holds in the other equations. One possible explanation for the
relatively low return to training spells that occur later in the match is that these spells may be of
shorter duration. However, this hypothesis is not supported by the data. Later spells do not
appear to be shorter in length than earlier spells. Furthermore, the estimated return to an hour of
training is lower for training beyond the second year of employment.23
The existence of diminishing returns to training is another possible explanation for our finding
that the return to training falls sharply beyond the second year. Loewenstein and Spletzer (1996)
have shown that training incidence is highly correlated over time: individuals who have received
training in the past are significantly more likely to receive training in the future. In this case,
diminishing returns would cause the estimated return to training to fall with tenure. A final
explanation is that the division of the returns to training between an employer and a worker may
change with tenure. Because individuals with longer tenure have demonstrated that they place a
relatively high valuation on their current job, the employer may be able to extract much of the
returns to their training. Knowing this, he should also be more willing to pay the cost of training.
We argued earlier that the estimated return to formal training and the tenure specific intercepts
will be biased upward if one does not control for informal training. The estimation results in table
5 bear this out. In the second year of tenure, the estimated formal training coefficient declines by
16 percent from .0554 to .0465 when controlling for informal training, and the estimated tenure
specific intercept declines by 31 percent from .0363 to .0252. That is, in a wage growth equation
23
Estimation results using ln(hours) of training are similar to those obtained using training incidence,
but keep in mind that unlike formal training, the NLSY does not provide information on whether a spell
of informal training is completed or still ongoing. This is especially serious when using duration rather
than incidence as the explanatory variable in a wage growth equation because longer spells are more
likely to be ongoing than shorter spells.
with no informal training variables, about one-third of the estimated tenure specific intercept and
one-fifth of the estimated return to training are attributable to omitted variable bias.
Our empirical work in table 5 is related to the earlier work of Lynch (1992). Lynch
hypothesizes that the returns to tenure -- an implied statistically significant 3.12 percent per year in
her sample -- may be capturing other unmeasured factors such as informal on-the-job training.
This is indeed what our estimates in table 5 imply, where we demonstrate that the return to tenure
falls dramatically when controls for informal training are added to the wage equation.
Interestingly, Lynch finds that the estimated return to tenure is unaffected by the inclusion of the
formal training variables in the wage equation. This is quite different from our results, where a
comparison of columns 1 and 2 in table 5 illustrates that the return to the second year of tenure
falls by 21 percent when we control for formal training. One explanation for this difference may
be the improved quality of the recent NLSY formal training data relative to the early waves of the
NLSY survey.24
Besides the 1993-1994 NLSY, the only other publicly available data set with reasonably
complete information on training is EOPP. Estimated wage growth equations using the EOPP data
are presented in Table 6. The dependent variable is the natural log of the hourly wage paid to the
typical worker after two years minus the natural log of the hourly wage paid to the typical worker
at the start of the job. The key explanatory variables are the amounts of time spent in formal and
informal training during the first three months of employment, where the estimate of formal
training comes from the employer’s response to question 3 in Table 1 while informal training
comes from his responses to questions 1, 4 and 5 in Table 1. We may note that the EOPP results
24
One immediate difference between Lynch’s data and our data is that her measure of formal training
only records spells that are four weeks or longer in duration. When we replicate this definition of formal
training in our sample, the return to the second year of tenure only falls by 5 percent (from .0457 to .0434,
as compared to the 21 percent decline from .0457 to .0363 that we document in table 5). This illustrates
the importance of the short spells of training that are recorded in the post-1987 NLSY surveys but omitted
from the pre-1987 NLSY surveys.
are quite similar to those using the NLSY: adding the informal training variable to a regression that
only includes formal training has the effect of reducing the coefficient on formal training by 12
percent and the intercept by 16 percent.
As discussed above, the NLSY’s informal training variable may contain substantial measurement
error. Because such error attenuates the estimated effect of informal training, one-fifth is a lower bound
estimate of the upward bias in the estimated formal training coefficient caused by omitting informal
training from the wage growth equation.
However, in this connection, it is important to note that formal training itself may be measured
with considerable error, which by itself will lead to downward bias in the estimated OLS formal
training coefficient. In fact, Loewenstein and Spletzer (1996) have estimated that in their 1988-1991 NLSY sample, 38 percent of the individuals in their second year of tenure who truly received
formal training did not report this training, and 8 percent who did not receive formal training
falsely reported having received such training. These estimates imply a very substantial downward
measurement error bias in the OLS formal training coefficient.25
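A minimal simulation sketch illustrates the size of this attenuation (purely illustrative: the true return, the incidence rate, and the wage growth process are hypothetical; only the 38 percent and 8 percent misreporting rates are taken from the discussion above).

# Minimal sketch: misclassification of a binary training indicator biases the OLS
# coefficient toward zero.  Only the misreporting rates are taken from the text;
# the remaining numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
n, beta_true = 400_000, 0.05

T_true = rng.binomial(1, 0.17, n)                         # true formal training incidence
dw = 0.02 + beta_true * T_true + rng.normal(0.0, 0.20, n) # within job wage growth

missed = rng.random(n) < 0.38                             # trained, but training not reported
false_report = rng.random(n) < 0.08                       # not trained, but training reported
T_obs = np.where(T_true == 1, (~missed).astype(int), false_report.astype(int))

X = np.column_stack([np.ones(n), T_obs])
print(np.linalg.lstsq(X, dw, rcond=None)[0][1])           # estimated return: well below 0.05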
While the net effect of measurement error on the estimated formal training coefficient in
column 3 of Table 5 is ambiguous, it is fairly safe to presume that mismeasurement of formal and
informal training will both cause the intercept to be biased upward.26 This has important
implications for one’s view of the labor market. The regression in column 3 indicates that the log
real wages of persons who do not receive any formal or informal training are growing on average
by two and a half percent between the first and the second year of tenure. This estimate is both
25
While instrumental variables estimation is the standard method of correcting for measurement error,
instrumental variables estimates are not consistent when the explanatory variable measured with error (in
our case, training) can only take on a limited range of values. Loewenstein and Spletzer (1996) obtain
their estimates of measurement error by comparing the relative magnitudes of the OLS and the IV
training coefficients.
26
If the probability that workers falsely report training that did not occur is sufficiently high, one can
construct cases where the intercept is actually biased downward. However, these situations appear
unlikely in practice.
economically significant and statistically different from zero. One might be tempted to argue that
this positive coefficient is capturing wage growth that is occurring for reasons other than human
capital accumulation. For example, employers may be delaying compensation for incentive
purposes. However, if training is measured with considerable error, it is not necessary to resort to
sources other than human capital accumulation to explain the positive intercept in the wage growth
equation. The positive intercept in the wage growth equation may simply be a mixture of the small
if not zero wage growth of those not investing in human capital and the large positive returns
accruing to the human capital investment that we are not correctly measuring.
Finally, one might be concerned that the positive returns to training that we are measuring in
column 3 of table 5 are proxying for unobserved factors that are omitted from the wage growth
equation. In this explanation, our positive returns to both formal and informal training might be
nothing more than the effects of unobserved characteristics that are positively associated with both
training incidence and wage growth. We have conducted a simple sensitivity analysis that leads us
to reject this argument. If there were a fixed effect in wage growth, we should be able to at least
partially proxy for this effect by including fixed individual characteristics such as race, gender, and
AFQT score in the wage growth regression. However, the estimates in column 4 of table 5 show
quite strongly that the inclusion of these variables has essentially no effect on the training
coefficients. Furthermore, the race, gender, and AFQT coefficients are neither individually nor
jointly statistically significant.
IVb. Across Job Wage Growth
Estimating the across job wage growth equation (2b) requires not only a measure of informal
training in the first year of the new job, but also a complete history of informal training in the
previous job. Unfortunately, we have only two years of informal training in the NLSY. If we
insist on a complete history of informal training in the previous job, we can only estimate wage
growth equations for individuals who change jobs in two consecutive years. Besides the problem
of small sample size (our wage growth equation would only have 319 observations), the resulting
sample would be very atypical, leading to concerns about potential selection bias. We have thus
chosen to estimate a wage growth equation for all individuals who are in their first year of tenure
and to treat past training as an omitted variable. As mentioned above, another possible bias stems
from the presence of unobservable match effects in the worker’s new and old jobs (vij and vij-1 in
equation (2b)). A correlation between these effects and training will cause the estimated training
coefficients to be biased.
The estimated across job wage growth equations are presented in Table 7. Using the New
Skills Training as our measure of informal training, the estimated returns to formal and informal
training for persons in their first year of tenure are positive and roughly similar in magnitude to the
estimated returns in table 5. However, the standard errors are quite high, so that the estimated
coefficients are not significantly different from zero. Importantly, the estimated formal training
coefficient declines by 14 percent when adding controls for informal training, and the negative
intercept in column 2 doubles in magnitude when adding informal training in column 3. Although
the across job wage growth equations are beset with multiple sources of bias, these specifications
suggest that similar to the within job wage growth equations, the new NLSY informal training
measures explain quite a bit of wage growth above and beyond that captured by the formal
training measure.
V. Conclusions and Discussion
Although economists have recognized the importance of a worker's on-the-job human capital
investments since the seminal papers by Becker (1962) and Mincer (1962), micro-datasets
containing explicit measures of on-the-job training have started to become available only relatively
recently. The existing data have been analyzed fairly thoroughly in a number of studies, and
researchers agree that the human capital model's prediction that a worker's wage is positively
related to past investments in his training is supported by the data, even after one controls for the
fact that those who receive training have different characteristics than those who do not. However,
much of this research has used a composite measure of training, and attention has not been given to
any possible differences between formal and informal training.
The current paper was largely motivated by the findings of Lynch (1992) and Loewenstein and
Spletzer (1996). Specifically, after controlling for formal training in a wage growth regression, is
it possible that the remaining return to tenure is due to the unobserved informal training that occurs
over the duration of a job match? This issue is at the heart of the human capital model, and is the
main point of difference among alternative theories of wage growth. The addition of informal training questions
to the 1993 and 1994 National Longitudinal Surveys of Youth makes it possible to examine the
relationship between wages, formal training, and informal training.
As we described in section 3 of this paper, the routing pattern of the NLSY survey instrument is
consistent with the assumption that roughly two-thirds of workers do not receive any informal
training. This implicit assumption is at odds not only with the empirical estimates of incidence
from the EOPP survey, but also with one’s intuition that every job contains at least some element
of informal training. Indeed, the change between 1993 and 1994 in the routing pattern for the
NLSY Start Job Training questions (but unfortunately not for the New Skills Training questions)
confirms that most workers receive some informal training at the start of their job. Even taking
into account this problem with the routing patterns, our analysis suggests that the new 1993 and
1994 NLSY informal training data are measuring quite a bit of human capital accumulation that is
missed by the formal training questions. This conclusion is consistent with our wage analysis,
where we have found that both formal and informal training are positively related to wage growth.
We have presented our wage growth equations in the context of the basic human capital model,
where all productivity enhancements on the job are attributable to training. Our empirical
estimates are quite supportive of this theory: used together, formal and informal training can
explain a sizable amount of within job wage growth. It is worth noting that this result is not
confined to the NLSY data -- similar findings arise when one analyzes the EOPP data. However,
even after controlling for formal and informal training, the tenure specific intercepts in the NLSY
wage growth equations remain substantially positive. While this is consistent with the hypothesis
that wages grow for reasons unrelated to productivity, it also may be caused by measurement error
in the training variables. Indeed, our analysis of training incidence in this paper and the results in
Loewenstein and Spletzer (1996) both indicate that the NLSY training data are plagued with quite
extensive amounts of measurement error.
Our measurement error finding leads to the issue of labeling. The analysis in this paper
indicates quite strongly that there is a distinction between formal and informal training. But the
question remains: are the NLSY and EOPP measures of formal and informal training exhaustive?
We believe that it may be useful to distinguish between three types of human capital accumulation
on the job: formal training, informal training, and learning by doing. Casual experience and
intuition suggest that in many jobs most learning may not come from instruction from co-workers
and supervisors, but simply from experience and experimentation. This "learning by doing" may
well be what Brown (1989) had in mind when he stated that "one hopes that for most workers
[training] never ends."
Although learning by doing is by its nature subjective and thus more difficult to measure than
other training, EOPP and the Panel Study of Income Dynamics both contain measures of learning
by doing. EOPP attempts to measure learning on the job by asking the question, "How many
weeks does it take a new employee to become fully trained and qualified if he or she has no
experience in this job, but has had the necessary school-provided training." A similar question
appears in the PSID. Evidence of this variable's potential importance is provided by Barron,
Black, and Loewenstein's (1993) finding that it explains a significant part of the gender wage gap,
even though hours of formal and informal training during the first three months of employment are
similar for men and women. Interestingly, the new NLSY informal training questions also ask
about learning by doing. As reported in Table 3 (question 61AA), the NLSY asks every worker
who indicates that he could not initially perform 100 percent of his initial duties adequately the
question, "How long did it take before you were fully comfortable doing this kind of work on your
own?" An extensive analysis of the relationship between the measures of training and learning by
doing -- in both the NLSY and the other data sets -- remains a topic for future research.
References
Altonji, Joseph G. and James R. Spletzer. 1991. "Worker Characteristics, Job Characteristics,
and the Receipt of On-the-Job Training." Industrial and Labor Relations Review, pp. 58-79.
Barron, John M., Mark C. Berger, and Dan A. Black. 1997. "How Well do We Measure
Training?" Journal of Labor Economics, pp. 507-528.
Barron, John M., Dan A. Black, and Mark A. Loewenstein. 1987. "Employer Size: The
Implications for Search, Training, Capital Investment, Starting Wage, and Wage Growth."
Journal of Labor Economics, pp. 76-89.
Barron, John M., Dan A. Black, and Mark A. Loewenstein. 1989. "Job Matching and On-the-Job Training." Journal of Labor Economics, pp. 1-19.
Barron, John M., Dan A. Black, and Mark A. Loewenstein. 1993. "Gender Differences in
Training, Capital, and Wages." Journal of Human Resources, pp. 343-364.
Becker, Gary S. 1962. "Investment in Human Capital: A Theoretical Analysis." Journal of
Political Economy, pp. 9-49.
Bishop, John H. 1988. “Do Employers Share the Costs and Benefits of General Training?”
Unpublished paper, Cornell University.
Bishop, John H. 1991. “On-the-Job Training of New Hires.” In Market Failure in Training?
edited by David Stern and Jozef M. M. Ritzen. Springer-Verlag, Berlin.
Bowers, Norman and Paul Swaim. 1994. "Recent Trends in Job Training." Contemporary
Economic Policy, pp. 79-88.
Brown, Charles. 1989. “Empirical Evidence on Private Training.” In Investing in People,
Background Papers, Vol.1, Washington D.C.: Commission on Workforce Quality and Labor
Market Efficiency, U.S. Department of Labor, pp. 301-320.
Constantine, Jill M. and David Neumark. 1994. "Training and the Growth of Wage Inequality."
NBER Working Paper #4729.
Lazear, Edward P. 1981. "Agency, Earnings Profiles, Productivity, and Hours Restrictions,"
American Economic Review, pp. 606-620.
Lazear, Edward P. and Robert L. Moore. 1984. “Incentives, Productivity, and Labor Contracts.”
Quarterly Journal of Economics, pp. 275-296.
Leigh, Duane E. and Kirk D. Gifford. 1996. “Workplace Transformation and Worker
Upskilling: Evidence from the National Longitudinal Survey of Youth.” Unpublished paper,
Washington State University.
Lillard, Lee A. and Hong W. Tan. 1992. "Private Sector Training: Who Gets It and What Are Its
Effects?" Research in Labor Economics, pp. 1-62.
Loewenstein, Mark A. and James R. Spletzer. 1996. "Belated Training: The Relationship
Between Training, Tenure, and Wages." Unpublished paper, Bureau of Labor Statistics.
Loewenstein, Mark A. and James R. Spletzer. 1997a. "Delayed Formal On-the-Job Training."
Industrial and Labor Relations Review, pp. 82-99.
Loewenstein, Mark A. and James R. Spletzer. 1997b. "General and Specific Training: Evidence
and Implications." Unpublished paper, Bureau of Labor Statistics.
Loewenstein, Mark A. and James R. Spletzer. 1998. "Dividing the Costs and Returns to
General Training." Journal of Labor Economics, pp. 142-171.
Lynch, Lisa M. 1991a. "The Role of Off-the-Job vs. On-the-Job Training for the Mobility of
Women Workers." American Economic Review, pp. 151-156.
Lynch, Lisa M. 1991b. "The Impact of Private-Sector Training on Race and Gender Wage
Differentials and the Career Patterns of Young Workers." Report #NLS 92-8, Bureau of
Labor Statistics.
Lynch, Lisa M. 1992. "Private Sector Training and the Earnings of Young Workers." American
Economic Review, pp. 299-312.
Mincer, Jacob. 1962. "On-the-Job Training: Costs, Returns, and Some Implications." Journal
of Political Economy, pp. 50-79.
Mincer, Jacob. 1988. "Job Training, Wage Growth, and Labor Turnover." NBER Working
Paper #2690.
Pergamit, Michael R. and Janice Shack-Marquez. 1987. "Earnings and Different Types of
Training." Bureau of Labor Statistics Working Paper #165.
Royalty, Anne Beeson. 1996. “The Effects of Job Turnover on the Training of Men and
Women.” Industrial and Labor Relations Review, pp. 506-521.
Veum, Jonathan R. 1993. "Training among young adults: who, what kind, and for how long?"
Monthly Labor Review, pp. 27-32.
Veum, Jonathan R. 1995. "Sources of Training and their Impact on Wages." Industrial and
Labor Relations Review, pp. 812-826.
Table 1: Training Questions in the CPS, NLS-72, EOPP, and NLSY Surveys
Current Population Survey (CPS)
1) Since you obtained your present job did you take any training to improve your skills?
2) Did you take the training in
a) school,
b) a formal company training program,
c) informal on-the-job,
d) other
National Longitudinal Survey of the High School Class of 1972 (NLS-72)
1) Considering the most recent full-time job you have held, did you receive or participate in any type of employer-provided training benefits or training programs?
2) Indicate each type of training benefit or program you participated in. Then record the
number of hours per week and the total number of weeks:
a) formal registered apprenticeship (your state or labor union)
b) employer-provided job training during hours on employer premises
c) informal on-the-job training (e.g., assigned to work with someone for instruction or guidance, etc.)
d) employer-provided education or training during working hours away from employer premises
e) tuition aid and/or financial assistance for attending educational institutions after working hours
f) other
Employer Opportunity Pilot Project (EOPP)
1) During the first three months, how many total hours does the average new employee spend in training activities
in which he or she is watching other people do the job rather than doing it himself?
2) Is there formal training, such as self-paced learning programs or training done by specially trained personnel, for
people hired in ...'s position, or is all the training done as informal on the job training?
3) During the first three months of work what was the total number of hours spent on formal training
such as self-paced learning programs or training done by specially trained personnel of your typical worker
in ...'s position?
4) ...during their first three months of work, what was the total number of hours management and line supervisors
spent away from other activities giving informal individualized training or extra supervision to your typical worker
in ...'s position?
5) During the first three months of work what was the total number of hours co-workers who are not supervisors
spent away from their normal work giving informal training or extra supervision to your typical worker in ...'s
position?
National Longitudinal Survey of Youth (NLSY)
1) Since [date of the last interview], did you attend any training program or any on-the-job training designed to help
people find a job, improve job skills, or learn a new job?
2) Which category best describes where you received this training?
a) business school
b) apprenticeship program
c) a vocational or technical institute
d) a correspondence course
e) formal company training run by employer or military training
f) seminars or training programs at work run by someone other than employer
g) seminars or training programs outside of work
h) vocational rehabilitation center
i) other (specify)
Table 2: Incidence of Training
EOPP, NLSY, NLS-72, and CPS Surveys
Survey    Sample                                     Respondent    Reference Period              Formal Training    Informal Training
                                                                                                 Incidence          Incidence
EOPP      Workers aged 16-64, Tenure <3 months       Employer      First Three Months of Job     13.4%              95.8%
NLSY      Workers aged 27-38, Tenure unrestricted    Worker        Last Year                     17.3%
NLS-72    Workers aged 31, Tenure unrestricted       Worker        Current Job                   45.7%              19.7%
CPS       Workers aged 16-64, Tenure unrestricted    Worker        Current Job                   44.1%              16.3%
EOPP: 1982 survey, unweighted data. Formal training measured from question 2 in table 1. Informal training
measured from questions 1, 4, and 5 in table 1. Source: authors’ calculations.
NLSY: 1993-1994 surveys, unweighted data. Formal training measured from question 1 in table 1.
Source: authors’ calculations.
NLS-72: 1986 survey, unweighted data. Formal training measured from question 1 in table 1. Informal training
measured from question 2c in table 1. Source: authors’ calculations.
CPS: January 1991 survey, unweighted data. Formal training measured from question 1 in table 1. Informal
training measured from question 2c in table 1. Source: authors’ calculations.
Table 2 (continued)
Survey    Sample                                     Respondent    Reference Period              Formal Training    Informal Training
                                                                                                 Incidence          Incidence
CPS       Workers aged 16-64, Tenure <3 months       Worker        Current Job                   20.5%              9.7%
CPS       Workers aged 27-38, Tenure unrestricted    Worker        Current Job                   45.5%              16.8%
CPS       Workers aged 31, Tenure unrestricted       Worker        Current Job                   43.6%              17.3%
Figure 1: CDF of Informal Training Hours
Censored at 12 Weeks
[Figure: cumulative distribution functions of informal training hours for the NLS-72 and EOPP surveys; horizontal axis is hours (0 to 200), vertical axis is the cumulative percentage of workers.]
NLS-72: 1986 survey, unweighted data. Informal training measured from question 2c in table 1.
Mean=116.0 hours. Source: authors’ calculations.
EOPP: 1982 survey, unweighted data. Informal training measured from questions 1, 4, and 5 in table 1.
Mean=140.0 hours. Source: authors’ calculations.
Figure 2: CDF of Formal Training Hours
Censored at 12 Weeks
[Figure: cumulative distribution functions of formal training hours for the NLS-72, EOPP, and NLSY surveys; horizontal axis is hours (0 to 200), vertical axis is the cumulative percentage of workers.]
NLS-72: 1986 survey, unweighted data. Formal training measured from question 1 in table 1.
Mean=132.7 hours. Source: authors’ calculations.
EOPP: 1982 survey, unweighted data. Formal training measured from question 3 in table 1.
Mean=80.3 hours. Source: authors’ calculations.
NLSY: 1993-1994 surveys, unweighted data. Formal training measured from question 1 in table 1.
Mean=40.0 hours. Source: authors’ calculations.
Table 3: Informal Training Questions in the 1993 NLSY
Start Job Training
60A) When you started doing this kind of work for
[employer name], about what percentage of the duties
you currently do were you able to perform adequately?
<100%: Continue to 61AA
=100%: Exit Informal Training Questions
New Skills Training
36A) From time to time changes occur at work that
make it necessary to learn new job skills. On this card
are a number of examples. As I read each example, tell
me whether these changes have required you to learn
new job skills in the past 12 months?
Yes: Continue to 39
No: Exit Informal Training Questions
61AA) How long did it take before you were fully
comfortable doing this kind of work for [employer
name] on your own?
63) There are a variety of ways that people learn to do
their jobs. Please think about the [time in 61AA] when
you were learning to perform your job duties for
[employer name]. In learning how to perform these
duties, did you participate in any classes or seminars?
Yes: Continue to 65C
No: Skip to 67
39) As a result of these changes at work, did you
participate in any classes or seminars to learn how the
changes would affect how you do your job?
Yes: Continue to 39D
No: Skip to 40
65E) Over how many weeks did you attend these classes
or seminars?
65F) During the [65E] weeks that you attended these
classes or seminars, how many hours per week did you
spend in them?
39D) Over how many weeks did you attend these classes
or seminars?
39E) During the [39D] weeks that you attended these
classes or seminars, how many hours per week did you
spend in them?
67) Who explained or showed you how your job tasks
should be done.
Was it your supervisor, your
coworker(s), or both?
"Supervisor": Continue to 67E
"Coworker(s)": Skip to 67L
"Both": Ask 67E and 67L
"Neither": Skip to 68
40) Who explained or showed you how these changes at
work would affect how you do your job. Was it your
supervisor, your coworker(s), or both?
"Supervisor": Continue to 40C
"Coworker(s)": Skip to 40G
"Both": Ask 40C and 40F
"Neither": Skip to 41
67E) Over how many weeks did you spend time with
your supervisor learning how to do this kind of work?
67F) During the [67E] weeks you spent time with your
supervisor learning how your job tasks should be done,
how many hours per week did you spend?
40C) Over how many weeks did you spend time with
your supervisor learning how the changes would affect
how you do your job?
40Ca) During the [40C] weeks you spend with your
supervisor learning how to do your new duties, how
many hours per week did you spend?
67L) Over how many weeks did you spend time with
coworkers learning how to do this kind of work?
67M) During the [67L] weeks you spent time with
coworkers learning how your job tasks should be done,
how many hours per week did you spend?
40G) Over how many weeks did you spend time with
coworkers learning how the changes would affect how
you do your job?
40H) During the [40G] weeks you spent time with
coworkers learning how the changes would affect how
you do your job, how many hours per week did you
spend?
Table 3 (continued)
68) In learning to do the kind of work you are now doing,
did you make use of any self-study material or self-instructional packages, such as manuals, workbooks, or
computer-assisted teaching programs?
Yes: Continue to 68E
No: Skip to 69A
41) In learning how these changes at work would affect
how you do your job, did you make use of any self-study
material or self-instructional packages, such as manuals,
workbooks, or computer-assisted teaching programs?
Yes: Continue to 41C
No: Skip to 42A
68E) Over how many weeks did you spend time using
self-teaching packages?
68F) During the [68E] weeks when you were using self-teaching packages, how many hours per week did you
spend?
41C) Over how many weeks did you spend time using
self-teaching packages?
41D) During the [41C] weeks when you were using self-teaching materials, how many hours per week did you
spend?
69A) Besides what we've talked about so far, can you
think of anything else that you did that helped you learn
to do the kind of work you are doing for [employer name]?
42A) Besides what we've talked about so far, can you
think of anything else that you did that helped you learn
how the changes would affect your job?
Table 4a: Descriptive Statistics, 1993-1994 NLSY data
1993 Start Job Training
Total
Classes/Seminars
Incidencea
27.10%
Incidenceb
98.18%
0.38%
1.36%
Supervisor Show You
19.57%
70.91%
Coworkers Show You
19.57%
70.91%
Self-Study
12.92%
46.82%
Means Conditional on Receiving Training
Hours/Week
# Weeks
Total Hours
18.85
6.68
123.60
(14.97)
(9.15)
(194.10)
24.00
1.33
26.67
(16.00)
(0.58)
(12.22)
19.05
2.50
47.84
(16.93)
(3.32)
(69.44)
21.58
3.93
87.63
(17.21)
(5.26)
(136.77)
12.61
4.33
56.18
(13.07)
(7.81)
(138.91)
Data restricted to those in their first year of tenure. Standard deviations in parentheses. “Other” training not
reported.
a
Sample Size=797 (Those who reported initially performing 100% of their current duties adequately are assumed to have not received training).
b
Sample Size=220 (Those who initially performed less than 100% of their current duties adequately).
1994 Start Job Training
Total
Classes/Seminars
Incidence
79.89%
1.11%
Supervisor Show You
57.28%
Coworkers Show You
46.32%
Self-Study
24.55%
Means Conditional on Receiving Training
Hours/Week
# Weeks
Total Hours
16.12
6.11
96.05
(14.28)
(10.40)
(176.64)
12.13
7.00
24.63
(9.55)
(10.03)
(13.38)
15.18
3.12
51.69
(15.51)
(5.84)
(132.65)
18.63
3.55
69.11
(16.10)
(5.43)
(118.87)
12.26
5.54
66.26
(12.44)
(8.70)
(142.63)
Data restricted to those in their first year of tenure. Standard deviations in parentheses.
“Other” training not reported. Sample Size=721.
Table 4b: Descriptive Statistics, 1993-1994 NLSY data
New Skills Training
Total
Classes/Seminars
Incidencea
40.43%
Incidenceb
91.74%
9.79%
22.22%
Supervisor Show You
25.67%
58.24%
Coworkers Show You
13.25%
30.05%
Self-Study
19.99%
45.35%
Means Conditional on Receiving Training
Hours/Week
# Weeks
Total Hours
13.24
5.51
56.74
(26.99)
(13.51)
(168.15)
11.50
2.23
18.35
(13.44)
(5.32)
(35.50)
7.32
2.56
22.02
(10.36)
(6.82)
(114.27)
10.11
3.04
30.94
(12.83)
(7.11)
(98.05)
8.08
4.45
33.66
(10.32)
(9.66)
(115.74)
Standard deviations in parentheses. “Other” training not reported.
a
Sample Size=9362 (Those not requiring new skills are assumed to have not received training).
b
Sample Size=4126 (Those for whom workplace changes required new skills).
Formal Training
Total
Incidence
17.25%
Apprentice, Business
School, Voc-Tech
On-the-Job Training
1.53%
Inside Seminars
3.31%
Outside Seminars
3.82%
7.64%
Means Conditional on Receiving Training
Hours/Week
# Weeks
Total Hours
18.76
3.15
48.19
(14.98)
(5.74)
(126.87)
13.74
7.86
78.38
(12.00)
(11.35)
(183.27)
20.28
2.86
50.17
(15.73)
(4.52)
(108.74)
17.60
3.03
36.24
(13.90)
(6.35)
(70.93)
18.52
2.42
38.35
(14.00)
(4.52)
(107.25)
Standard deviations in parentheses. Sample Size=9362. “Other” training not reported.
Training durations reported only for spells not ongoing at the date of the interview.
Table 4c: Training Incidence Regressions, 1993-1994 NLSY data
                                Mean      New Skills Training      Formal Training
1 if Formal Training            .1725     .2153 * (.0139)
1 if “New Skills Training”      .4043                              .1181 * (.0075)
AFQT (Ability)                  65.59     .0013 * (.0003)          .0013 * (.0002)
1 if Female                     .4687     .0269 * (.0110)          .0302 * (.0078)
1 if Hispanic                   .1830     .0407 * (.0157)          -.0020 (.0113)
1 if Black                      .2763     -.0039 (.0146)           -.0059 (.0105)
1 if Education <12              .0936     -.0347 (.0206)           -.0789 * (.0186)
1 if Education 13-15            .2410     .0657 * (.0138)          .0414 * (.0097)
1 if Education =16              .1430     .0606 * (.0172)          .0533 * (.0117)
1 if Education >16              .0819     .0292 (.0218)            .0400 * (.0147)
1 if Multiple Site Employer     .7143     .0953 * (.0124)          .0471 * (.0093)
Ln(Firm Size)                   4.146     .0158 * (.0027)          .0070 * (.0019)
Tenure (years)                  5.205     .0242 * (.0043)          .0112 * (.0032)
Tenure Squared                  44.91     -.0013 * (.0003)         -.0007 * (.0002)
Sample Size                               9362                     9362
Mean of Dependent Variable                .4043                    .1725
Probit coefficients (standard errors) refer to the effect of the explanatory variable on the training probability
evaluated at the sample mean. * implies statistically significant at the 5% level.
Start Job Training regressions are restricted to those in their first year of tenure. New Skills Training regressions
and Formal Training regressions are based on the full sample.
All equations include an intercept, age, enrollment status, a quadratic in experience, marital status, number of
children, number of previous jobs, indicators for government employment and part-time employment,
union, local area unemployment rate, urban residence, and SMSA.
Table 4d: Training Incidence Regressions, 1994 NLSY data
                                          Perform <100%           Informal            Perform <100% of initial
                                          of initial duties       Training            duties adequately, conditional
                                Mean      adequately                                  on Informal Training
1 if Formal Training            .0915     .1786 * (.0599)         -.0040 (.0550)      .2619 * (.0758)
1 if “New Skills Training”      .2469     .0934 * (.0403)         .1723 * (.0410)     .0397 (.0475)
AFQT (Ability)                  61.66     .0037 * (.0010)         .0012 (.0008)       .0041 * (.0012)
1 if Female                     .4105     .0136 (.0374)           .0262 (.0321)       .0083 (.0449)
1 if Hispanic                   .1567     .0792 (.0557)           .0411 (.0478)       .0785 (.0687)
1 if Black                      .3093     .0058 (.0486)           .0446 (.0402)       -.0089 (.0598)
1 if Education <12              .1290     -.1070 (.0641)          -.0200 (.0467)      -.1259 (.0772)
1 if Education 13-15            .2205     -.0895 (.0487)          -.0936 (.0388)      -.0670 (.0592)
1 if Education =16              .1207     -.0023 (.0623)          .0307 (.0581)       -.0197 (.0734)
1 if Education >16              .0735     .1000 (.0742)           .0499 (.0721)       .1124 (.0883)
1 if Multiple Site Employer     .6019     -.0013 (.0376)          .0207 (.0311)       -.0189 (.0451)
Ln(Firm Size)                   3.465     .0150 (.0094)           .0137 (.0080)       .0131 (.0115)
Sample Size                               721                     721                 576
Mean of Dependent Variable                .3065                   .7989               .3837
Probit coefficients (standard errors) refer to the effect of the explanatory variable on the training probability
evaluated at the sample mean. * implies statistically significant at the 5% level.
All equations include an intercept, age, enrollment status, a quadratic in experience, marital status, number of
children, number of previous jobs, indicators for government employment and part-time employment,
union, local area unemployment rate, urban residence, and SMSA.
Table 5: Within Job Wage Growth Regressions, NLSY data
                                Mean        (1)          (2)          (3)          (4)
Tenure=2
  Intercept                     .1524      .0457 *      .0363 *      .0252 *      .0260 *
                                          (.0061)      (.0066)      (.0077)      (.0077)
  Formal Training               .0254                   .0554 *      .0465 *      .0452 *
                                                       (.0152)      (.0156)      (.0156)
  “Informal” Training           .0581                                .0327 *      .0324 *
                                                                    (.0119)      (.0119)
Tenure>2
  Intercept                     .8476      .0247 *      .0236 *      .0188 *      .0194 *
                                          (.0034)      (.0036)      (.0041)      (.0041)
  Formal Training               .1592                   .0050        .0019        .0009
                                                       (.0062)      (.0063)      (.0063)
  “Informal” Training           .3719                                .0120 *      .0111 *
                                                                    (.0050)      (.0050)
Gender, Race, AFQT                          No           No           No           Yes
1992-1994 NLSY. Sample size=7745. Dependent variable is log real wage growth. Mean (standard deviation)
of dependent variable in the second year of tenure is .0499 (.2395); Mean (standard deviation) of dependent
variable beyond the second year of tenure is .0291 (.1859). OLS regression coefficients, standard errors in
parentheses. * implies statistically different from zero at the 5% level of significance (two tailed test).
All equations control for, in first differences, marital status, number of children, enrollment status, highest grade
completed, and the local area unemployment rate. All equations also include an indicator for year.
Gender, Race, and AFQT are defined as deviations from sample means.
Table 6: Within Job Wage Growth Regressions, EOPP data
                                   Mean        (1)           (2)           (3)
Intercept                                     .3816 *       .3876 *       .3237 *
                                             (.0591)       (.0587)       (.0606)
Formal Training, Ln(1+Hours)       .4257                    .0209 *       .0184 *
                                                           (.0045)       (.0045)
“Informal” Training, Ln(1+Hours)   4.140                                  .0151 *
                                                                         (.0038)
1982 EOPP. Sample size=1527. Dependent variable is log real wage growth measured
over the first two years of employment. Mean (standard deviation) of dependent
variable is .1889 (.2109). OLS regression coefficients, standard errors in parentheses.
* implies statistically different from zero at the 5% level of significance (two tailed test).
All equations control for a quadratic in age, a quadratic in experience, education, gender,
union, seasonal employment, employer size, and industry and occupation.
Table 7: Across Job Wage Growth Regressions, NLSY data
                                Mean        (1)          (2)          (3)          (4)
Tenure=1
  Intercept                               -.0025       -.0071       -.0144       -.0141
                                          (.0138)      (.0143)      (.0155)      (.0156)
  Formal Training               .1074                   .0383        .0331        .0328
                                                       (.0309)      (.0312)      (.0315)
  “Informal” Training           .2730                                .0262        .0263
                                                                    (.0217)      (.0218)
Gender, Race, AFQT                          No           No           No           Yes
1993-1994 NLSY. Sample size=1509. Dependent variable is log real wage growth. Mean (standard deviation)
of dependent variable is .0256 (.3721). OLS regression coefficients, standard errors in parentheses.
* implies statistically different from zero at the 5% level of significance (two tailed test).
All equations control for, in first differences, marital status, number of children, enrollment status, highest grade
completed, the local area unemployment rate, SMSA, urban residence, an indicator for private or government
employment, multiple site employer, and union status. Gender, Race, and AFQT are defined as deviations
from sample means.