Construction Research Congress 2014 ©ASCE 2014
Cognitive Demands of Craft Professionals based on Differing Engineering
Information Delivery Formats
Gabriel B. DADI¹, Timothy R.B. TAYLOR², Paul M. GOODRUM³, and William F. MALONEY⁴

¹Assistant Professor, Department of Civil Engineering, 151C Raymond Building, Univ. of Kentucky, Lexington, KY 40506-0281, email: [email protected]
²Assistant Professor, Department of Civil Engineering, 151A Raymond Building, Univ. of Kentucky, Lexington, KY 40506-0281
³Nicholas R. Petry Professor of Construction Engineering and Management, Dept. of Civil, Environmental, and Architectural Engineering, Univ. of Colorado at Boulder, Boulder, CO 80309-0428
⁴Raymond-Shaver Professor of Construction Engineering and Project Management, Department of Civil Engineering, 151B Raymond Building, Univ. of Kentucky, Lexington, KY 40506-0281
ABSTRACT
The communication of a project’s design to craft professionals can significantly
impact project performance. Despite advancements in 3D computer modeling and
integrated information systems in recent decades, the spatial design is still primarily
delivered to the construction work face in two dimensional (2D) drawings of various
views. These views must be combined and encoded to effectively understand all
orientations of the engineering element. Three dimensional computer aided design
(3D CAD) and additive manufacturing (3D printing) provide promising alternative
formats for presenting spatial engineering information. By asking craft professionals to complete a reconstruction task of a simple structural frame using 2D drawings, a 3D CAD interface, and a 3D printed model, the cognitive workload demands of each format can be measured. After completing the task, the craft professionals were surveyed on their perceptions of mental workload on six factors using the NASA Task Load Index (TLX): mental demand, physical demand, temporal demand, performance, effort, and frustration. In addition, the subjects provided insights into their preferences among the
frustration. In addition, the subjects provided insights into their preferences of the
different model types. Lower workload scores are generally desirable and indicate
lower demands on an individual’s mental resources, allowing for concurrent
processing of other information. The results found that a physical 3D model, on average, requires a lower composite cognitive workload than either two-dimensional drawings or a three-dimensional computer model. The paper's primary contribution to
the overall body of knowledge is to understand the cognitive demands of craft
professionals when presented with spatial engineering information in various formats.
INTRODUCTION
Construction project performance is often divided into four main categories:
productivity, safety, timeliness, and quality (Oglesby et al., 1989). While these
categories are often interrelated, a project’s productivity is a significant concern to the
construction industry and research fields alike. There are several major components
that drive jobsite productivity including tools, information, materials, and equipment
(Oglesby et al., 1989). If there are insufficient or improper tools, information,
materials, and equipment, productivity will suffer and potentially have a ripple effect
across the project.
Information delivery and construction drawing management have been shown to be among the three most impactful factors contributing to poor productivity, based on surveys of practitioners (Goodrum and Dai, 2006; Dai et al., 2009a; Dai et al., 2009b). This research found inefficiencies in drawing management mainly due to
errors in the drawings, availability of drawings, slow responses to questions, illegible
drawings, and omission of necessary information (Goodrum and Dai, 2006; Dai et al.,
2009a; Dai et al., 2009b). In terms of information delivery, these studies focused
more on the effect of design errors and management, not necessarily errors in
understanding of the presented information. Less work has been performed
investigating how different individuals decode project documents.
Understanding how individuals interpret information by various spatial
information formats can provide useful insights into improving information delivery.
Research studying the mental workload demands of various information formats provides a look into the cognitive performance of practitioners and uncovers basic scientific findings that can provide the framework for future work in the delivery of engineering information.
Mental workload
Mental workload is a measure of the amount of mental resources required to
complete a task compared to the total amount of mental resources available to that
individual (Carswell et al., 2005). The workload imposed by a task should strike a balance: too much mental workload and the user may not have the capacity to maintain an acceptable level of performance; too little and the user may not have the focus and diligence required to complete the task appropriately (Hart and Staveland, 1988; Mitropoulos and Memarian, 2013).
Mental workload measurement
Workload theory has led to the development of several measures used to
determine an individual’s mental workload. These measures can be categorized into
three classes: physiologic, secondary task, and subjective measures (Carswell et al., 2005). Physiologic measures of mental workload incorporate indirect determinations through the study of ocular and cardiac responses (Carswell et al., 2005). Secondary task measures study spare workload capacity instead of a direct workload quantity (Knowles, 1963). Non-intrusive measures of workload can be achieved with
subjective measures while also employing a simpler experiment design (Carswell et
al., 2005).
Subjective workload measures require subjects to self-evaluate cognitive
demands post-task completion. There are several subjective workload measures
available such as the Subjective Workload Analysis Technique (SWAT), the
Workload Profile (WP), the Multiple Resources Questionnaire (MRQ), and the
National Aeronautics and Space Administration Task Load Index (NASA-TLX)
(Carswell et al., 2005; Carswell et al., 2010). These tools are well known and widely
used due to their ease of administration and interpretation of results (Carswell et al.,
2005; Carswell et al., 2010; Hart, 2006).
The NASA-TLX workload measure is a multidimensional tool that rates
responses in mental demand, physical demand, temporal demand, effort,
performance, and frustration. The traditional NASA-TLX obtains pairwise weights
for each of the six factors prior to identifying a scaled response for the factors. These
sub-factors then combine based on the weight from the pairwise scores to form a
composite workload score. A derivative of the NASA-TLX is the NASA Raw Task
Load Index (NASA-rTLX) that does not weight the subscales by their pairwise
comparisons, resulting in a composite workload score from an average of the
subscales (Byers et al., 1989). Several studies have found a strong correlation
between the NASA-TLX and NASA-rTLX, which lends towards the adoption of the
simpler NASA-rTLX tool (Moroney et al., 1995; Moroney et al., 1992; and Byers et
al., 1989). For this reason, this study utilizes the NASA-rTLX tool to measure
workload for the study participants. Table 1 provides a description of the NASA-rTLX factors and the measurement scale.
Table 1. NASA-rTLX factors and descriptions

Mental Demand (1-100, Low-High): How much mental and perceptual activity was required (e.g., thinking, deciding, calculating, remembering, looking, searching, etc.)? Was the task easy or demanding, simple or complex, exacting or forgiving?

Physical Demand (1-100, Low-High): How much physical activity was required (e.g., pushing, pulling, turning, controlling, activating, etc.)? Was the task easy or demanding, slow or brisk, slack or strenuous, restful or laborious?

Temporal Demand (1-100, Low-High): How much time pressure did you feel due to the rate or pace at which the task occurred? Was the pace slow and leisurely or rapid and frantic?

Performance (1-100, Good-Poor): How successful do you think you were in accomplishing the goals of the mission? How satisfied were you with your performance in accomplishing these goals?

Effort (1-100, Low-High): How hard did you have to work (mentally and physically) to accomplish your level of performance?

Frustration (1-100, Low-High): How discouraged, stressed, irritated, and annoyed versus gratified, relaxed, content, and complacent did you feel during your task?
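The unweighted rTLX scoring described above, where the composite is simply the average of the six subscale ratings, can be sketched in a few lines. The function name and the example ratings below are illustrative assumptions, not part of the study:

```python
# Sketch of NASA-rTLX composite scoring: the composite is the plain
# average of the six subscale ratings (no pairwise weighting).
# Example ratings are hypothetical, not data from the study.

SUBSCALES = ["mental", "physical", "temporal",
             "performance", "effort", "frustration"]

def rtlx_composite(ratings):
    """Return the unweighted NASA-rTLX composite for one subject.

    ratings: dict mapping each subscale name to a 1-100 rating.
    """
    missing = set(SUBSCALES) - set(ratings)
    if missing:
        raise ValueError(f"missing subscale ratings: {missing}")
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

# Hypothetical response from one subject after one trial:
example = {"mental": 40, "physical": 30, "temporal": 45,
           "performance": 20, "effort": 42, "frustration": 25}
print(round(rtlx_composite(example), 2))  # 33.67
```

The weighted NASA-TLX would instead scale each subscale by its count of wins in the fifteen pairwise comparisons; dropping that step is exactly what distinguishes the rTLX used in this study.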
METHODOLOGY
The methodology for this research is developed from a basic scientific
experiment of subjects completing a simple task from a given information format.
Following the completion of each task, the subjects were administered the NASA-rTLX workload measurement tool. The responses from the subjects were aggregated and statistically analyzed through an analysis of variance (ANOVA). The data collected allows for comparisons by information format type, trial number, and demographic factors.
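The one-way ANOVA comparison described above can be sketched as follows; the helper function and the workload scores are hypothetical placeholders, not the study's data:

```python
# Sketch of the one-way ANOVA used to compare workload scores
# across the three information formats. Scores are hypothetical.

def one_way_anova_f(*groups):
    """Return the F statistic for a one-way ANOVA across the groups."""
    k = len(groups)                      # number of groups (formats)
    n = sum(len(g) for g in groups)      # total observations
    grand_mean = sum(x for g in groups for x in g) / n
    # Between-group sum of squares: variation explained by format.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares: residual variation among subjects.
    ss_within = sum((x - sum(g) / len(g)) ** 2
                    for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical composite workload scores per format:
two_d = [34, 41, 29, 38, 33]
three_d = [37, 45, 31, 40, 36]
physical = [30, 35, 28, 33, 31]
print(one_way_anova_f(two_d, three_d, physical))
```

The resulting F statistic would then be compared against the F distribution with (k-1, n-k) degrees of freedom to judge significance.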
The sample for the study consisted of twenty-six participants with varying
age, construction experience, education, and construction occupation. Table 2
highlights demographics of the participating subjects.
Table 2. Sample demographics

Number of practitioners:          26
Age range (mean/median):          20-62 (40.7/40.5)
Years of experience:              1-33
Classification/position titles:   Carpenter Foreman, Laborer Foreman,
                                  Electrical Foreman, Mechanical Foreman,
                                  Project Engineer, Design Engineer
Cognitive task experiment
Three different formats for spatial engineering information delivery were used for the experiment: a conventional set of two-dimensional (2D) drawings, a three-dimensional (3D) computer-aided design (CAD) model viewed on a computer, and a 3D printed physical model. These formats represent the current method of engineering information delivery (2D drawings), an emerging visualization tool for field use (BIM models), and a potential innovation for visualization (3D printing). The basis of design must be simple enough to capture solely the cognitive aspects of spatial information processing, yet complex enough that difficulty arises and errors can occur. Subjects were presented with information in one of the three formats, in a randomized sequence, and then instructed to build the structure from simple building elements.
Figures 1 and 2 provide sample plan and elevation sheets from the 2D drawing set, Figure 3 shows the 3D computer model, and Figure 4 shows the 3D printed physical model. Figures 5 and 6 show the building elements used to complete the task, both disassembled and fully assembled.
Figure 1. Sample plan sheet from 2D Drawing set
Figure 2. Sample elevation sheet from 2D Drawing set
Figure 3. Sample screenshot from the 3D computer model
Figure 4. Picture of 3D printed physical model (shown next to standard size
playing card for scale)
Figure 5. Scale model building elements disassembled (shown next to standard
size playing card for scale)
Figure 6. Scale model building elements completely assembled (shown next to
standard size playing card for scale)
Each subject began by reading and completing an informed consent form, followed by a demographic questionnaire. Once the subjects stated they understood the building elements, one of the information formats was presented. After the subjects completed the task, they were given the NASA-rTLX measure. This cycle of presenting an information format, completing the build, and completing the NASA-rTLX form was repeated until all three information formats were exhausted: the set of two-dimensional drawings, the three-dimensional computer model, and the physical model.
RESULTS
The average response scores for each NASA-rTLX factor are reported in Table 3. While no statistically significant difference was found among the model types in the resulting cognitive performance of the subjects, there are worthwhile takeaways involving the cognitive measures. Lower values are preferred for all response factors. Overall, the physical model required the least mental workload in most NASA-rTLX subfactors, as well as the lowest overall composite workload score.
Table 3. Model type by NASA-rTLX factors

Model Type   Mental   Physical   Temporal   Performance   Effort   Frustration   Composite
             Demand   Demand     Demand
2D           39.42    30.96      45.38      22.12         40.77    23.65         33.72
3D           44.04    30.58      44.23      26.73         44.42    29.81         36.63
Physical     36.20    27.60      43.20      21.80         42.20    26.20         32.41
Preferences
The previous analysis shows that the subjects performed the experiment best with the physical model, then the 2D drawings, and lastly the 3D computer model. In the post-test questionnaire, subjects were asked which information format they preferred for completing the task. Figure 7 shows that only 39% of subjects preferred the physical model, compared to 46% for the 2D drawings and 15% for the 3D model.
Figure 7. Practitioners’ preference for task completion
Included in the data collection for preferences was an opportunity for the subjects to provide insights into why they preferred a particular information format. Table 4 outlines some of the more interesting responses from subjects as to why a certain format was preferred.
Table 4. Selected responses to model preferences

Responses from practitioners that preferred 2D drawings:
- "Easy to understand"
- "Used to reading from drawings"
- "Format that I am used to"
- "Can refer back easily and am accustomed to use"
- "Presents info floor by floor instead of all at one time"
- "Everything was clearer and less stressful"

Responses from practitioners that preferred a 3D computer model:
- "Agile, one-stop info source, and easily modifiable"
- "Accessibility and ease of viewing the model from any perspective without having to do much"
- "Being able to process the 3D at once is preferred over the multiple 2D drawings for the same info"

Responses from practitioners that preferred a physical model:
- "Easier to build if you can see what it is supposed to look like"
- "Easy to figure out spatial shape in my mind"
- "Can visually and physically see what the finished product should look like rather than imagine and think (it)"
- "You can turn, rotate, and flip to see all angles"
The individuals who preferred the 2D drawing sets often responded that the drawings were easy to understand and were what they were used to. In fact, of the 12 practitioners who preferred the 2D drawings, 6 responded that it was due to their familiarity with drawings. Preferences for the 3D computer model were often due to the ability to rotate and visualize a full image, as well as its inclusion of relevant project information. A lack of technical expertise to navigate a computer model and a generally negative outlook on computers were the reasons that subjects did not prefer the computer model. The subjects who preferred the physical model offered several interesting quotes explaining their reasons. From the responses, the concept of a single, physical source of information was well received by the subjects.
CONCLUSIONS
The paper’s primary contribution to the overall body of knowledge is to
understand the cognitive demands of craft professionals when presented with spatial
engineering information in various formats. The physical model was found to have the lowest composite cognitive demand, as well as the lowest mental, physical, and temporal demands and the best performance scores. Conversely, the 3D computer model had poor scores overall, particularly in mental demand, performance, effort, and frustration.
A physical, haptic model performed well and was perceived well by the subjects. Having a visual representation of the completed output allows practitioners to mentally encode the spatial aspects of the model. 2D drawings and 3D computer models then allow for added details such as dimensions. Even with these performance results, practitioners tend to favor familiarity, as 46% of the subjects preferred the 2D drawings. For practitioners, this illustrates that training and experience with various formats of spatial representation can improve understanding of a spatial design.
ACKNOWLEDGEMENTS
The authors wish to acknowledge the University of Kentucky Construction
Engineering and Project Management Advisory Board for their support and input to
the study. Furthermore, the study would not have been possible without the time and
commitment of the participating construction craft workers and their companies.
REFERENCES
Byers, J.C., Bittner Jr., A.C., and Hill, S.G. (1989). "Traditional and raw task load index (TLX) correlations: Are paired comparisons necessary?" In: Mital, A. (Ed.), Advances in Industrial Ergonomics and Safety. Taylor and Francis, Philadelphia, PA, 481-485.
Carswell, C.M., Clarke, D., and Seales, W.B. (2005). “Assessing Mental Workload
During Laparoscopic Surgery.” Surgical Innovation, 12(1), 80-90.
Carswell, C.M., Lio, C.H., Grant, R., Klein, M.I., Clarke, D., Seales, W.B., and Strup, S. (2010). "Hands-free administration of subjective workload scales: Acceptability in a surgical training environment." Applied Ergonomics, 42(1), 138-145.
Dai, J., Goodrum, P.M., and Maloney, W.F. (2009a). "Construction Craft Workers' Perceptions of the Factors Affecting Their Productivity." Journal of Construction Engineering and Management, 135(3), 217-226.
Dai, J., Goodrum, P.M., Maloney, W.F., and Srinivasan, C. (2009b). "Latent Structures of the Factors Affecting Construction Labor Productivity." Journal of Construction Engineering and Management, 135(5), 397-406.
Dielman, T.E. (2005). Applied Regression Analysis: A Second Course in Business and Economic Statistics, 4th ed. South-Western Cengage Learning, Mason, OH.
Fellows, R. and Liu, A. (2008). Research Methods for Construction, 3rd ed. Wiley-Blackwell Publishing, Malden, MA.
Goodrum, P.M. and Dai, J. (2006). "Work Force View of Construction Productivity." Research Summary 215-1. The Construction Industry Institute, University of Texas at Austin, October 2006.
Hart, S.G. (2006). “NASA-task load index (NASA-TLX); 20 years later.” In
Proceedings of the Human Factors and Ergonomics 50th Annual Meeting. Santa
Monica, CA: Human Factors and Ergonomics Society, 904-908.
Hart, S.G., and Staveland, L.E. (1988). "Development of the NASA-TLX (Task Load Index): Results of empirical and theoretical research." In: Hancock, P.A. and Meshkati, N. (Eds.), Human Mental Workload. North-Holland, Amsterdam, 139-183.
Knowles, W.B. (1963). “Operator loading tasks.” Human Factors: The Journal of the
Human Factors and Ergonomics Society. 5, 155-161.
Mitropoulos, P. and Memarian, B. (2013). “Task Demands in Masonry Work:
Sources, Performance Implications and Management Strategies.” Journal of
Construction Engineering and Management. 139(5), 581-590.
Moroney, W.F., Biers, D.W., and Eggemeier, F.T. (1995). “Some measurement and
methodological considerations in the application of subjective workload
measurement techniques.” The International Journal of Aviation Psychology,
5(1), 87-106.
Moroney, W.F., Biers, D.W., Eggemeier, F.T., and Mitchell, J.A. (1992). "A comparison of two scoring procedures with the NASA Task Load Index in a simulated flight task." Proceedings of the Aerospace Electronics Conference, Dayton, OH, 2, 734-740.
Oglesby, C. H., Parker, H. W., and Howell, G. A. (1989). Productivity improvement
in Construction. McGraw-Hill, New York.