Development and Application of Safety Technology Adoption Decision-Making Tool


Chukwuma Nnaji, A.M.ASCE 1; John Gambatese, M.ASCE 2;
Ali Karakhan, S.M.ASCE 3; and Robert Osei-Kyei 4

Abstract: The need to enhance worker safety in construction has been studied often in recent years due to the high rates of fatalities and injuries in the
construction industry. To abate this scourge, various control methods including the use of technology have been adopted to improve work-
place safety performance in construction. However, not all of the technologies adopted by industry personnel for safety management have
achieved the desired outcome or have proved to be effective. More research is needed regarding the adoption of safety technologies in the
construction industry. Importantly, industry stakeholders need to evaluate their readiness to adopt safety technologies. The present study
develops a decision-making tool that can aid in the adoption of safety technology in the construction industry. The safety technology decision-
making adoption tool is developed using the fuzzy synthetic evaluation technique and consists of three categories of safety technology
predictors: external, organizational, and technology-related. Technology-based predictors emerged as the most impactful category.
Subsequently, a technology adoption assessment protocol was developed and applied to a case study to assess an organization’s readiness
to adopt wearable sensing devices. The present work contributes to the body of knowledge by identifying, quantifying, and categorizing
predictors of technology adoption and then integrating the predictors into an evaluation protocol to assess the adoption of safety tech-
nologies in the construction industry. This contribution is translated into practice by developing a user-friendly tool called the Construction
Safety Technology Adoption Index (C-STAI). The developed tool is expected to help construction professionals and practitioners make
informed decisions regarding the adoption of safety technology. Improved technology adoption in construction is expected to lead to
enhanced safety and reduced injuries on construction projects. DOI: 10.1061/(ASCE)CO.1943-7862.0001808. © 2020 American Society
of Civil Engineers.

1 Assistant Professor, Dept. of Civil, Construction, and Environmental Engineering, Univ. of Alabama, 3043 HM Comer, Tuscaloosa, AL 35487 (corresponding author). ORCID: https://orcid.org/0000-0002-3725-4376. Email: cnnaji@eng.ua.edu
2 Professor, School of Civil and Construction Engineering, Oregon State Univ., 101 Kearney Hall, Corvallis, OR 97331. ORCID: https://orcid.org/0000-0003-3540-6441. Email: john.gambatese@oregonstate.edu
3 Ph.D. Candidate, School of Civil and Construction Engineering, Oregon State Univ., 101 Kearney Hall, Corvallis, OR 97331. Email: karakhaa@oregonstate.edu
4 Lecturer, School of Computing, Engineering and Mathematics, Western Sydney Univ., Penrith, NSW 2751, Australia. Email: r.osei-kyei@westernsydney.edu.au

Note. This manuscript was submitted on April 19, 2019; approved on October 4, 2019; published online on February 14, 2020. Discussion period open until July 14, 2020; separate discussions must be submitted for individual papers. This paper is part of the Journal of Construction Engineering and Management, © ASCE, ISSN 0733-9364.

Introduction

Innovation is considered a primary means of improving company and project performance within the construction industry (Gambatese and Hallowell 2011). As investment in information and technology research and development increases in most industries, the development and use of advanced technology to support work operations are expected to increase as well. That being said, the construction industry lags behind most industries in the development and implementation of technology (Barbosa et al. 2017). Peansupap and Walker (2005) stated that the construction industry is a traditional and technology-stagnant industry. This characteristic could be one industry feature contributing to the low productivity and unsafe nature of construction operations (Choudhry 2017; Barbosa et al. 2017). Results from a recent study indicate that the increased use of safety technologies and prevention through design (PtD) are two trends with the most potential to improve worker safety in construction (SmartMarket 2017). Moreover, Karakhan et al. (2018) posited that the use of safety technologies enhances the overall safety maturity of a construction organization. The overarching goal of the present research was to build on this finding by creating a tool to improve safety technology adoption in construction.

Adopting and implementing a new technology in construction can be associated with significant capital expenditures and reoccurring costs (Goodrum et al. 2011), and this is a primary reason behind slow technology integration and adoption in construction (Goodrum et al. 2011). According to the National Institute of Standards and Technology (NIST), there is a critical need for a readiness index for assessing high-cost, high-risk, and high-impact construction innovation (NRC 2009). A readiness index would help decision-makers arrive at a congruent decision whether to fund, or not fund, the adoption and implementation of a new technology, thereby saving significant cost and time associated with technology integration. The aim of the present study is to develop a tool that could be used to assess the potential adoption of a safety technology in the construction industry. Utilizing data from previous efforts of the researchers (Nnaji et al. 2018, 2019a; Osei-Kyei and Chan 2017) and adapting frameworks proposed by other researchers allowed for the development of a technology adoption decision-support tool for safety management in construction. The proposed tool is called the Construction Safety Technology Adoption Index (C-STAI). It should be mentioned that safety technology is defined in the present study as any technology (e.g., information technology, digitalization, and sensing devices) used to monitor and improve safety management and/or performance in construction throughout the project lifecycle. The findings of this study are



expected to facilitate and improve safety technology adoption in construction and, as a result, provide a significant practical contribution to industry stakeholders.

Review of Technologies in Construction Safety Research and Practice

Safety technologies in construction are primarily used to improve worker safety during the operation phase of a project. However, these technologies could be used in different phases of a project throughout its lifecycle to maximize safety performance. Szymberski (1997) suggested that the ability to mitigate workplace construction hazards is substantially greater during the early stages of the project lifecycle (e.g., design and planning). Relatedly, Karakhan et al. (2019) summarized the different technological controls used for safety in construction and found that technologies used during the planning and design stage, such as virtual reality and building information modeling (BIM), are more effective in mitigating workplace hazards than other alternatives used after construction begins.

Zhou et al. (2013) reviewed the application of advanced technologies in construction safety management research. Results from the study identified approximately 30 types of technologies used in safety management based on appearance in the existing literature. According to Zhou et al. (2013), publications related to safety management technology in the building construction sector have increased by 280% from 2002 to 2012.

Perlman et al. (2014) conducted an experimental study to examine the impact of virtual reality technologies on hazard identification. Findings from the study indicated that virtual reality could improve worker safety through early detection of potential hazards. Furthermore, other studies have assessed the utility of unmanned aerial vehicles (UAVs), also known as drones, in safety management applications (Irizarry et al. 2012; Gheisari et al. 2014; Zhou and Gheisari 2018). Results from these studies suggest that UAVs could be used to effectively conduct safety inspections, identify fall hazards, and enable real-time communication. Moreover, UAVs are used to perform reality capture, especially in high-risk situations, to eliminate worker exposure to the hazards (Şerban et al. 2016).

The use of exoskeletons, another innovative technology, to reduce work-related musculoskeletal disorders has recently gained momentum in the construction safety domain. Cho et al. (2018) and Wang et al. (2017) examined the potential usefulness of exoskeletons in construction safety and developed a passive exoskeleton for lifting operations in construction. Another category of safety technology that has also received substantial attention in construction research is wearable sensing devices (WSDs). WSDs, such as smart helmets and smart vests, are considered effective safety tools due to their inexpensive price and high usability (Awolusi et al. 2018). Choi et al. (2017) evaluated the key features that influence the adoption of WSDs and found that the invasion of privacy due to the consistent location tracking on the jobsite is perceived by workers as a primary factor hindering acceptance of such a technology in practice. The study also found that prior experience with and ease of use of WSDs are primary factors facilitating the adoption of such a technology. Table 1 provides a summary of safety technologies and their benefits.

Given that the presence of construction safety technologies among available literature does not imply the actual use of the technologies, it is essential to investigate the actual use of the safety technologies in the construction industry. A study conducted by Borhani (2016) suggested that although the use of emerging technologies for safety management only began in 2009, the rate of use apparently increased between 2013 and 2016. Moreover, Borhani's study indicates that the top three technologies used by construction contractors are safety-specific software applications, information and communication technology (ICT) devices, and BIM. A recent report released by SmartMarket (2017) assessed the current state of construction safety management in the United States. Of the 334 participating contractors, approximately 50% indicated previous or current use of BIM for safety management purposes. Of those contractors using BIM, 69% indicated that BIM had a positive impact on worker safety. Eighty-two percent of respondents indicated that BIM's ability to assist safety managers and workers to identify potential site hazards prior to construction commencement was the primary feature of BIM with respect to safety. In terms of the application of emerging safety technologies such as WSDs, laser scanning, robotics, virtual reality, photogrammetry, and drones, 62% of the sampled construction contractors do not use any of these technologies for performing safety activities (SmartMarket 2017). To be specific, only 13%, 14%, and 21% of participants indicated using WSDs, laser scanning, and drones, respectively, for safety applications. However, the same responding contractors indicated that WSDs, laser scanning, and drones could have a significant positive impact on worker safety, but they have yet to use such technologies for safety management.

Point of Departure

Based on the previous review of recent studies, there is currently an influx of technologies that could be applied in worker safety management. However, only a limited number of contractors use these technologies for managing construction worker safety; that is, a significant portion of construction organizations does not implement commercially-available safety technologies. Some reasons behind the low safety technology adoption rate include the relatively high cost of such technologies, worker resistance to using new technologies, technology uncertainty, and technology interoperability with existing processes (Borhani 2016). Furthermore, the construction industry lacks empirically supported multi-criteria decision-support tools for guiding complex decision-making such as safety technology adoption (Haymaker et al. 2013). According to Haymaker et al. (2013), the decision process models currently in use in the construction industry rely primarily on using personal experience which, in itself, is insufficient to arrive at a sound decision. It is argued in the literature that utilizing empirically-backed decision-support tools leads to congruent decision-making (Suhr 1999), thereby ensuring reduced cost and time associated with integrating the safety technology into work processes (Nnaji et al. 2018). As multi-criteria analysis has the capacity to influence the successful adoption of a safety technology, it is essential to provide practical and sound multi-criteria decision-making tools to support technology adoption decisions. A review of available literature showed that most technology adoption research in the construction industry is focused on utilizing adoption (or acceptance) models to understand factors that influence the actual use of a technology (Lee et al. 2015; Choi et al. 2017; Son et al. 2012; Lee and Yu 2016; Park et al. 2012), rather than the decision-making process to determine whether a technology should be adopted or not. As a result, these studies suffer from a lack of extended utility beyond the limited scope of the specific technology being assessed and a lack of capacity to inform multi-criteria decision-making, which is a critical need in the construction industry.

To the best of the researchers' knowledge, only a limited number of studies have focused on developing decision-support tools for integrating safety technology into construction operations. However, past studies that developed multi-criteria decision-support tools



Table 1. Benefits associated with using safety technologies

Building information modeling: Utilized in the design, planning, construction, and maintenance phases to:
• Improve hazard recognition and identification.
• Enhance safety planning, awareness, or communication.
• Improve safety inspections.
Sources: Karakhan et al. (2018), Martinez-Aires et al. (2018), and Tang et al. (2019)

Unmanned aerial vehicles: Utilized in the planning, construction, and maintenance phases to:
• Improve hazard recognition and identification.
• Enhance safety planning, awareness, or communication.
• Improve safety inspections.
Sources: Irizarry et al. (2012), Gheisari et al. (2014), Şerban et al. (2016), Alizadehsalehi et al. (2018), Zhou and Gheisari (2018), Gheisari and Esmaeili (2019), and Gheisari et al. (2018)

Wearable sensing devices: Utilized in the construction and maintenance phases to:
• Improve hazard recognition and identification.
• Enhance safety planning, awareness, or communication.
• Mitigate safety and health hazards.
Sources: Guo et al. (2017), Jebelli et al. (2015, 2018a, b, c), Jebelli and Lee (2019), Choi et al. (2017), Kanan et al. (2018), Hwang and Lee (2017), Antwi-Afari et al. (2018), Awolusi et al. (2018), Yang et al. (2019), Ryu et al. (2019), and Hwang et al. (2018)

Robotics and automation: Utilized in the construction and maintenance phases to:
• Enable hazard prevention or substitution.
• Enhance safety planning, awareness, or communication.
• Mitigate safety and health hazards.
Sources: Karakhan and Alsaffar (2019), Li et al. (2018), Cho et al. (2018), and Wang et al. (2017)

Virtual reality and augmented reality: Utilized in the design, planning, construction, and maintenance phases to:
• Improve hazard recognition and identification.
• Enhance safety planning, awareness, or communication.
Sources: Kim et al. (2017), Li et al. (2018), and Perlman et al. (2014)

Quick response codes: Utilized in the construction and maintenance phases to:
• Improve hazard recognition and identification.
• Enhance safety planning, awareness, or communication.
Sources: Karakhan and Alsaffar (2019), Awolusi et al. (2018), and Tang et al. (2019)

Radio frequency identification: Utilized in the planning, construction, and maintenance phases to:
• Improve hazard recognition and identification.
• Enhance safety planning, awareness, or communication.
• Improve physical workplace conditions.
• Improve on-site safety compliance.
Sources: Karakhan and Alsaffar (2019), Zhang et al. (2019a), and Awolusi et al. (2018)

Work zone safety technologies: Utilized in the construction and maintenance phases to:
• Improve hazard recognition and identification.
• Enhance safety planning and communication.
Sources: Awolusi and Marks (2019) and Nnaji et al. (2018)

Camera network systems: Utilized in the construction and maintenance phases to:
• Enhance safety planning, awareness, or communication.
• Improve physical workplace conditions.
• Improve on-site safety compliance.
Sources: Zhang et al. (2019b), Zhu et al. (2017), and Park and Brilakis (2012)

focused on choosing between different available safety technologies, which is considered part of the implementation phase rather than the adoption phase of the technology integration process (Rankin and Luther 2006). For example, Nnaji et al. (2018) provided a systematic approach to perform multi-criteria decision-making analysis to select between different safety technologies for highway construction projects. Although noteworthy, utilizing a decision-support tool in the adoption phase (before implementation starts) is more critical and logically comes first (i.e., precedes implementation) in the full cycle of technology integration (Grover and Goslar 1993; Rankin and Luther 2006). Applying a decision-support tool in the adoption phase would ensure that only alternatives of a particular safety technology type that meet a pre-defined set of criteria are considered in the implementation phase, thereby ensuring a more effective technology integration process. Given the impetus for increased technology adoption and the projected growth in technology development in construction, developing empirically driven decision-support tools that guide industry professionals involved in technology integration in the construction industry is essential. The aim of the present study is to fill this gap in knowledge and practice by developing a statistically robust process for assessing the potential adoption of a safety technology.

Research Methodology

To achieve the aim and objectives of the present study, the researchers adopted a mixed-methods research approach that included a literature review, industry survey, and case study. As depicted in Fig. 1, multiple data analysis techniques were then used to extract useful information from each data source. These analysis techniques comprised the following: (1) mean scoring ranking and factor analysis of the survey data; (2) fuzzy synthetic evaluation and structural equation modeling of the survey data using the theory of planned behavior (TPB), technology adoption model (TAM), and



Fig. 1. Research methodology and outputs.

multi-criteria decision-making (MCDM) models as guidance; and (3) descriptive statistical analysis of the case study data. The results of all analyses were then integrated to create the Construction Safety Technology Adoption Index. Using multiple research methods is expected to improve the rigor and quality of the research outcome as the methods chosen complement each other. The present study's overall research process is depicted in Fig. 1. However, due to page limitations, the present study does not include information on the predictor validation process. This information will be reported elsewhere.

Review of Literature to Identify Technology Adoption Predictors

An integrative literature review was performed to identify factors influencing technology adoption decision-making in the construction industry. The authors systematically probed different databases such as Scopus and Web of Science to identify studies that highlighted potential technology adoption predictors. Thirty-nine of the 78 articles identified were reviewed due to their relevance to the present study. The results of the review process revealed 26 technology adoption predictors. These predictors were then confirmed and validated by a panel of subject-matter experts and a group of industry professionals through a survey. A description of the analysis process used to identify the safety technology adoption predictors in construction is available in a prior publication (Nnaji et al. 2019a). For the purpose of the present paper, the identified predictors are summarized in the Appendix (Nnaji et al. 2019a). The Appendix also provides more information about supporting references for each predictor of safety technology adoption. The authors further validated the predictive strength of the predictors using several technology acceptance models (Mathieson et al. 2001; Chuttur 2009). The validation process and results are not included in this publication due to page limit requirements. The identified and validated predictors were used to develop the intended C-STAI for measuring the potential successful adoption of a safety technology, as described in the following section.

Industry Survey to Determine the Importance of Predictors and Assessment Items

An industry survey was designed by the researchers and approved by Oregon State University's (OSU) Institutional Review Board (IRB) to establish the level of importance of each identified predictor. The survey consisted of three primary sections. The first section collected demographic information about the respondents. The second part asked the respondents to rate the level of importance of the 26 technology adoption predictors identified through the literature review. As part of the survey, the participants were provided a definition of safety technology along with multiple examples of safety technologies to ensure that the participants were aware of the intent of the survey. In the last part, the respondents were asked to indicate their level of agreement with technology acceptance assessment items (e.g., "It will be easy to learn how to use wearable sensing devices.") for a case study that is described as follows. A pilot survey was launched prior to the dissemination of the survey questionnaire to determine face and internal validity. Subsequently, the survey questionnaire was distributed nationwide to elicit responses from construction stakeholders. Approximately 2,200 survey questionnaires were distributed to construction personnel via email. The potential participants were identified using two primary sources: an OSU Civil and Construction Engineering alumni listserv (approximately 700 potential participants) and a construction professional database managed by Qualtrics Analytics (approximately 1,500 potential participants). Qualtrics maintains an extensive and robust database of professionals in different industries including construction (Qualtrics 2018). Qualtrics Panel was selected as a distribution channel to ensure that data was acquired from different geographical regions throughout the United States, thereby making it possible to generalize the results of the present study to the entire US construction industry. A total of 337 survey responses were received, of which 257 (76%) were sufficiently complete. Only the complete responses (n = 257) were used in the analysis.



Data Analysis Tools

Quantitative data analysis tools were implemented to develop an empirically supported decision-support tool. In particular, the mean scoring ranking technique, factor analysis, and fuzzy synthetic evaluation (FSE) were utilized to analyze the collected data. To improve reliability, the research methodology was adapted from previous studies conducted by Mu et al. (2014), Osei-Kyei and Chan (2017), and Hu et al. (2016).

Mean Scoring Ranking Technique

Multiple studies within construction management have relied on a mean scoring (MS) method to establish the relative importance of a set of identified factors (Osei-Kyei and Chan 2017). This method was utilized to analyze the data collected from the participants in the present study. Specifically, the MS method was used to determine the importance of each of the 26 identified technology adoption predictors. The MS for each predictor was calculated using a 5-point Likert scale ranging from 1 to 5 where "1" represents not important, "3" represents somewhat important, and "5" represents extremely important. The MS for each predictor was calculated using Eq. (1):

MS = (Σ_{i=1}^{5} s_i p_i) / (Σ_{i=1}^{5} p_i)    (1)

where s_i = weight assigned to the ith response (s_i = 1, 2, 3, 4, and 5 for i = 1, 2, 3, 4, and 5); p_i = frequency of the ith response; and i = response category 1, 2, 3, 4, and 5 ranging from the least important to the most important.

Factor Analysis

Given the impracticality of developing an adoption index using a large number of standalone technology adoption predictors, it is essential to categorize these predictors into manageable groups. Utilizing a widely acceptable statistical approach to categorize these predictors enhances the validity of the intended C-STAI. In the present study, factor analysis (FA) was used to statistically identify potential groups of technology adoption predictors. FA is considered an effective statistical tool for identifying small groups from many interrelated variables (Lee and Yu 2016). In addition to its capacity to identify and interpret non-correlated element clusters, FA excels in explaining complex phenomena (Osei-Kyei and Chan 2017). This capability is especially valuable given the expected complexity associated with developing an adoption index for MCDM.

Fuzzy Synthetic Evaluation (FSE)

Researchers have recently encouraged the use of FSE as an effective statistical tool for developing measuring tools such as multi-criteria decision-support frameworks (Hu et al. 2016; Osei-Kyei and Chan 2017). FSE is a robust modeling technique used in management and engineering research to quantify multi-levels, multi-evaluations, and multi-attributes (Hu et al. 2016; Osei-Kyei and Chan 2017). For instance, FSE was relied on as the primary tool in studies related to risk assessment, project performance assessment, project delivery assessment, and performance benchmarking (Osei-Kyei and Chan 2017; Ameyaw and Chan 2015). FSE's strength lies in its ability to deal with issues associated with subjective, vague, and imprecise information (objectify subjective judgment) and its robustness against inadequacies in sample size and sample techniques (Pedrycz et al. 2011). Typically, applying the FSE modeling process involves six primary steps (Hu et al. 2016; Osei-Kyei and Chan 2017):
1. Establish a basic set of predictors, Π = {p_1, p_2, p_3, ..., p_n}, where p = identified technology adoption predictors; and n = number of technology adoption predictors.
2. Label the set of rating alternatives as L = {L_1, L_2, L_3, ..., L_n}, where L = rating measurement scale; and n = number of categories in the measurement scale. The present study adopted a 5-point Likert scale (L_1, L_2, L_3, L_4, and L_5) where 1 = very low, 3 = moderate, and 5 = very high.
3. Determine a weighting (W) for each technology adoption predictor using Eq. (2):

   W_i = M_i / Σ_{i=1}^{5} M_i,   0 ≤ W_i ≤ 1,   Σ W_i = 1    (2)

   where W_i = weighting; M_i = mean score of a predictor; and Σ W_i = 1 is the mean rating sum.
4. Apply an evaluation matrix (R_i) for every technology adoption predictor:

   R_i = (r_ij)_{m×n}

   where r_ij = degree to which alternative L_j satisfies the criterion f_j; and m = mean score.
5. Evaluate the weighting vector and fuzzy evaluation matrix using Eq. (3):

   D = W_i ∘ R_i    (3)

   where D = final FSE evaluation matrix; and ∘ = fuzzy composition operator.
6. Normalize the FSE matrix to develop the technology adoption index (TAI) using Eq. (4):

   TAI = Σ_{i=1}^{5} D × L    (4)

   where D = final FSE evaluation matrix; and L = the scale measurement for the set of grade categories.

Data Analysis and Results

Demographic Information of Survey Participants

Review of the geographical distribution of the survey respondents indicates that responses were received from 46 states spread across the five regions in the United States (Northeast, Southeast, Southwest, Midwest, and West). Approximately 32% of the participants have 20 or more years of experience in the construction industry. The level of experience observed in the data is particularly important since feedback from individuals with extensive experience is valuable for framework development. Only approximately 3% of the respondents had a year or less of work experience in the construction industry at the time of the survey.

The survey asked the participants how often they are involved in making purchasing decisions, including decisions to adopt innovative ideas such as new technology. More than 50% of the respondents indicated that they make such decisions frequently, while only 7% of the respondents are not directly involved in making purchasing and adoption decisions. The high level of participation in decision-making to adopt innovative strategies along with the extensive experience level of the respondents provides confidence in and reliability to the survey results.
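As an illustration of Eqs. (1) and (2) described above, the short sketch below computes a predictor's mean score from a response-frequency distribution and the FSE weights of the predictors within a category. This is a minimal sketch, not the study's analysis code: the frequency vector approximates the 3%/3%/14%/37%/43% distribution reported later for "Level of complexity," the category means are the Technology-category values listed in Table 4, and the function and variable names (mean_score, fse_weights, freq) are illustrative only.

```python
# Minimal sketch of Eq. (1) (mean score from a response-frequency distribution)
# and Eq. (2) (FSE weighting of predictors within a category).

def mean_score(freq):
    """Eq. (1): MS = sum(s_i * p_i) / sum(p_i), where freq[i] is the number of
    respondents who selected rating i + 1 on the 5-point scale."""
    return sum((i + 1) * p for i, p in enumerate(freq)) / sum(freq)

def fse_weights(mean_scores):
    """Eq. (2): W_i = M_i / sum(M_i); the resulting weights sum to 1."""
    total = sum(mean_scores)
    return [m / total for m in mean_scores]

# Approximate frequencies for "Level of complexity" (3%, 3%, 14%, 37%, 43% of 257)
freq = [8, 8, 36, 95, 110]
print(round(mean_score(freq), 2))      # about 4.13

# Mean ratings of the nine technology-category predictors (Table 4)
tech_means = [4.13, 4.17, 4.08, 4.13, 4.18, 4.18, 4.19, 4.33, 3.93]
weights = fse_weights(tech_means)
print(round(weights[0], 3))            # about 0.111, the weight of "Level of complexity"
print(round(sum(weights), 3))          # 1.0
```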



Selection of Critical Technology Adoption Predictors

Results from the descriptive analysis of the collected data indicate that all predictors were rated above 3 (i.e., either highly or extremely important) on a 5-point Likert scale. Table 2 shows the mean rating and the rank (based on the mean rating) of each predictor.

As shown in Table 2, technology reliability, proven effectiveness, level of required training, technology durability, and having the required features are the five most important predictors as viewed by the respondents, while technology brand is the least important out of the 26 predictors. Although all 26 factors were rated relatively high (i.e., mean rating above 3), the development of an index requires only critically important variables to be included in the index (Chan et al. 2010). Hence, only predictors with a relatively high level of importance should be selected. One method for determining critical factors is through normalizing the mean values obtained from a descriptive analysis (Ameyaw and Chan 2015). Normalization typically involves assessing the importance of each predictor relative to other predictors being evaluated [i.e., normalized value = (actual value − min. value)/(max. value − min. value)].

Consistent with past studies (Xu et al. 2010; Chan et al. 2015), predictors with normalization values equal to or above 0.5 are retained while predictors with normalization values below 0.5 are dropped. Therefore, the predictors "Potential level of resistance from employees," "Industry-level change requires technology adoption," "Capital cost of technology," "Direct competitors adopt similar technology," "Organization's technology budget," "Partners adopt similar technology," and "Technology brand" were dropped from the initial list of 26 predictors, leaving only 19 predictors. The 19 key predictors provide the information required to develop the C-STAI using FSE. Basically, these 19 key predictors are the output of the first step in the FSE process.

Table 2. Mean rating and criticality of technology adoption predictors (n = 257)
Technology adoption predictors  Mean rating  SD  Rank  Normalization value
Technology reliability  4.33  0.87  1  1.00
Proven technology effectiveness  4.19  0.89  2  0.87
Level of training required  4.18  0.95  3  0.86
Technology durability  4.18  0.90  3  0.86
Having the required features  4.17  0.85  5  0.85
Level of complexity  4.13  0.98  6  0.81
Level of technical support required  4.13  0.89  6  0.81
Level of technical support available  4.08  0.89  8  0.76
Client demand to use a technology  4.01  0.99  9  0.70
Triability  3.99  0.92  10  0.68
Observability  3.98  0.91  11  0.67
Organization culture  3.97  1.01  12  0.66
Competitive advantage  3.93  0.93  13  0.62
Versatility  3.93  0.97  13  0.62
Potential cost savings  3.90  1.08  15  0.59
Peer influence  3.88  0.92  16  0.57
Top management degree of involvement  3.87  1.03  17  0.56
Level of compatibility with current processes  3.83  1.07  18  0.52
Government policy and regulation  3.81  1.08  19  0.50
Potential level of resistance from employees  3.78  1.03  20  0.48
Industry-level change requires technology adoption  3.78  0.97  20  0.48
Capital cost of technology  3.67  1.02  22  0.37
Direct competitors adopt similar technology  3.38  1.14  23  0.10
Organization's technology budget  3.36  1.00  24  0.08
Partners adopt similar technology  3.29  0.93  25  0.01
Technology brand  3.28  1.18  26  0.00

Factor Analysis

Two primary requirements have to be met prior to conducting a valid FA. First, it is essential to determine variable-to-sample ratio sufficiency. Past research in construction management suggests that a ratio of 1:5 (i.e., five samples to each studied variable) is required (Hair et al. 1998; Lingard and Rowlinson 2006). The variable-to-sample ratio for the present study is approximately 1:13 (257 respondents and 19 variables), exceeding the minimum threshold proposed by previous studies. Second, it is paramount to assess the relationship strength among the variables (Pallant 2001). The existing literature suggests evaluating the relationships (correlation) using Bartlett's test of sphericity and the Kaiser-Meyer-Olkin (KMO) index (Xu et al. 2010) for sampling adequacy. To meet Bartlett's test of sphericity, a p-value of less than 0.05 is required. The obtainable KMO value ranges from 0 to 1, where values greater than 0.5 are considered sufficient (Norusis 2008). The analysis revealed statistically significant evidence (p-value = 0.001) of a KMO value above the threshold (KMO = 0.931), satisfying both requirements.

Using the SPSS software package 24 (SPSS, IBM, Armonk, New York), a principal component extraction with varimax rotation was executed. Varimax rotation was selected due to its ability to simplify interpretation and the lack of high inter-correlation between components (Pallant 2007). Following seven iterations, three categories, which accounted for 59% of the variance in responses received, were extracted, as shown in Table 3. These categories were named Technology Predictors (Category 1), Organizational Predictors (Category 2), and External Predictors (Category 3) based on the characteristics of the predictors within each category. In addition, the categorization chosen as an outcome of the FA is similar to the technology-organization-environment framework extensively used in information systems research (Baker 2012). According to Hair et al. (2010), each category should have an eigenvalue greater than 1.0. Based on the analysis results (shown in Table 3), the eigenvalues for all categories are greater than 1.0.

The correlation between variables (the predictors in this case) within the categories is called the factor loading. For instance, the level of correlation between "Having the required features" and "Category 1" is 0.635. According to Norusis (2008), a factor loading above 0.5 suggests acceptable correlation. Only one predictor ("Level of complexity") out of the 19 has a factor loading below 0.5. Although below the threshold, the factor loading of "Level of complexity" is relatively high (0.473) and was therefore retained.

In addition, a reliability test was conducted using Cronbach's coefficient alpha model to ensure internal consistency among the predictors. First, data reliability was assessed among the predictors within each proposed category, followed by an overall reliability test for all 19 predictors. According to Nunnally (1978) and Hair et al. (2010), an alpha coefficient greater than 0.6 suggests acceptable internal consistency among responses received from participants. The alpha coefficient values for the Technology Predictors, Organizational Predictors, and External Predictors categories are 0.904, 0.872, and 0.576, respectively. The overall Cronbach's alpha of the categories is 0.934. Although not above the recommended 0.6 threshold, the alpha value for the External Predictors category is relatively high (0.576) and was therefore considered sufficient.
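The normalization screen and the Cronbach's alpha check described above can be sketched in a few lines of code. The mean ratings below are a subset of Table 2 that includes the overall maximum (4.33) and minimum (3.28), so the normalized values reproduce the table; the rating matrix used for the alpha example is hypothetical, and the helper names are illustrative only.

```python
# Sketch of the min-max normalization screen (0.5 cutoff) and a Cronbach's
# alpha check for internal consistency.
import numpy as np

def normalize(means):
    """normalized value = (actual - min) / (max - min)"""
    lo, hi = min(means.values()), max(means.values())
    return {k: (v - lo) / (hi - lo) for k, v in means.items()}

# Subset of Table 2 mean ratings (includes the overall max and min, 4.33 and 3.28)
means = {
    "Technology reliability": 4.33,
    "Government policy and regulation": 3.81,   # normalizes to about 0.50
    "Capital cost of technology": 3.67,         # normalizes to about 0.37
    "Technology brand": 3.28,
}
norm = normalize(means)
retained = [name for name, v in norm.items() if v >= 0.5]
print(retained)   # predictors at or above the 0.5 cutoff are kept

def cronbach_alpha(ratings):
    """ratings: respondents x items array of Likert scores."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]
    item_var = ratings.var(axis=0, ddof=1).sum()
    total_var = ratings.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
fake = rng.integers(3, 6, size=(30, 9))   # hypothetical 30 x 9 rating matrix
# Random, independent items give an alpha near zero; real survey scales
# should exceed the 0.6 threshold cited above.
print(round(cronbach_alpha(fake), 2))
```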



Table 3. Categorization of safety technology adoption predictors
Number Adoption predictors Factor loadings Eigenvalue PVE CPVE
Category 1: Technology predictors — 8.831 25.629 25.629
TP1 Level of complexity 0.473 — — —
TP 2 Having the required features 0.635 — — —
TP 3 Level of technical support required 0.661 — — —
TP 4 Level of technical support available 0.713 — — —
TP 5 Level of training required 0.575 — — —
TP 6 Technology durability 0.668 — — —
TP 7 Proven technology effectiveness 0.730 — — —
TP 8 Reliability 0.801 — — —
TP 9 Versatility 0.650 — — —
Category 2: Organizational predictors — 1.323 22.091 47.720
OP 1 Potential cost savings 0.661 — — —
OP 2 Compatibility with current processes 0.701 — — —
OP 3 Competitive advantage derived from 0.524 — — —
using the technology
OP 4 Top management participation 0.688 — — —
OP 5 Organization culture 0.657 — — —
OP 6 Peer influence 0.688 — — —
OP 7 Observability 0.673 — — —
OP 8 Triability 0.570 — — —
Category 3: External predictors — 1.080 11.414 59.133
EP 1 Government policy and regulation 0.731 — — —
EP 2 Client demand 0.677 — — —
Note: PVE = percentage of variance explained; and CPVE = cumulative percentage of variance explained.
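For readers without access to SPSS, a three-factor, varimax-rotated solution of the kind summarized in Table 3 can be approximated with scikit-learn. This is only a rough stand-in under stated assumptions: the study used principal component extraction in SPSS 24, whereas sklearn's FactorAnalysis fits a maximum-likelihood factor model, and the 257 x 19 rating matrix generated here is hypothetical, so the loadings will not match Table 3.

```python
# Approximate stand-in for the extraction step (not the study's SPSS workflow):
# fit a three-factor model with varimax rotation to a hypothetical rating matrix.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
X = rng.integers(1, 6, size=(257, 19)).astype(float)   # hypothetical 257 x 19 ratings

fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
fa.fit(X)

loadings = fa.components_.T        # 19 x 3 matrix of factor loadings
print(np.round(loadings, 2))       # compare the pattern of loadings across factors
```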

Develop Weightings for Technology Adoption Predictors and Categories

To develop the weighting for each technology adoption predictor and the three primary categories, the researchers applied the process described in the third step of the FSE. The mean rating needed for applying Eq. (3) is extracted from responses received from the survey participants. To calculate the weight of each predictor, the total MS of each category should first be calculated. The total MS is derived by summing all of the mean scores of the predictors within each category. The weight for each technology adoption predictor is then derived by dividing the mean score of a specific predictor by the total MS. For instance, the weight of "level of complexity" is derived by dividing its mean rating (4.13) by the sum of the mean ratings of the predictors in Category #1 (37.32), which yields 0.111, as follows:

W_tap1 = 4.13 / (4.13 + 4.17 + 4.08 + 4.13 + 4.18 + 4.18 + 4.19 + 4.33 + 3.93) = 4.13 / 37.32 = 0.111

For the weight of each category, the total MS of a given category is divided by the sum of all MSs (i.e., the sum of the three "Total MS for each category" values). For instance, calculating the weight of Category #1 is illustrated in the following equation:

W_Ctap1 = 37.32 / (37.32 + 31.35 + 7.82) = 37.32 / 76.49 = 0.488

This process was repeated to develop the weights for each predictor and category. Table 4 lists the mean ratings of the predictors, the total MS for each category, and their respective weights.

Determine Membership Function for Technology Adoption Predictors and Categories

Typically, the FSE membership function is derived at two or three levels depending on the objective of the study (Mu et al. 2014; Ameyaw and Chan 2015). A membership function is a value ranging between 0 and 1 that represents the degree of an element's membership in a fuzzy set (Osei-Kyei and Chan 2017). The levels refer to the need to derive a membership function for a subset (Level 2) prior to developing a membership function for a collective (Level 1). In the present study, Level 2 refers to membership functions for the technology adoption predictors, while Level 1 refers to the membership function for the categories. The Level 2 membership function is obtained from the response distribution received from each survey participant, and the distribution is based on the respondents' ratings of predictors using the previously described 5-point Likert scale. For instance, 3% of the participants rated "complexity of a technology" as least important when adopting a safety technology, while 43% considered "complexity of technology" as extremely important (Table 5). The membership function for T1 (Predictor #1 in the Technology Category) is shown as follows:

MF_T1 = 0.03/L_1 + 0.03/L_2 + 0.14/L_3 + 0.37/L_4 + 0.43/L_5

where L = number values in the measurement scale [L_1 = 1 (not important); and L_5 = 5 (extremely important)].

The membership function for T1 is stated as (0.03, 0.03, 0.14, 0.37, and 0.43), as shown in Table 5. A similar process is implemented to derive the membership functions of the remaining 18 technology adoption predictors.

Following the creation of the membership functions for Level 2, Eq. (3) (D = W_i ∘ R_i) is then used to derive the membership function for Level 1. W_i represents the weightings of all predictors within each category, and R_i represents the fuzzy matrix. For example, the membership function of the external predictors can be calculated using the following matrix:


(0.49, 0.51) ∘ [ 0.03  0.10  0.23  0.32  0.32 ]
               [ 0.03  0.05  0.17  0.39  0.36 ]
= (0.49 × 0.03 + 0.51 × 0.03, 0.49 × 0.10 + 0.51 × 0.05, 0.49 × 0.23 + 0.51 × 0.17, 0.49 × 0.32 + 0.51 × 0.39, 0.49 × 0.32 + 0.51 × 0.36)
= (0.03, 0.07, 0.20, 0.35, 0.34)

Using the same process, the membership functions for the technology and organizational categories were derived. Table 5 shows the Level 1 and Level 2 membership functions. Next, the TAI for each category is calculated using Eq. (5):

TAI = Σ_{i=1}^{5} D × L    (5)

where D = final FSE evaluation matrix; and L = the scale measurement for the set of grade categories.
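The weighting and composition steps in this worked example can be reproduced with a few lines of NumPy. The inputs are the external-category values reported in Tables 4 and 5; small differences from the published figures (e.g., 0.36 versus 0.35 in the fourth membership value) are rounding artifacts of the two-decimal inputs.

```python
# Reproduces the worked external-category example: predictor weights (Eq. 2),
# Level 1 membership D = W o R (Eq. 3), and the category TAI (Eq. 5).
import numpy as np

# Mean ratings of the two external predictors (Table 4)
means = np.array([3.81, 4.01])        # government policy, client demand
W = means / means.sum()               # about (0.487, 0.513), i.e., the 0.49/0.51 above

# Level 2 membership functions of the external predictors (Table 5)
R = np.array([
    [0.03, 0.10, 0.23, 0.32, 0.32],   # government policy and regulation
    [0.03, 0.05, 0.17, 0.39, 0.36],   # client demand
])

D = W @ R                             # Level 1 membership function of the category
print(np.round(D, 2))                 # close to the reported (0.03, 0.07, 0.20, 0.35, 0.34)

L = np.arange(1, 6)                   # grade scale 1-5
tai_external = float(D @ L)
print(round(tai_external, 2))         # about 3.90-3.91, matching TAI_C3 up to rounding
```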

Table 4. Mean rating and weights of technology adoption predictors and categories
Number Predictors Mean rating Weight of predictor Total MS for each category Category weight
Category 1: Technology predictors — — 37.32 0.488
1 Level of complexity 4.13 0.111 — —
2 Having the required features 4.17 0.112 — —
3 Level of technical support required 4.08 0.109 — —
4 Level of technical support available 4.13 0.111 — —
5 Level of training required 4.18 0.112 — —
6 Technology durability 4.18 0.112 — —
7 Proven technology effectiveness 4.19 0.112 — —
8 Reliability 4.33 0.116 — —
9 Versatility 3.93 0.105 — —
Category 2: Organizational predictors — — 31.35 0.410
1 Potential cost savings 3.90 0.124 — —
2 Compatibility with current processes 3.93 0.125 — —
3 Competitive advantage derived from 3.97 0.127 — —
using the technology
4 Top management participation 3.87 0.123 — —
5 Organization culture 3.83 0.122 — —
6 Peer influence 3.88 0.124 — —
7 Observability 3.98 0.127 — —
8 Triability 3.99 0.127 — —
Category 3: External predictors — — 7.82 0.102
1 Government policy and regulation 3.81 0.487 — —
2 Client demand 4.01 0.513 — —
Total group MS — — 76.49 —

Table 5. Level 1 and 2 membership functions


Category (Level 1)  Predictors (Level 2)  Weight of predictors  Membership function at Level 2 (predictors, scale 1-5)  Membership function at Level 1 (categories, scale 1-5)
Category 1: Technology — — — — — — 0.02 0.04 0.14 0.39 0.42
1 Level of complexity 0.11 0.03 0.03 0.14 0.37 0.43 — — — — —
2 Having the required features 0.11 0.00 0.05 0.13 0.42 0.40 — — — — —
3 Level of technical support required 0.11 0.01 0.04 0.16 0.44 0.35 — — — — —
4 Level of technical support available 0.11 0.02 0.04 0.14 0.42 0.39 — — — — —
5 Level of training required 0.11 0.02 0.06 0.16 0.36 0.45 — — — — —
6 Technology durability 0.11 0.02 0.04 0.11 0.41 0.42 — — — — —
7 Proven technology effectiveness 0.11 0.01 0.04 0.14 0.38 0.44 — — — — —
8 Reliability 0.12 0.02 0.02 0.10 0.34 0.53 — — — — —
9 Versatility 0.11 0.02 0.06 0.21 0.39 0.32 — — — — —
Category 2: Organization — — — — — — 0.02 0.06 0.20 0.40 0.32
1 Potential cost savings 0.12 0.03 0.10 0.16 0.36 0.35 — — — — —
2 Compatibility with current processes 0.13 0.01 0.08 0.19 0.43 0.30 — — — — —
3 Competitive advantage 0.13 0.02 0.07 0.18 0.37 0.36 — — — — —
4 Top management participation 0.12 0.04 0.05 0.24 0.37 0.31 — — — — —
5 Organization culture 0.12 0.04 0.07 0.19 0.40 0.30 — — — — —
6 Peer influence 0.12 0.01 0.06 0.25 0.40 0.28 — — — — —
7 Observability 0.13 0.02 0.04 0.20 0.42 0.32 — — — — —
8 Triability 0.13 0.02 0.05 0.19 0.42 0.33 — — — — —
Category 3: External — — — — — — 0.03 0.07 0.20 0.35 0.34
1 Government policy and regulation 0.49 0.03 0.10 0.23 0.32 0.32 — — — — —
2 Client demand 0.51 0.03 0.05 0.17 0.39 0.36 — — — — —



The TAI values for the three categories (C1 = Technology, C2 = Organizational, and C3 = External) are calculated as follows:

TAI_C1 = (0.02, 0.04, 0.14, 0.39, 0.42) × (1, 2, 3, 4, 5) = 4.15
TAI_C2 = (0.02, 0.06, 0.20, 0.40, 0.32) × (1, 2, 3, 4, 5) = 3.92
TAI_C3 = (0.03, 0.07, 0.20, 0.35, 0.34) × (1, 2, 3, 4, 5) = 3.91

Developing a Construction Safety Technology Adoption Index (C-STAI)

A linear, additive model was adopted to develop the C-STAI. An additive model was considered appropriate for this study given the lack of correlation between the variables and among the categories, which is a primary requirement for linear models. In addition, previous studies adopted a similar approach when developing a benchmarking/success index using FSE (Osei-Kyei and Chan 2017). To develop the C-STAI, the TAIs of each category of predictors were normalized so they sum up to unity (Osei-Kyei and Chan 2017). As shown in Table 6, technology-based predictors have the most impact on the potential adoption of a safety technology when aggregating the weights for each category. This finding is consistent with past studies conducted by Rogers (2003) and Venkatesh and Davis (2000) that suggest a technology's characteristics play a fundamental role in the adoption and eventual diffusion of the technology. Table 6 reveals that the TAIs for the organizational- and external-based predictors are approximately equal. However, the maximum difference between the three categories is only 0.24, suggesting that all three categories are equally important.

The index for measuring the potential acceptance of a safety technology is expressed by Eq. (6):

C-STAI = (C1 Coefficient × Technology Predictors) + (C2 Coefficient × Organizational Predictors) + (C3 Coefficient × External Predictors)    (6)

where Technology Predictors, Organizational Predictors, and External Predictors are the aggregated average values of the predictors within each category. The values of the C1, C2, and C3 Coefficients are as shown in Table 6. Implications of the equation and how it can be used in practice are discussed in the next section.

Table 6. Technology adoption index (TAI) and coefficient for each category
ID number  Category  Technology adoption index (TAI)  Coefficient
C1  Technology predictors  4.15  0.346
C2  Organizational predictors  3.92  0.327
C3  External predictors  3.91  0.327
Total  11.98  1.00
Note: Coefficient = (TAI for category / Σ TAI for all categories).

Construction Safety Technology Adoption Assessment Protocol

In its current state, the results from the study present some implications for construction researchers. That is, the information could provide some insight into the factors that influence and predict safety technology adoption. However, the results do not provide a much-needed practical assessment process for future technology adoption decision-making. To this end, the researchers developed a simplified safety technology adoption protocol to help construction companies interested in technology integration make a practical and user-friendly assessment of whether to adopt a particular technology.

To develop the protocol, each key predictor within its category was operationalized. The predictors were altered to be integrated into the assessment tool in the form of statements that are easy to understand and respond to (Table 7). The questions were designed in line with past studies that operationalized critical factors that influence decision-making and predict safety outcomes (Goodrum et al. 2011; Chen and Manley 2014; Alexander et al. 2017). For instance, predictors such as level of complexity, cost savings potential, and regulation were operationalized to "The technology is easy to use," "This technology has the potential to reduce project cost," and "Using this technology will help meet stipulated regulatory requirements (insurance, government, etc.)," respectively.

Users of the proposed technology adoption assessment protocol are expected to indicate their level of agreement with the statements in the protocol on a scale of 1-5, where 1 is strongly disagree and 5 is strongly agree. Each category has a maximum allotted number of points that is a multiple of the number of predictors and the maximum assessment score. To provide an example, the maximum possible points associated with the Technology Category is 45 [number of predictors in the category (9) times the maximum score (5)]. The total number of obtainable points is 95 [(5 × 9) + (5 × 8) + (5 × 2)]. The weight for each category is applied by multiplying the aggregated number of points from each category by the appropriate coefficient (0.346 for the Technology Predictors category). The same process is repeated for each category, and then the weighted total of each category is summed to generate the total Adoption Prediction Score (or the C-STAI) of the safety technology being evaluated. The maximum obtainable Adoption Prediction Score is depicted as follows:

C-STAI_max = (0.346 × 45) + (0.327 × 40) + (0.327 × 10) = 31.92

To determine the acceptance potential of a construction safety technology, the technology is subjected to an assessment using the proposed protocol worksheet (Table 7). The result of the assessment is then compared to the maximum obtainable adoption index. It is expected that as the ratio increases up to 100% [when dividing the specific technology's C-STAI by the maximum obtainable C-STAI (31.92)], the chance that the technology will be successfully integrated into a construction organization also increases. An accumulated score of at least 65% indicates that the evaluated technology would likely be successful if adopted (Goodrum et al. 2011). In this case, a score of 20.75 (0.65 × 31.92) or more would be considered an indication of the potential success of a technology if properly adopted and used by a company. That being said, an organization can set its own threshold based on its culture and employee behavior. For instance, a company with a more progressive culture and younger worker population could utilize a lower threshold or place more emphasis (weight) on the External and Technology categories since the organization and its employees are less resistant to using new technologies.

Using the proposed protocol, it is expected that a more robust and systematic technology adoption decision would be achieved. This decision process should include insights from and discussion between top management personnel and actual end-users of the technology. Including both groups ensures that a holistic information gathering and decision-making process is achieved. For instance, while end-users might not understand the impact of a safety technology on the company's bottom line and how that safety technology fits into the company's strategic plan, the end-users provide valuable insight on the technology itself, which is the most vital category based on the statistical analysis results from the present study. Conversely, making an adoption decision without top management input might limit the rate of diffusion that, in turn, stagnates the chances of a safety technology becoming an organizational standard.

Table 7. Safety technology adoption protocol worksheet
Number  Predictors  Statements  Rating (1-5)  Group max score  Category coefficient
Category 1: Technology 45 0.346
1 Complexity The technology is easy to use
2 Required feature The technology has the essential feature(s) to perform specified task
3 Required support Installing and using the technology requires little or no support
4 Available support If required, technical support is readily available
5 Training required Little or no additional training is required
6 Durability Available evidence indicates that this technology has a long shelf life
7 Technology effectiveness There is documented evidence showing the technology is effective
8 Technology reliability There is evidence that the technology consistently meets performance
requirements
9 Technology versatility The technology features improve other performance indicators
(e.g., work quality)
Category 2: Organizational 40 0.327
1 Cost savings This technology has the potential to reduce project cost
2 Peer influence Workers will have no hesitation using the technology
3 Top management Management supports the use of this technology
4 Compatibility This technology is interoperable with existing technologies and will fit
in seamlessly into our current operations
5 Organizational culture The company employees are open to trying new technologies
6 Competitive advantage This technology has a positive impact on the company’s core
competency
7 Technology observability The performance of this technology can be observed before investing
8 Technology triability The technology can be tried prior to purchase
Category 3: External 10 0.327
1 External regulation The technology will help meet stipulated regulatory requirements
(insurance, OSHA)
2 Client demand Using this technology will help meet a client’s need
Adoption prediction score (C-STAI): XX
% Relative to maximum score: XX
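To make the worksheet arithmetic concrete, the sketch below encodes the category sizes and coefficients from Table 7 and scores a filled-in worksheet against the 31.92 maximum and the 65% threshold discussed above. This is a minimal sketch: the example ratings passed to the function are hypothetical and are not the case-study responses reported in Table 9, and the function and constant names are illustrative only.

```python
# Sketch of the C-STAI scoring in Eq. (6): sum each category's ratings, apply
# the category coefficient, and compare against the maximum score and the
# 65% threshold suggested in the paper.

CATEGORIES = {
    # name: (number of statements, category coefficient) from Table 7
    "Technology":     (9, 0.346),
    "Organizational": (8, 0.327),
    "External":       (2, 0.327),
}

MAX_RATING = 5

def c_stai(ratings):
    """ratings: dict mapping category name -> list of 1-5 agreement ratings."""
    score = 0.0
    for name, (n_items, coeff) in CATEGORIES.items():
        values = ratings[name]
        assert len(values) == n_items
        score += coeff * sum(values)
    return score

def max_c_stai():
    return sum(coeff * n * MAX_RATING for n, coeff in CATEGORIES.values())

print(round(max_c_stai(), 2))     # 0.346*45 + 0.327*40 + 0.327*10 = 31.92
threshold = 0.65 * max_c_stai()
print(round(threshold, 2))        # about 20.75

# Hypothetical filled-in worksheet (not the case-study data)
example = {
    "Technology":     [4, 4, 5, 5, 4, 3, 4, 4, 4],
    "Organizational": [4, 3, 5, 4, 4, 5, 4, 4],
    "External":       [4, 4],
}
score = c_stai(example)
print(round(score, 2), "->", round(100 * score / max_c_stai(), 1), "% of maximum")
```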

gathering and decision-making process is achieved. For instance, while end-users might not understand the impact of a safety technology on the company’s bottom line or how that safety technology fits into the company’s strategic plan, the end-users provide valuable insight on the technology itself, which is the most vital category based on the statistical analysis results from the present study. Conversely, making an adoption decision without top management input might limit the rate of diffusion, which, in turn, stagnates the chances of a safety technology becoming an organizational standard.

Application of Construction Safety Technology Assessment Protocol

Given the increased interest regarding WSDs in the construction industry, an illustrative case study example about WSDs was prepared. The case study exemplifies the process of implementing the assessment protocol in practice for safety technology adoption. The researchers interviewed seven construction personnel consisting of one project manager, one safety engineer, two project engineers, one foreman, and two field workers. Due to logistical reasons, the interviewees were not all present in one location at the same time. However, all personnel work for the same general contractor and were involved in the same project at the time the interviews were conducted. Construction industry work experience of the interviewed professionals and practitioners ranged from 2 to 20 years, with an average of 8 years. The researchers developed and distributed a case scenario about a smart vest WSD. The case scenario description included information on the smart vest such as battery life, coverage area, weight, available support, and features. Information on the potential cost-benefit was also provided. A few assumptions, such as the technology being available to observe and use prior to making a decision to adopt it, were made and presented to the participants.

The interviewed professionals and practitioners filled in the assessment protocol worksheet using the information provided, insight on their organization, and personal knowledge of other factors that could influence the adoption of the selected technology. After finishing the assessment, the interviewed professionals and practitioners were asked to answer a short survey involving five questions about the usefulness of the assessment protocol. The responses and average ratings are presented in Table 8. According to the case study participants, the proposed assessment protocol is practical and user-friendly and has the potential to support informed decisions regarding safety technology adoption in construction.

Table 8. Assessment protocol feedback

Verification question | Mean response rating (n = 7)
Are the identified predictors sufficient and practical for safety technology assessment? | 8.5
Are the proposed categories reasonable? | 9.2
Is the proposed assessment process practical? | 7.9
Can this assessment tool support your decision-making? | 8.2
Is the assessment protocol easy to use? | 9.4
Note: 0 = strongly disagree; 5 = average; and 10 = strongly agree.

The participants also provided invaluable feedback for further improvement of the assessment protocol. One participant indicated that the evaluation questions did not fully capture some barriers associated with the use of smart vests, such as privacy and confidentiality of the collected data. Although the assessment tool partially captured these barriers within the context of the Peer Influence predictor, the participant indicated that applicable questions to capture such barriers should be included as a standalone variable. Despite the identified areas for improvement and the small sample size, the case study and the positive responses received demonstrate the proposed assessment protocol in use and provide evidence of its potential effectiveness.



Table 9. Application of safety technology adoption protocol worksheet

Number | Predictor | Statement | Rating (1–5)

Category 1: Technology (group score = 38 of 45; category coefficient = 0.346)
1 | Complexity | The technology is easy to use | 4.4
2 | Required feature | The technology has the essential feature(s) to perform the specified task | 3.7
3 | Required support | Installing and using the technology requires little or no support | 4.4
4 | Available support | If required, technical support is readily available | 5.0
5 | Training required | Little or no additional training is required | 4.7
6 | Durability | Available evidence indicates that this technology has a long shelf life | 3.4
7 | Technology effectiveness | There is documented evidence showing the technology is effective | 4.0
8 | Technology reliability | There is evidence showing that the technology consistently meets performance requirements | 4.0
9 | Technology versatility | The technology features improve other performance indicators (e.g., work quality) | 4.3

Category 2: Organizational (group score = 33.7 of 40; category coefficient = 0.327)
1 | Cost savings | This technology has the potential to reduce project cost | 4.0
2 | Peer influence | Workers will have no hesitation using the technology | 2.6
3 | Top management | Management supports the use of this technology | 4.6
4 | Compatibility | This technology is interoperable with existing technologies and will fit seamlessly into our current operations | 4.1
5 | Organizational culture | The company employees are open to trying new technologies | 3.7
6 | Competitive advantage | This technology has a positive impact on the company’s core competency | 4.7
7 | Technology observability | The performance of this technology can be observed before investing | 5.0
8 | Technology triability | This technology can be tried prior to purchase | 5.0

Category 3: External (group score = 6.6 of 10; category coefficient = 0.327)
1 | External regulation | The technology will help meet stipulated regulatory requirements (insurance, OSHA) | 3.0
2 | Client demand | Using this technology will help meet a client’s need | 3.6

Adoption prediction score (C-STAI): 26.32
% Relative to maximum score: 82.5%

The outcome of the case study assessment itself is provided in Table 9. According to the results, the smart vest WSD scored highest in the Technology category (84.4%), suggesting the smart vest has strong technological attributes, is relatively easy to use, and is durable. The Adoption Prediction Score for the smart vest safety technology was 26.32 points. This value represents 82.5% of the maximum obtainable score (31.92 points), which is above the 65% threshold, suggesting that the evaluated smart vest technology could be implemented effectively if a decision to adopt it is made.
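To make the arithmetic behind this result explicit, the following Python sketch recomputes the adoption prediction score from the category sub-scores in Table 9 and the category coefficients in Table 7. The function and variable names are illustrative only and are not part of the published tool; small differences from the reported 26.32 points arise from rounding of the sub-scores and coefficients.

```python
# Illustrative roll-up of the C-STAI, assuming the category coefficients and
# maximum group scores listed in Table 7 and the smart vest sub-scores in Table 9.
CATEGORIES = {
    # name: (category coefficient, maximum group score)
    "technology":     (0.346, 45),
    "organizational": (0.327, 40),
    "external":       (0.327, 10),
}

# Smart vest case study sub-scores (sums of the 1-5 predictor ratings in Table 9).
smart_vest = {"technology": 38.0, "organizational": 33.7, "external": 6.6}


def c_stai(sub_scores: dict) -> tuple:
    """Return the weighted adoption prediction score and its percentage of the
    maximum obtainable score (31.92 points)."""
    score = sum(CATEGORIES[cat][0] * value for cat, value in sub_scores.items())
    max_score = sum(coeff * group_max for coeff, group_max in CATEGORIES.values())
    return score, 100 * score / max_score


score, pct = c_stai(smart_vest)
print(f"C-STAI = {score:.1f} points ({pct:.1f}% of maximum)")  # about 26.3 points, 82.5%
print("Above the 65% adoption threshold" if pct >= 65 else "Below the 65% threshold")
```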



Implications for Technology Integration Researchers and Practitioners

The findings from the present study contribute to practice and research in the following ways. First, the study identified and validated factors influencing safety technology adoption in construction. The identified predictors can be used to predict the acceptance and effectiveness of a safety technology in a particular work environment in an organization. Furthermore, researchers involved in the development of new safety technologies, or the adaptation of existing ones, can utilize the predictors and the C-STAI as a guide to increase the odds of a successful research-to-practice transition. The odds of adoption increase if researchers and practitioners ensure that a proposed safety technology meets the adoption criteria discussed in this study.

From a technology management perspective, findings from the study reveal the importance of technology attributes, organizational attributes, and external factors when deciding whether or not to adopt a technology for safety management. To achieve a successful technology adoption exercise, it is essential that verifiable information on the technology of choice is readily available to the workers tasked with using it. The information should include data on the reliability, effectiveness, and durability of the technology. To encourage use of the technology, management should provide the required training and support a reliable system for technology adoption and application while promoting a positive culture that supports innovation and the use of technology. Moreover, top management should strongly accentuate, demonstrate, and communicate the technology’s usefulness (e.g., features, versatility) in preventing the workplace safety risks that workers are typically exposed to.

The current study is one of the first studies to utilize a flexible decision-support methodology that encourages collaboration and extensive discussion among the stakeholders involved in integrating a safety technology into the construction process. It is expected that by utilizing such a decision-making process, the adoption of appropriate technologies that enhance worker safety within a specific context will be achieved.

Although further validation and application of the proposed C-STAI and assessment protocol are needed, these developed tools provide rich information that can influence technology adoption decision-making in construction. Individuals involved in technology integration can use the current or an adapted version of the proposed C-STAI and assessment protocol as a tool to optimize safety technology adoption decision-making. The proposed C-STAI and assessment protocol can be used in the adoption phase of safety technology integration to evaluate the usefulness of a technology and the acceptance readiness of an organization (and the individuals working in that organization) to adopt a specific safety technology. Expanding on a recent study (Nnaji et al. 2019b), information derived from the present study could inform the development of a novel agent-based simulation model for assessing the potential adoption success of a technology prior to investing extensively in a new safety technology. By incorporating the C-STAI into a simulation model, practitioners and researchers can simulate the technology adoption process, identify and manipulate conditions that spur adoption, and develop policies and programs that support the successful adoption of a given safety technology.
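The agent-based simulation model itself is the subject of separate work (Nnaji et al. 2019b) and is not specified here. Purely as an illustration of how a C-STAI result could seed such a model, the sketch below treats each worker as an agent whose chance of adopting in a given period grows with the C-STAI percentage and with the share of peers who have already adopted; all parameter values and names are hypothetical.

```python
import random

def simulate_adoption(c_stai_pct: float, n_workers: int = 50, n_periods: int = 12,
                      peer_weight: float = 0.3, seed: int = 1) -> list:
    """Toy diffusion sketch: each period, a non-adopter adopts with a probability
    driven by the (hypothetically scaled) C-STAI percentage plus peer influence."""
    rng = random.Random(seed)
    adopted = [False] * n_workers
    history = []
    base_p = 0.2 * (c_stai_pct / 100)  # hypothetical scaling, not an estimated parameter
    for _ in range(n_periods):
        peer_share = sum(adopted) / n_workers
        p = min(1.0, base_p + peer_weight * peer_share)
        adopted = [a or rng.random() < p for a in adopted]
        history.append(sum(adopted))
    return history

# Compare cumulative adopters for a strong (82.5%) versus a weaker (50%) C-STAI result.
print(simulate_adoption(82.5))
print(simulate_adoption(50.0))
```

Conditions that spur adoption (e.g., peer influence or training policies) can then be explored by varying the corresponding parameters.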
Following the assessment, an organization can make a decision to continue investing in the evaluated technology or to suspend the integration process and focus on creating an enabling environment that will spur future acceptance. The enabling environment could take the form of a strategic plan focused on prioritizing the critical predictors that influence technology acceptance and determining the level of improvement needed for specific critical predictors. This strategic plan could lead to an improvement plan that could be utilized when integrating technologies with similar characteristics. If used extensively by a single organization, or in collaboration with other organizations with similar characteristics (e.g., size and target market), the predictive strength of the assessment protocol would be established. The predictive strength could inform the development of a benchmarking tool for the acceptance phase of technology integration (comparing the rate of diffusion of a current technology against similar technologies integrated in the past). However, the proposed tool also provides an objective method for management to truncate the adoption process if the indicators from the initial assessment are not encouraging.

Although primarily designed to support decision-making when analyzing the utility of a single technology, the C-STAI and assessment process also provide a quantitative reference for selecting an appropriate technology for a specific task from among multiple technologies. By rating each technology based on the predictors and applying the appropriate weight for each category, decision-makers could choose the safety technology with the highest aggregate point value.
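As a concrete illustration of that comparison, the snippet below applies the same category coefficients to several candidate technologies and ranks them by aggregate score; the candidate names and sub-scores are hypothetical and serve only to show the calculation.

```python
# Hypothetical comparison of candidate safety technologies using the
# category coefficients from Table 7 (technology, organizational, external).
COEFFICIENTS = {"technology": 0.346, "organizational": 0.327, "external": 0.327}

candidates = {
    # candidate name: category sub-scores (sums of the 1-5 predictor ratings)
    "smart vest WSD":   {"technology": 38.0, "organizational": 33.7, "external": 6.6},
    "proximity sensor": {"technology": 35.0, "organizational": 30.5, "external": 8.0},
    "wearable EEG":     {"technology": 30.0, "organizational": 25.0, "external": 5.0},
}

def aggregate_score(sub_scores: dict) -> float:
    return sum(COEFFICIENTS[cat] * value for cat, value in sub_scores.items())

ranked = sorted(candidates.items(), key=lambda item: aggregate_score(item[1]), reverse=True)
for name, sub_scores in ranked:
    print(f"{name}: {aggregate_score(sub_scores):.2f} points")
# The candidate with the highest aggregate point value would be preferred.
```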
Following the adoption of a technology, it is expected that an organization will want to verify the success of the adoption process (referred to as infusion) and the success resulting from the adoption (referred to as diffusion) (Kishore and McLean 1998). Presently, there is no conceptual model that clearly describes how researchers and practitioners can measure technology implementation success within the safety management field. Given that technology integration has multiple phases and that success is measured as infusion and diffusion, an integrated conceptual implementation success model that incorporates constructs on individual infusion and organizational diffusion should be developed. The findings from the present study (particularly the quantification and categorization of adoption factors) provide the fundamental information required to develop the first phase of a novel implementation success model for safety technologies in construction management.

Conclusions and Future Research

The current study represents the first empirical effort focused on enhancing our understanding of the critical factors that influence the adoption of construction safety technologies and incorporating them into a robust, yet flexible, adoption framework that supports technology integration. Utilizing a rigorous process, a practical and novel approach to evaluating the potential adoption of safety technologies by end-users was developed.

First, the present study identified factors, based on past research, that can be used to predict the adoption of a technology. Through statistical analysis, the predictive strengths of the identified predictors were verified, and the predictors were then classified into three categories (external, organizational, and technical). Subsequently, weights were generated for each predictor and category, followed by the development of the C-STAI. The C-STAI was then integrated into an assessment protocol for evaluating the safety technology adoption propensity of an organization. The utility of the index and assessment protocol was also tested.

However, the decision-support tools developed have limitations. For instance, the predictive power of the assessment protocol was not validated in the present study. Future research should consider adopting a precursor analysis approach supported by a general linear model (GLM) to validate the predictive power of the assessment protocol. In the present study, the assessment protocol was designed based on the 19 critical predictors; in reality, additional predictors could be added in the different categories depending on the type of safety technology. Although most of the study participants were extensively experienced and highly involved in decision-making within their companies, they were not explicitly asked about their level of knowledge regarding the broad spectrum of safety technologies or how often they used these technologies as part of their job responsibilities. However, the coefficients generated in this research are flexible and can be updated using the novel methodology described in the present study. Therefore, organizations could develop in-house estimates using responses received from employees who have prior experience using a specific safety technology. This information would provide a more accurate result for that particular organization. Nevertheless, the research presented in this paper represents an important step toward developing comprehensive but easy-to-use tools that support sound technology adoption decision-making in the construction industry.
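One possible form of the GLM-based validation suggested above would relate recorded C-STAI assessments to whether the assessed technologies were ultimately adopted, for example with a binomial GLM (logistic link). The sketch below is illustrative only: the inputs are placeholder values, statsmodels is just one convenient implementation, and the eventual precursor analysis design remains future work.

```python
# Sketch of a possible validation analysis: regress observed adoption outcomes
# on C-STAI percentages with a binomial GLM. The numbers below are illustrative
# placeholders, not data from the study.
import numpy as np
import statsmodels.api as sm

c_stai_pct = np.array([45, 55, 60, 68, 72, 80, 83, 90], dtype=float)
adopted = np.array([0, 0, 1, 0, 1, 1, 1, 1])  # 1 = technology was ultimately adopted

X = sm.add_constant(c_stai_pct)                 # intercept + C-STAI predictor
model = sm.GLM(adopted, X, family=sm.families.Binomial())
result = model.fit()
print(result.summary())                         # coefficient on C-STAI indicates predictive power
print(result.predict([[1.0, 65.0]]))            # predicted adoption probability at the 65% threshold
```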

Appendix. Safety Technology Adoption Predictors Identified in the Literature


References
Potential safety technology adoption predictors a b c d e f g h i j k l
Capital cost of technology (initial purchasing cost) — — — — — — — — — — — X
Client demand X — — — — X X — — X X X
Competitive advantage X — — X — — — — — — X X
Direct competitors adopting similar technology — — — — — X X X — — — X
Government policy and regulations regarding technology — — — — — X X X — — — X

Industry-level change requires technology adoption — — — — — X X X — — — X
Level of compatibility with current processes (interoperable) X X X — — — X — X — — X
Level of complexity of technology X X — — — X X X X — — X
Level of technical support available by manufacturer X — — — — X — — X — — X
Level of technical support required for optimum performance — — — — — — — — — — — X
Level of training required for optimum performance X — — — X X X — X — — X
Observability (end-user can observe performance prior to adoption) — — — X — — — X — — — X
Organization culture (receptive to change or not) — — — — — — X — — X X X
Partners/close associates adopt similar technology — — — — — — — — — — — X


Peer influence (how quickly users will be able to influence colleagues) X — — X — — — — — — — X
Potential cost savings from using technology [positive Return on Investment (ROI)] X — — — X X — — X X X X
Potential level of resistance from employees toward technology adoption X — — — — — — — — — — X
Proven technology effectiveness (technical attributes meeting the performance requirements) — — X — — — — X X — — X
Technical attributes and features (features needed for a given task) X — X — — — X — X — — X
Technology brand and reputation in the market — — — X — — — — X — — X
Technology budget (maximum funds allocated toward innovation in an organization) — — — — — — — — — — — X
Technology durability [technology lifespan (shelf life, battery life, etc.)] — — X X — — — X X — — X
Technology reliability (technology consistently meets performance requirements) — — X X — — — X X — — X
Top management degree of involvement (championing or opposing technology adoption) X X — — X — X X X — X X
Triability (user can try technology prior to adopting) — — — X — — — X — — — X
Versatility (technology can be utilized for more than one task) — — X X — — — X X — — X
Source: Adapted from Nnaji et al. (2019a).
Note: a = Peansupap and Walker (2005); b = Sargent et al. (2012); c = Nnaji et al. (2018); d = Sepasgozar et al. (2016); e = Won et al. (2013); f = Abubakar et al. (2014); g = Mitropoulos and Tatum (2000); h = Rankin and Luther (2006); i = Sepasgozar and Bernold (2013); j = Ozorhon and Karahan (2017); k = Gambatese and Hallowell (2011); and l = Nnaji et al. (2019a).

Data Availability Statement

Data generated or analyzed during the study are available from the corresponding author by request. Information about the Journal’s data-sharing policy can be found here: http://ascelibrary.org/doi/10.1061/(ASCE)CO.1943-7862.0001263.

References

Abubakar, M., Y. M. Ibrahim, D. Kado, and K. Bala. 2014. “Contractors’ perception of the factors affecting building information modelling (BIM) adoption in the Nigerian construction industry.” In Proc., 15th Int. Conf. on Computing in Civil and Building Engineering, 23–25. Reston, VA: ASCE.
Alexander, D., M. Hallowell, and J. Gambatese. 2017. “Precursors of construction fatalities. II: Predictive modeling and empirical validation.” J. Constr. Eng. Manage. 143 (7): 04017024. https://doi.org/10.1061/(ASCE)CO.1943-7862.0001297.
Alizadehsalehi, S., I. Yitmen, T. Celik, and D. Arditi. 2018. “The effectiveness of an integrated BIM/UAV model in managing safety on construction sites.” Int. J. Occup. Saf. Ergon. https://doi.org/10.1080/10803548.2018.1504487.
Ameyaw, E. E., and A. P. Chan. 2015. “Risk allocation in public-private partnership water supply projects in Ghana.” Constr. Manage. Econ. 33 (3): 187–208. https://doi.org/10.1080/01446193.2015.1031148.
Antwi-Afari, M. F., H. Li, Y. Yu, and L. Kong. 2018. “Wearable insole pressure system for automated detection and classification of awkward working postures in construction workers.” Autom. Constr. 96 (Dec): 433–441. https://doi.org/10.1016/j.autcon.2018.10.004.
Awolusi, I., E. Marks, and M. Hallowell. 2018. “Wearable technology for personalized construction safety monitoring and trending: Review of applicable devices.” Autom. Constr. 85 (Jan): 96–106. https://doi.org/10.1016/j.autcon.2017.10.010.
Awolusi, I., and E. D. Marks. 2019. “Active work zone safety: Preventing accidents using intrusion sensing technologies.” Front. Built Environ. 5: e21. https://doi.org/10.3389/fbuil.2019.00021.
Baker, J. 2012. “The technology–organization–environment framework.” In Information systems theory, 231–245. New York: Springer.
Barbosa, F., J. Woetzel, J. Mischke, M. J. Ribeirinho, M. Sridhar, M. Parsons, N. Bertram, and S. Brown. 2017. “Reinventing construction through a productivity revolution.” Accessed December 12, 2018. https://goo.gl/1Nqqf8.
Borhani, A. 2016. “Individual and organizational influencing technology adoption for construction safety.” Master thesis, Construction Management, Univ. of Washington.
Chan, A. P., P. T. Lam, D. W. Chan, E. Cheung, and Y. Ke. 2010. “Critical success factors for PPPs in infrastructure developments: Chinese perspective.” J. Constr. Eng. Manage. 136 (5): 484–494. https://doi.org/10.1061/(ASCE)CO.1943-7862.0000152.
Chan, A. P. C., P. T. I. Lam, Y. Wen, E. E. Ameyaw, S. Wang, and Y. Ke. 2015. “Cross-sectional analysis of critical risk factors for PPP water projects in China.” J. Infrastruct. Syst. 21 (1): 04014031. https://doi.org/10.1061/(ASCE)IS.1943-555X.0000214.
Chen, L., and K. Manley. 2014. “Validation of an instrument to measure governance and performance on collaborative infrastructure projects.” J. Constr. Eng. Manage. 140 (5): 04014006. https://doi.org/10.1061/(ASCE)CO.1943-7862.0000834.
Cho, Y. K., K. Kim, S. Ma, and J. Ueda. 2018. “A robotic wearable exoskeleton for construction worker’s safety and health.” In ASCE Construction Research Congress 2018, 19–28. Reston, VA: ASCE.
Choi, B., S. Hwang, and S. Lee. 2017. “What drives construction workers’ acceptance of wearable technologies in the workplace?: Indoor localization and wearable health devices for occupational safety and health.” Autom. Constr. 84 (Dec): 31–41. https://doi.org/10.1016/j.autcon.2017.08.005.
Choudhry, R. M. 2017. “Achieving safety and productivity in construction projects.” J. Civ. Eng. Manage. 23 (2): 311–318. https://doi.org/10.3846/13923730.2015.1068842.



Chuttur, M. Y. 2009. “Overview of the technology acceptance model: Origins, developments and future directions.” Working Pap. Inf. Syst. 9 (37): 9–37.
Gambatese, J. A., and M. Hallowell. 2011. “Enabling and measuring innovation in the construction industry.” Constr. Manage. Econ. 29 (6): 553–567. https://doi.org/10.1080/01446193.2011.570357.
Gheisari, M., and B. Esmaeili. 2019. “Applications and requirements of unmanned aerial systems (UASs) for construction safety.” Saf. Sci. 118 (Oct): 230–240. https://doi.org/10.1016/j.ssci.2019.05.015.
Gheisari, M., J. Irizarry, and B. N. Walker. 2014. “UAS4SAFETY: The potential of unmanned aerial systems for construction safety applications.” In ASCE Construction Research Congress 2014: Construction in a Global Network, 1801–1810. Reston, VA: ASCE.
Gheisari, M., A. Rashidi, and B. Esmaeili. 2018. “Using unmanned aerial systems for automated fall hazard monitoring.” In Proc., ASCE Construction Research Congress, 62–72. Reston, VA: ASCE.
Goodrum, P. M., C. T. Haas, C. Caldas, D. Zhai, J. Yeiser, and D. Homm. 2011. “Model to predict the impact of a technology on construction productivity.” J. Constr. Eng. Manage. 137 (9): 678–688. https://doi.org/10.1061/(ASCE)CO.1943-7862.0000328.
Grover, V., and M. D. Goslar. 1993. “The initiation, adoption, and implementation of telecommunications technologies in US organizations.” J. Manage. Inf. Syst. 10 (1): 141–164. https://doi.org/10.1080/07421222.1993.11517994.
Guo, H., Y. Yu, T. Xiang, H. Li, and D. Zhang. 2017. “The availability of wearable-device-based physical data for the measurement of construction workers’ psychological status on site: From the perspective of safety management.” Autom. Constr. 82 (Oct): 207–217. https://doi.org/10.1016/j.autcon.2017.06.001.
Hair, J. F., R. E. Anderson, R. L. Tatham, and W. C. Black. 1998. Multivariate data analysis. Upper Saddle River, NJ: Pearson Prentice Hall.
Hair, J. F., W. C. Black, and B. J. Babin. 2010. Multivariate data analysis. Upper Saddle River, NJ: Pearson Prentice Hall.
Haymaker, J., D. H. Chau, and B. Xie. 2013. “Inference-assisted choosing by advantages.” In Proc., 21st Annual Conf. of the Int. Group for Lean Construction (IGLC), 339–348. Fortaleza, Brazil.
Hu, Y., A. P. Chan, Y. Le, Y. Xu, and M. Shan. 2016. “Developing a program organization performance index for delivering construction megaprojects in China: Fuzzy synthetic evaluation analysis.” J. Manage. Eng. 32 (4): 05016007. https://doi.org/10.1061/(ASCE)ME.1943-5479.0000432.
Hwang, S., H. Jebelli, B. Choi, M. Choi, and S. Lee. 2018. “Measuring workers’ emotional state during construction tasks using wearable EEG.” J. Constr. Eng. Manage. 144 (7): 04018050. https://doi.org/10.1061/(ASCE)CO.1943-7862.0001506.
Hwang, S., and S. Lee. 2017. “Wristband-type wearable health devices to measure construction workers’ physical demands.” Autom. Constr. 83 (Nov): 330–340. https://doi.org/10.1016/j.autcon.2017.06.003.
Irizarry, J., M. Gheisari, and B. N. Walker. 2012. “Usability assessment of drone technology as safety inspection tools.” J. Inf. Technol. Constr. 17 (12): 194–212.
Jebelli, H., C. R. Ahn, and T. L. Stentz. 2015. “Comprehensive fall-risk assessment of construction workers using inertial measurement units: Validation of the gait-stability metric to assess the fall risk of iron workers.” J. Comput. Civ. Eng. 30 (3): 04015034. https://doi.org/10.1061/(ASCE)CP.1943-5487.0000511.
Jebelli, H., S. Hwang, and S. Lee. 2018a. “EEG-based workers’ stress recognition at construction sites.” Autom. Constr. 93 (Sep): 315–324. https://doi.org/10.1016/j.autcon.2018.05.027.
Jebelli, H., S. Hwang, and S. Lee. 2018b. “EEG signal-processing framework to obtain high-quality brain waves from an off-the-shelf wearable EEG device.” J. Comput. Civ. Eng. 32 (1): 04017070. https://doi.org/10.1061/(ASCE)CP.1943-5487.0000719.
Jebelli, H., S. Hwang, and S. Lee. 2018c. “Feasibility of field measurement of construction workers’ valence using a wearable EEG device.” J. Comput. Civ. Eng. 32 (1): 04017070. https://doi.org/10.1061/9780784480830.013.
Jebelli, H., and S. Lee. 2019. “Feasibility of wearable electromyography (EMG) to assess construction workers’ muscle fatigue.” In Advances in informatics and computing in civil and construction engineering, 181–187. Cham, Switzerland: Springer.
Kanan, R., O. Elhassan, and R. Bensalem. 2018. “An IoT-based autonomous system for workers’ safety in construction sites with real-time alarming, monitoring, and positioning strategies.” Autom. Constr. 88 (Apr): 73–86. https://doi.org/10.1016/j.autcon.2017.12.033.
Karakhan, A., and O. Alsaffar. 2019. “Technology’s role.” Prof. Saf. 64 (1): 43–45.
Karakhan, A., Y. Xu, C. Nnaji, and O. Alsaffar. 2019. “Technology alternatives for workplace safety risk mitigation in construction: Exploratory study.” In Advances in informatics and computing in civil and construction engineering, 823–829. Cham, Switzerland: Springer.
Karakhan, A. A., S. Rajendran, J. Gambatese, and C. Nnaji. 2018. “Measuring and evaluating safety maturity of construction contractors: Multicriteria decision-making approach.” J. Constr. Eng. Manage. 144 (7): 04018054. https://doi.org/10.1061/(ASCE)CO.1943-7862.0001503.
Kim, K., H. Kim, and H. Kim. 2017. “Image-based construction hazard avoidance system using augmented reality in wearable device.” Autom. Constr. 83 (Nov): 390–403. https://doi.org/10.1016/j.autcon.2017.06.014.
Kishore, R., and E. McLean. 1998. “Diffusion and infusion: Two dimensions of ‘Success of Adoption’ of IS innovations.” In Proc., Americas Conference on Information Systems 1998. Atlanta: Association for Information Systems.
Lee, S., and J. Yu. 2016. “Comparative study of BIM acceptance between Korea and the United States.” J. Constr. Eng. Manage. 142 (3): 05015016. https://doi.org/10.1061/(ASCE)CO.1943-7862.0001076.
Lee, S., J. Yu, and D. Jeong. 2015. “BIM acceptance model in construction organizations.” J. Manage. Eng. 31 (3): 04014048. https://doi.org/10.1061/(ASCE)ME.1943-5479.0000252.
Li, X., W. Yi, H. L. Chi, X. Wang, and A. P. Chan. 2018. “A critical review of virtual and augmented reality (VR/AR) applications in construction safety.” Autom. Constr. 86 (Feb): 150–162. https://doi.org/10.1016/j.autcon.2017.11.003.
Lingard, H. C., and S. Rowlinson. 2006. Sample size in factor analysis: Why size matters. Hong Kong: Univ. of Hong Kong.
Martinez-Aires, M. D., M. Lopez-Alonso, and M. Martinez-Rojas. 2018. “Building information modeling and safety management: A systematic review.” Saf. Sci. 101 (Jan): 11–18.
Mathieson, K., E. Peacock, and W. W. Chin. 2001. “Extending the technology acceptance model: The influence of perceived user resources.” ACM SIGMIS Database: DATABASE Adv. Inf. Syst. 32 (3): 86–112. https://doi.org/10.1145/506724.506730.
Mitropoulos, P., and C. B. Tatum. 2000. “Management-driven integration.” J. Manage. Eng. 16 (1): 48–58. https://doi.org/10.1061/(ASCE)0742-597X(2000)16:1(48).
Mu, S., H. Cheng, M. Chohr, and W. Peng. 2014. “Assessing risk management capability of contractors in subway projects in mainland China.” Int. J. Project Manage. 32 (3): 452–460. https://doi.org/10.1016/j.ijproman.2013.08.007.
Nnaji, C., J. Gambatese, A. Karakhan, and C. Eseonu. 2019a. “Influential safety technology adoption predictors in construction.” Eng. Constr. Archit. Manage. 26 (11): 2655–2681. https://doi.org/10.1108/ECAM-09-2018-0381.
Nnaji, C., H. W. Lee, A. Karakhan, and J. Gambatese. 2018. “Developing a decision-making framework to select safety technologies for highway construction.” J. Constr. Eng. Manage. 144 (4): 04018016. https://doi.org/10.1061/(ASCE)CO.1943-7862.0001466.
Nnaji, C., I. Okpala, and S. Kim. 2019b. “A simulation framework for technology adoption decision making in construction management: A composite model.” In Computing in civil engineering 2019: Visualization, information modeling, and simulation, 499–506. Reston, VA: ASCE.
Norusis, M. 2008. SPSS 16.0 advanced statistical procedures companion. Upper Saddle River, NJ: Prentice Hall Press.
NRC (National Research Council). 2009. Advancing the competitiveness and efficiency of the US construction industry. Washington, DC: National Academies Press.
Nunnally, J. 1978. Psychometric methods. New York: McGraw-Hill.



Osei-Kyei, R., and A. P. Chan. 2017. “Developing a project success index for public–private partnership projects in developing countries.” J. Infrastruct. Syst. 23 (4): 04017028. https://doi.org/10.1061/(ASCE)IS.1943-555X.0000388.
Ozorhon, B., and U. Karahan. 2017. “Critical success factors of building information modeling implementation.” J. Manage. Eng. 33 (3): 04016054. https://doi.org/10.1061/(ASCE)ME.1943-5479.0000505.
Pallant, J. 2001. SPSS survival manual: A step by step guide to data analysis using SPSS for Windows (versions 10 and 11): SPSS student version 11.0 for Windows. Milton Keynes, UK: Open University Press.
Pallant, J. 2007. SPSS survival manual: A step by step guide to data analysis for windows. 3rd ed. Maidenhead, UK: Open Univ.
Park, M. W., and I. Brilakis. 2012. “Construction worker detection in video frames for initializing vision trackers.” Autom. Constr. 28 (15): 15–25. https://doi.org/10.1016/j.autcon.2012.06.001.
Park, Y., H. Son, and C. Kim. 2012. “Investigating the determinants of construction professionals’ acceptance of web-based training: An extension of the technology acceptance model.” Autom. Constr. 22 (Mar): 377–386. https://doi.org/10.1016/j.autcon.2011.09.016.
Peansupap, V., and D. Walker. 2005. “Factors affecting ICT diffusion: A case study of three large Australian construction contractors.” Eng. Constr. Archit. Manage. 12 (1): 21–37. https://doi.org/10.1108/09699980510576871.
Pedrycz, W., P. Ekel, and R. Parreiras. 2011. Fuzzy multicriteria decision-making: Models, methods and applications. Chichester, UK: Wiley.
Perlman, A., R. Sacks, and R. Barak. 2014. “Hazard recognition and risk perception in construction.” Saf. Sci. 64 (Apr): 22–31. https://doi.org/10.1016/j.ssci.2013.11.019.
Qualtrics. 2018. “Unlock breakthrough insights with market research panels.” Accessed January 7, 2020. https://www.qualtrics.com/online-sample.
Rankin, J., and R. Luther. 2006. “The innovation process: Adoption of information and communication technology for the construction industry.” Can. J. Civ. Eng. 33 (12): 1538–1546. https://doi.org/10.1139/l05-128.
Rogers, E. M. 2003. “Elements of diffusion.” In Vol. 5 of Diffusion of innovations. New York: Simon and Schuster.
Ryu, J., J. Seo, H. Jebelli, and S. Lee. 2019. “Automated action recognition using an accelerometer-embedded wristband-type activity tracker.” J. Constr. Eng. Manage. 145 (1): 04018114. https://doi.org/10.1061/(ASCE)CO.1943-7862.0001579.
Sargent, K., P. Hyland, and S. Sawang. 2012. “Factors influencing the adoption of information technology in a construction business.” Australas. J. Constr. Econ. Build. 12 (2): 72.
Sepasgozar, S. M., and L. E. Bernold. 2013. “Factors influencing construction technology adoption.” In Proc., 19th CIB World Building Congress. Brisbane, Australia: Brisbane Convention Centre.
Sepasgozar, S. M., M. Loosemore, and S. R. Davis. 2016. “Conceptualising information and equipment technology adoption in construction: A critical review of existing research.” Eng. Constr. Archit. Manage. 23 (2): 158–176. https://doi.org/10.1108/ECAM-05-2015-0083.
Şerban, G., I. Rus, D. Vele, P. Breţcan, M. Alexe, and D. Petrea. 2016. “Flood-prone area delimitation using UAV technology, in the areas hard-to-reach for classic aircrafts: Case study in the north-east of Apuseni Mountains, Transylvania.” Nat. Hazard. 82 (3): 1817–1832. https://doi.org/10.1007/s11069-016-2266-4.
SmartMarket. 2017. Safety management in the construction industry 2017. SmartMarket Report. Bedford, MA: Dodge Data and Analytics.
Son, H., Y. Park, C. Kim, and J. S. Chou. 2012. “Toward an understanding of construction professionals’ acceptance of mobile computing devices in South Korea: An extension of the technology acceptance model.” Autom. Constr. 28 (Dec): 82–90. https://doi.org/10.1016/j.autcon.2012.07.002.
Suhr, J. 1999. The choosing by advantages decisionmaking system. Westport, CT: Quorum.
Szymberski, R. T. 1997. “Construction project safety planning.” Tappi J. 80 (11): 69–74.
Tang, S., D. R. Shelden, C. M. Eastman, P. Pishdad-Bozorgi, and X. Gao. 2019. “A review of building information modeling (BIM) and the internet of things (IoT) devices integration: Present status and future trends.” Autom. Constr. 101 (May): 127–139. https://doi.org/10.1016/j.autcon.2019.01.020.
Venkatesh, V., and F. D. Davis. 2000. “A theoretical extension of the technology acceptance model: Four longitudinal field studies.” Manage. Sci. 46 (2): 186–204. https://doi.org/10.1287/mnsc.46.2.186.11926.
Wang, C., L. Ikuma, J. Hondzinski, and M. de Queiroz. 2017. “Application of assistive wearable robotics to alleviate construction workforce shortage: Challenges and opportunities.” In ASCE computing in civil engineering 2017, 358–365. Reston, VA: ASCE.
Won, J., G. Lee, C. Dossick, and J. Messner. 2013. “Where to focus for successful adoption of building information modeling within organization.” J. Constr. Eng. Manage. 139 (11): 04013014. https://doi.org/10.1061/(ASCE)CO.1943-7862.0000731.
Xu, Y., A. P. C. Chan, and J. F. Yeung. 2010. “Developing a fuzzy risk allocation model for PPP projects in China.” J. Constr. Eng. Manage. 136 (8): 894–903. https://doi.org/10.1061/(ASCE)CO.1943-7862.0000189.
Yang, Z., Y. Yuan, M. Zhang, X. Zhao, and B. Tian. 2019. “Assessment of construction workers’ labor intensity based on wearable smartphone system.” J. Constr. Eng. Manage. 145 (7): 04019039. https://doi.org/10.1061/(ASCE)CO.1943-7862.0001666.
Zhang, H., X. Yan, H. Li, R. Jin, and H. Fu. 2019a. “Real-time alarming, monitoring, and locating for non-hard-hat use in construction.” J. Constr. Eng. Manage. 145 (3): 04019006. https://doi.org/10.1061/(ASCE)CO.1943-7862.0001629.
Zhang, Y., H. Luo, M. Skitmore, Q. Li, and B. Zhong. 2019b. “Optimal camera placement for monitoring safety in metro station construction work.” J. Constr. Eng. Manage. 145 (1): 1–13. https://doi.org/10.1061/(ASCE)CO.1943-7862.0001584.
Zhou, S., and M. Gheisari. 2018. “Unmanned aerial system applications in construction: A systematic review.” Constr. Innovation 18 (4): 453–468. https://doi.org/10.1108/CI-02-2018-0010.
Zhou, Z., J. Irizarry, and Q. Li. 2013. “Applying advanced technology to improve safety management in the construction industry: A literature review.” Constr. Manage. Econ. 31 (6): 606–622. https://doi.org/10.1080/01446193.2013.798423.
Zhu, Z., X. Ren, and Z. Chen. 2017. “Integrated detection and tracking of workforce and equipment from construction jobsite videos.” Autom. Constr. 81 (9): 161–171. https://doi.org/10.1016/j.autcon.2017.05.005.
