

A measurement model of building information modelling maturity

Yunfeng Chen
Building Construction Management, Purdue University, West Lafayette, Indiana, USA

Hazar Dib
Building Construction Management & Computer Graphics Technology, Purdue University, West Lafayette, Indiana, USA, and

Robert F. Cox
Building Construction Management, Purdue University, West Lafayette, Indiana, USA

Abstract
Purpose – There is a growing requirement for a rating system of building information modelling maturity (BIMM) to compare the effectiveness of modelling processes in construction projects. The literature related to BIMM contains theoretical proposals and descriptions of maturity models. However, these research efforts are limited and lack substantial theoretical and empirical justification. This paper is a unique attempt to integrate previous models by performing an empirical investigation of key factors for measuring BIMM in construction projects. The paper aims to discuss these issues.
Design/methodology/approach – A national survey was designed to extract the perception of
124 BIM-related practitioners and academicians about the conceptual model. Then, exploratory and
confirmatory factor analyses were employed to identify and test the key factors underlying the
27 areas.
Findings – A principal component factor analysis of the collected data suggested a five-factor model, which explained 69.839 per cent of the variance. The construct validity of the model was further tested by confirmatory factor analysis. The results indicated that all factors were important in measuring BIMM; however, compared with the factors of technology and people, more emphasis was put on the factors of process and information.
Originality/value – The key value of the paper is to increase the understanding of the multi-dimensional nature of BIMM through empirical evidence and to provide practitioners and researchers with insight regarding the particular emphasis on the factors related to modelling process and information.
Keywords Quality management, Factor analysis, Process improvement, Maturity model, BIM,
Construction management
Paper type Research paper

Introduction
Building information modelling (BIM) has been increasingly adopted and implemented in the architectural, engineering and construction (AEC) industry since its inception in the 1970s, with many professionals expecting it to transform how this business operates (Eastman et al., 2011; Hardin, 2009; Succar, 2010; Wong et al., 2011).

The authors would like to thank the ConstrucTech Magazine and the buildingSMART alliance for their support in distributing the questionnaire of this research.
The 2009 SmartMarket Report showed that 48 per cent of the industry claimed to be employing BIM or BIM-related tools, which represented a 75 per cent growth in BIM usage compared to 2007 (McGraw-Hill, 2009). Among those users, some may just employ a few copies of BIM-related software occasionally in projects where it is dictated by owners or contracts; some may facilitate BIM implementation in a small set of projects by equipping staff and infrastructure; others might launch BIM on the majority of their projects and focus on developing and optimising internal and external collaborative BIM processes. It is obvious that the extent to which BIM is "explicitly defined, managed, integrated and optimised", which refers to BIM maturity (BIMM), differs among the users. So while everyone brands themselves as BIM-able, how can one tell the difference? A reliable and valid benchmark of BIMM is needed for meaningful evaluation and comparison (Chen et al., 2012; NIBS, 2007; Smith and Tardif, 2009; Succar, 2010).
The maturity model specifies the key areas and characteristics a process must possess
to qualify as a process of some maturity (Jalote, 2000; Paulk et al., 1995). It originated in the
field of quality management (Cooke-Davies, 2002). Its underlying premise is that the
quality of a final product depends largely on the process used to create it (Paulk et al.,
1995). It was expected that as the visibility into the process increases, the process
“predictability”, process “control”, and business performance improve (Paulk et al.,
1995, p. 27). Previous research also showed that the more mature any business process is,
the better forecast, control and performance the business can have (Lockamy and
McCormack, 2004; McCormack and Johnson, 2001). Rooted in Walter Shewhart’s principle
of statistical quality control in process improvement, maturity models proliferated in the
manufacturing and software industries. One of the most well-known and adopted
maturity models is the capability maturity model (CMM) proposed by the Software
Engineering Institute (SEI) (Vaidyanathan and Howell, 2007), which is a framework for
software organisations to evaluate and improve their software process (Paulk et al., 1995).
Other maturity models extended from CMM include capability maturity model
integration (CMMI), people-CMM (P-CMM), personal software process (PSP), team
software process (TSP) and Trillium (Alshawi, 2007). To adjust to the European market, a process model titled Bootstrap was developed by the European Commission for software development process improvement soon after CMM (Alshawi, 2007). The concept of the maturity model was also adopted in other industries, such as the Project Management Process Maturity Model (PM)² in the project management industry, the manufacturing supply chain maturity model (MSCMM) in the manufacturing industry and the lean enterprise transformation maturity model in the aerospace industry (Vaidyanathan and Howell, 2007). Inspired by the success of maturity models in other industries, researchers in the construction industry started to investigate the applicability of maturity models. One significant initiative was the Standardised Process Improvement for Construction Enterprise (SPICE), which concluded that the CMM of the software industry cannot be reused directly in the construction industry due to its inability to address the supply chain issue (Sarshar et al., 2000). The construction supply chain maturity model (CSCMM) was then proposed for supply chain members to improve performance through operational excellence (Vaidyanathan and Howell, 2007). This model cannot be used for the measurement of BIMM, because it focuses on the multi-enterprise supply chain rather than BIM. There were four maturity models proposed in the AEC industry that claimed the ability to measure BIMM when this study was conducted; they are discussed in the BIM maturity section.
With the proliferation of BIM in the AEC industry, there is a growing awareness among professionals of the need for a BIMM matrix. However, the establishment of a BIMM matrix can be difficult, if not impossible, because of the multi-dimensional nature of BIM (Smith and Tardif, 2009). BIM is a concept that functions differently for different professions. Architects use BIM as "a process and technology" to model "the physical and functional characteristics" of a building (AIA, 2008, A295, Section 1.3.5). Contractors use BIM as a computer software model to improve decision making and the facility delivery process (AGC, 2006). Compared with other stakeholders, owners perceive "BIM as more of a collaborative process" (McGraw-Hill, 2009). After all, BIM "is more than a technology" (Ashcraft, 2008); it is a set of interacting process, human, information, and technology issues. According to the technical report by the Center for Integrated Facility Engineering (CIFE, 2012), the impediments to BIM adoption were shifting from "technical issues such as contractual language and hardware and software to people issues such as training and availability of qualified staff" (Gilligan and Kunz, 2007, p. 1). The adoption of BIM is more than the equipping of staff and technology infrastructure; it is a systematic approach to the lifecycle information related to a building (Smith and Tardif, 2009). According to the SmartMarket Report, the focus of BIM investment changes over time; beginners rated BIM software and training as their highest-priority investment, while experienced users ranked collaborative BIM procedures and marketing as their top-priority investment (McGraw-Hill, 2009). The employment of any IT solution cannot reach its full potential when focusing only on technology (Alshawi, 2007). Unlike other standalone information technology solutions, the value of BIM can be maximised when it is used as a collaborative platform and process to facilitate communication and interdependence among stakeholders (Ashcraft, 2008). Although there have been some attempts to propose metrics for measuring BIM implementation, most studies account for only one dimension of BIM, and focus mainly on the final BIM model rather than the process used to create it. Moreover, because many research efforts lacked substantial theoretical and empirical justifications, the reliability and validity of the models remain questionable. This study was conducted to fill these gaps by integrating previous efforts through an empirical investigation of key factors for measuring BIMM. Such research is necessary because the results may help non-users to orient their BIM initiation, as well as current users to locate and improve their BIM implementation. It also offers empirical evidence of the multi-dimensional nature of BIM, and may serve as a starting point for future research to quantify BIM and its values, as well as for the development of metrics with lower-level items.

Research methodology
Every project is different, which can result in different requirements for BIM and therefore BIMM. However, the authors believed, through a post-positivist lens, that there should be a common framework underlying BIMM. To identify the key areas for measuring BIMM, a two-step quantitative exploratory and confirmatory factor analytic approach was used, which is a typical methodology for the development of valid and reliable scales (Cudeck and Browne, 1983; Brown, 2006; Hair et al., 2009; Tabachnick and Fidell, 2012). A non-experimental design with a questionnaire survey was adopted because of factor analysis's requirement for a large amount of data, the limits of time and money, as well as the difficulty of finding experimental subjects and providing 27 treatments.

As shown in Figure 1, this study commenced with the development of 27 areas for measuring BIMM by reviewing the literature. The 27 BIMM areas, their corresponding original areas, as well as their descriptions/definitions are presented in Table AI of the Appendix. The BIM-related academicians' and practitioners' perceived importance of the BIMM areas was then collected through a questionnaire survey based on the 27 areas. Qualified responses from 124 participants were received and screened for potential problems before formal analysis. Then exploratory factor analysis (EFA) was used to identify the key areas and their underlying factors, while confirmatory factor analysis (CFA) was employed to test the construct validity of the factorial structure.

Figure 1. Research methodology: Phase I – development of measurement areas (literature review; identification of BIMM areas; assessment of content validity); Phase II – development of the survey instrument (design of the questionnaire based on the identified BIMM areas; review of the questionnaire; development of the web-based questionnaire); Phase III – data collection and screening (identification of target respondents; data collection; data screening); Phase IV – data analysis and conclusions (exploratory factor analysis (EFA); confirmatory factor analysis (CFA); research conclusions).

BIM maturity
Among all the maturity models proposed in the AEC industry, only four models claimed the ability to measure BIMM when this study was conducted. They included the CMM by the National Institute of Building Science (NIBS, 2007), the BIM deliverable matrix by the Alliance for Construction Excellence (ACE, 2008), the BIM proficiency matrix by Indiana University (IU, 2009), and the BIM competency sets by
Succar (2010). NIBS's (2007) CMM was developed for users to evaluate their internal business process and practices. There are two versions of NIBS's CMM. The first is the static tabular CMM consisting of eleven areas of interest against ten maturity levels (NIBS, 2007). Based on the tabular CMM, the second is the interactive CMM that offers output through interaction with the users (NIBS, 2007). The NIBS CMM is "a good first step toward establishing BIM implementation benchmarks" (Smith and Tardif, 2009, p. 44); a minimum BIM is defined in terms of the 11 areas, and only users that meet the minimum requirements can claim to be BIM-able. The CMM was validated by a test bed of BIM Award winners in 2007 (McCuen et al., 2012) and was considered the default standard for the measurement of information management (McGraw-Hill, 2008). Nevertheless, its inability to assess any metrics beyond information management has limited its applicability (Succar, 2010). Given that the success of a project team depends on the clear regulation of the project scope and deliverables for each stakeholder, ACE's (2008) BIM deliverable matrix lists BIM services and deliverables at each phase of a typical BIM project against three implementation levels, as well as the types of software used by different stakeholders. This matrix concentrates on the description of the digital models of each phase rather than the process used to create the models. IU's (2009) BIM proficiency matrix is used to evaluate the BIM capability and proficiency of a firm, the result of which serves as a selection criterion for a given project. The matrix contains 32 credit areas under eight categories. There is little documentation about the development process of the BIM proficiency matrix and the rationale of the credit areas. Based on the limited examples corresponding to each credit area in the enhanced BIM proficiency matrix, it was perceived that "the matrix focuses on the accuracy and richness of digital model data and has less focus on the process of creating that digital model" (Succar, 2010, p. 102). Succar's (2010) BIM competency set is intended to evaluate the capability and maturity of a BIM player. It is a hierarchical set of competency areas under four granularities. Competency areas with varied breadth and specificity can be used for the different goals of measurement, including discovery, evaluation, certification and auditing. The BIM competency set is comprehensive in covering the key areas related to technology, process and policy. However, there is limited description of the areas related to information management. What is more, there is redundancy among the areas; for example, the area of "networking solution" overlaps "network deliverables" (Succar, 2010, p. 72). Additionally, although the competency set was claimed to be generated based on previous frameworks related to performance, excellence, and quality management, there is no explicit explanation or comparison of the competency areas with items exposed in previous frameworks. More importantly, the competency set was not validated empirically.
Although there have been some efforts toward BIMM, most research studied BIM as a one-dimensional concept or an end-product. More significantly, most studies were limited to theoretical proposals, and there is limited empirical research testing the reliability and validity of the models and frameworks. Key themes were identified and extracted based on a review of the previous literature on maturity, quality, and performance models. It turned out that the areas included in the CMM of NIBS and Succar's BIM competency set covered most of the key themes. Considering all of the above, 27 areas were extracted and combined from the NIBS CMM and Succar's BIM competency set to establish the initial pool of areas for measuring BIMM. The content validity of the 27 areas was confirmed by aligning them with areas/items exposed in previous models of maturity, excellence, and quality management. The 27 BIMM areas of this research, their corresponding original areas in NIBS's CMM and Succar's BIM competency set, and similar areas in previous research were listed in Table I of Chen et al. (2012). The 27 areas were then evaluated by two BIM-related experts with academic and industry experience as having validity to tap BIMM. The content validity of the areas was further confirmed by the participants, since most of them commented that the questionnaire was comprehensive and thorough in measuring BIMM.

Data collection and screening


Data collection
An on-line questionnaire survey was conducted for data collection. The questionnaire was designed to solicit the perception of BIM-related experts about the degree of importance of the 27 areas in measuring BIMM through a seven-point Likert scale. A sample questionnaire can be found in Dib et al. (2012). To ensure the relevance of the responses, the questionnaires were only sent to academicians conducting research in BIM and practitioners with BIM-related experience. In order to safeguard the reliability of the received responses, both types of professionals were asked about their BIM-related experience. If an academic respondent replied that s/he had no BIM-related project or research experience, her/his response was not used for the analysis. Similarly, if a practitioner respondent claimed to have been involved in fewer than two BIM-implemented projects or had less than one year of direct working experience with BIM, her/his response was discarded as well. The survey was distributed to a total of 579 BIM-related experts in the USA, of whom 131 provided valid responses, a response rate of 22.63 per cent. Based on the criteria listed above, 124 qualified responses were retained for further data analysis. The detailed information about the responses is shown in Table I. The industry sample represented a wide spectrum of professional disciplines working as owners, architects or contractors. These profiles are listed in Tables II and III.

Table I. Information about responses from academicians and practitioners in the USA

                Questionnaires sent   Responses received (%)   Valid responses (%)   Qualified responses (%)
Academicians    206                   52 (25.24%)              50 (24.27%)           49 (23.79%)
Practitioners   373                   92 (24.66%)              81 (21.72%)           75 (20.11%)
Total           579                   144 (24.87%)             131 (22.63%)          124 (21.42%)

Source: Adapted from Chen et al. (2012)

Data screening
Before the data analysis, the raw data should be screened for potential problems, including missing data, departure from normality and collinearity. First, the data profile included 0.96 per cent missing values, which were substituted with the mean of the corresponding areas. Second, considering that most responses lay between 4 and 7, the skewness index (SI) and kurtosis index (KI) were used to examine the normality
of the distribution of each area. Because the absolute value of SI and KI for each area was less than the cut-off points of 3 and 10, respectively (Kline, 2010), the normality of the distribution of each area was not rejected. Finally, in order to use factor analysis, the data matrix should have sufficient correlations greater than 0.30 but less than 0.7 (Hair et al., 2009, p. 104). There are multiple ways with different "rules of thumb" to detect multi-collinearity, including the bivariate correlation (r), the squared multiple correlation (R²smc) and the variance inflation factor (VIF). Commonly, an r above 0.7 for a small sample or 0.85 for a larger sample (Berry and Feldman, 1985), an R²smc above 0.9 (Kline, 2010), and a VIF of more than 4 or 10 (O'Brien, 2007) may indicate extreme collinearity. Based on the recommended approaches, it was found that there was collinearity between hardware equipment (HE) (r = 0.846, R²smc = 0.835, VIF = 6.065) and hardware upgrade (HU) (r = 0.846, R²smc = 0.778, VIF = 4.505). Considering the nature of the two areas, HE and HU were combined to form a new area of hardware, the score of which was the average of the original two areas (Kline, 2010). Further examination of the data set revealed that there might be collinearity between the areas of standard operating procedure (SOP) (r = 0.731, R²smc = 0.648, VIF = 2.841) and documentation and modelling standards (DMS) (r = 0.731, R²smc = 0.659, VIF = 2.933). Although the related detection indices for these two areas were not as extreme as those for HE and HU, they still posed a considerable threat. Besides, from the theoretical perspective, SOP and DMS aligned on the standardisation of modelling and operation. To be more conservative, it was decided to run exploratory factor analyses of the 26 areas (with HE and HU combined into hardware) and of the 25 areas (with SOP and DMS additionally combined into standardisation), respectively.
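The screening sequence described above lends itself to a short script. The following is a minimal sketch in Python, assuming the responses sit in a pandas DataFrame with one column per area; the file and column names (e.g. "HE", "HU") are hypothetical, and the paper does not state which analysis tool the authors used.

import pandas as pd
from scipy.stats import skew, kurtosis
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical file: respondents x 27 BIMM areas, 7-point Likert scores
df = pd.read_csv("bimm_survey.csv")

# 1. Substitute missing values with the mean of the corresponding area
df = df.fillna(df.mean())

# 2. Normality screen: flag areas whose skewness index exceeds |3|
#    or whose kurtosis index exceeds |10| (Kline, 2010)
for col in df.columns:
    si, ki = skew(df[col]), kurtosis(df[col])
    if abs(si) > 3 or abs(ki) > 10:
        print(f"{col}: SI={si:.2f}, KI={ki:.2f} -> normality questionable")

# 3. Collinearity screen: bivariate r above 0.7 and VIF above 4
corr = df.corr()
pairs = [(a, b, round(corr.loc[a, b], 3))
         for i, a in enumerate(df.columns)
         for b in df.columns[i + 1:]
         if abs(corr.loc[a, b]) > 0.7]
vif = {col: variance_inflation_factor(df.values, i)
       for i, col in enumerate(df.columns)}
print("High correlations:", pairs)
print("High VIFs:", {k: round(v, 2) for k, v in vif.items() if v > 4})

# 4. Average collinear areas into a composite, as done for HE and HU
df["Hardware"] = df[["HE", "HU"]].mean(axis=1)
df = df.drop(columns=["HE", "HU"])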

Table II. Profile of industry respondents (by business types)

Business types                               Amount (%)
General contractor/construction manager      21 (28.00%)
Architect/engineer                           19 (25.33%)
Consultant                                   12 (16.00%)
Owner/developer                               6 (8.00%)
Software vendor                               6 (8.00%)
Subcontractor                                 5 (6.67%)
Others                                        6 (8.00%)
Total                                        75 (100%)

Table III. Profile of industry respondents (by professions)

Professions                                                                 Amount (%)
Company manager (e.g. president/VP/CEO/owner)                               18 (24.00%)
Project manager (e.g. project/construction/product/program manager)        13 (17.33%)
Model director (e.g. VDC/BIM/CAD director)                                  14 (18.66%)
Model designer (e.g. designer/engineer/specialist/coordinator/consultant)   23 (30.67%)
Model involver (e.g. estimator/account manager/sales manager)                3 (4.00%)
Not specified                                                                4 (5.33%)
Total                                                                       75 (100%)
Data analysis and result
To ensure the reliability and validity of the scale development, a two-step exploratory and confirmatory factor analytic approach was applied, as is typical (Cudeck and Browne, 1983; Brown, 2006; Hair et al., 2009; Tabachnick and Fidell, 2012). The same development process has been used to develop measurement scales for knowledge management maturity (Chen and Fong, 2012), construction project success (Tabish and Jha, 2012) and safety climate (Hon et al., 2013; Zhou et al., 2011). EFA is a data-driven approach used to identify the key areas and their underlying factors, and is usually applied in the early stage of scale development (Brown, 2006). CFA is typically used to test the construct validity of the identified factorial solution by evaluating how well it reproduces the sample correlation matrix (Brown, 2006). In this research, stage I involved the development of the BIMM framework based on EFA. Stage II used CFA to test the construct validity of the first-level and second-level measurement models originating empirically from the EFA result of stage I. Through this process, a five-factor second-level measurement model was developed that satisfactorily fit the structure of BIMM.

Stage I: measurement model – EFA
The underlying constructs of the initial pool of areas for measuring BIMM were explored by principal component factor analysis (PCFA) with a varimax rotation. Based on the discussion in the data screening section, the 26 areas with HE and HU combined were analysed separately from the 25 areas with the additional combination of SOP and DMS. The two measurement models were then compared from the perspectives of quantitative indices and theory.
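As an illustration of this procedure, the sketch below runs a five-factor PCFA with varimax rotation and the adequacy checks reported in the next subsections. It assumes the screened scores are in a pandas DataFrame and uses the factor_analyzer package; the file name is hypothetical, the 0.5/0.2 pruning thresholds follow the criteria cited below, and the package choice is an assumption rather than the authors' tool.

import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

df26 = pd.read_csv("bimm_26_areas.csv")  # hypothetical: the 26 screened areas

# Sampling adequacy (KMO) and Bartlett's test of sphericity
chi2, p = calculate_bartlett_sphericity(df26)
_, kmo = calculate_kmo(df26)
print(f"Bartlett chi2 = {chi2:.3f} (p = {p:.4f}); KMO = {kmo:.3f}")

# Principal component extraction, five factors, varimax rotation
fa = FactorAnalyzer(n_factors=5, rotation="varimax", method="principal")
fa.fit(df26)
loadings = pd.DataFrame(fa.loadings_, index=df26.columns)

# Prune areas violating convergent validity (max loading < 0.5) or
# discriminant validity (gap between two largest loadings < 0.2), then refit
abs_l = loadings.abs()
top = abs_l.max(axis=1)
second = abs_l.apply(lambda row: row.nlargest(2).iloc[1], axis=1)
to_drop = loadings.index[(top < 0.5) | ((top - second) < 0.2)]
print("Areas to remove this iteration:", list(to_drop))

print("Communalities:", dict(zip(df26.columns, fa.get_communalities().round(3))))
print("Cumulative variance explained:", fa.get_factor_variance()[2].round(3))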

EFA of 26 areas
The Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy of 0.857 and the significant Bartlett test of sphericity (1,537.952, p < 0.000) indicated that there was meritorious intercorrelation among the areas and that factor analysis was appropriate (Hair et al., 2009). In order to obtain a simpler solution with factors that had a clearer interpretation, areas with factor loadings less than 0.5 (Hair et al., 2009) or a difference between cross-loadings of less than 0.2 (Statwiki, 2012) were removed. After two iterations of PCFA, nine inconsistent areas were removed. The acceptance of each area explained by the factor solution was confirmed by its corresponding communality, all of which were greater than 0.510. The identified five-factor model explained 68.127 per cent of the variance. The structure of the model and its related statistics are listed in Table IV. For the interpretation of the five factors, please also refer to the data analysis and interpretation of Chen et al. (2012).

Table IV. Factor structure and variance explained based on responses from the experts in the USA

BIMM areas                     Factor 1  Factor 2  Factor 3  Factor 4  Factor 5  Communalities
Doc. and modelling standards     0.811     0.213    -0.066    -0.029     0.032      0.709
Standard operating process       0.809     0.159     0.050    -0.084     0.095      0.699
Role                             0.738     0.229     0.067     0.090     0.008      0.610
Quality control                  0.733     0.113     0.304     0.193    -0.054      0.682
Strategic planning               0.684     0.098     0.145     0.358     0.063      0.630
Senior leadership                0.681    -0.084     0.211     0.297     0.213      0.649
Work flow                        0.102     0.807     0.128     0.128     0.060      0.698
Lifecycle process                0.069     0.789     0.046     0.166     0.137      0.675
Geospatial capability            0.265     0.683     0.215    -0.004     0.045      0.586
Real-time data                   0.199     0.588    -0.003     0.355     0.012      0.512
Training program                 0.160     0.177     0.848     0.037     0.088      0.786
Training delivery method         0.151     0.112     0.841    -0.067     0.178      0.779
Information delivery method      0.119     0.214     0.056     0.836     0.072      0.767
Information assurance            0.157     0.182    -0.087     0.835     0.053      0.776
Applications                    -0.047    -0.010     0.137     0.093     0.821      0.704
Hardware                         0.118     0.035     0.425     0.072     0.702      0.694
Process and tech. innovation     0.206     0.371    -0.131    -0.022     0.662      0.637
Eigenvalue                       5.369     1.913     1.865     1.321     1.115
% variance                      21.154    14.711    11.106    10.921    10.235

Cronbach's α: 0.848; KMO measure of sampling adequacy: 0.797; Bartlett's test of sphericity: approx. χ² (sig.) = 822.620 (0.000)
Notes: Significant factor loadings are highlighted; the factors represent: 1 – PDM; 2 – information management; 3 – training; 4 – information delivery; 5 – technology
Source: Adapted from Chen et al. (2012)

EFA of the 25 areas
The results from the KMO measure of sampling adequacy (0.850) and the Bartlett test of sphericity (1,453.401, p < 0.000) indicated the appropriateness of factor analysis. Following the same criteria for convergent validity and discriminant validity, nine inconsistent areas were discarded and 16 significant areas were retained after three iterations of PCFA. Table V presents the detail of the model structure and statistical measures.

The five factors accounted for 69.839 per cent of the variance. Factor 1 (process definition and management (PDM)) explained 22.670 per cent of the variance and loaded on the areas of standardisation, role, strategic planning, senior leadership, quality control and information accuracy (IAc). Factor 2 (13.627 per cent) concerned information management, including the areas of lifecycle process, work flow and geospatial capability. Factor 3 (11.488 per cent) loaded on items concerning training, involving training program and training delivery method. Factor 4 (11.083 per cent) included the areas of applications, hardware, and process and technology innovation (PTI), all of which aligned on technology. Factor 5 (10.971 per cent) focused on information delivery, consisting of the areas of information delivery method and information assurance (IA).

Comparison of the two identified five-factor measurement models
The two models were quite close, since the key structure of the measurement models stayed largely the same, as did the statistical measures. This was understandable because the only difference between the two initial pools lay in the combination of SOP and DMS. However, compared with the first model, which accounted for 68.127 per cent of the total variance with 17 areas, the second model accounted for more variance (69.839 per cent) with fewer areas (16 areas). In addition, from the theoretical perspective, SOP and DMS were perceived to measure different aspects of standardisation: SOP focused more on the standardisation of process, while DMS emphasised the final product. What is more, the authors tested both models with CFA, which showed that the overall fit of the higher-level measurement model based on the 16-area structure (χ²(99) = 146.250, CFI = 0.931, RMSEA = 0.062) was better than that of the model with further modification based on the 17-area structure (χ²(113) = 177.523, CFI = 0.915, RMSEA = 0.068). Considering all of the above, the second model extracted from the 25 areas was used for further CFA.

Table V. Factor structure and variance explained based on responses from the experts in the USA

BIMM areas                     Factor 1  Factor 2  Factor 3  Factor 4  Factor 5  Communalities
Standardisation                  0.764     0.200    -0.002     0.042    -0.057      0.629
Quality control                  0.762     0.123     0.278    -0.069     0.138      0.697
Role                             0.755     0.252     0.029    -0.023    -0.008      0.636
Strategic planning               0.755     0.101     0.094     0.054     0.268      0.664
Senior leadership                0.750    -0.078     0.157     0.201     0.180      0.667
Info accuracy                    0.666     0.078     0.134     0.244     0.300      0.617
Lifecycle process                0.080     0.832     0.028     0.112     0.222      0.762
Work flow                        0.098     0.824     0.129     0.039     0.184      0.741
Geospatial capability            0.287     0.642     0.218     0.050    -0.022      0.545
Training program                 0.166     0.170     0.857     0.094     0.057      0.803
Training delivery method         0.180     0.116     0.825     0.189    -0.083      0.769
Applications                    -0.050    -0.021     0.153     0.820     0.110      0.710
Hardware                         0.149     0.048     0.407     0.703     0.050      0.687
Process and tech. innovation     0.245     0.396    -0.174     0.655    -0.047      0.678
Information assurance            0.203     0.138    -0.081     0.047     0.848      0.788
Info delivery method             0.174     0.186     0.054     0.064     0.842      0.782
Eigenvalue                       5.232     1.849     1.684     1.332     1.078
% variance                      22.670    13.627    11.488    11.083    10.971

Cronbach's α: 0.845; KMO measure of sampling adequacy: 0.812; Bartlett's test of sphericity: approx. χ² (sig.) = 748.178 (0.000)
Notes: Significant factor loadings are highlighted; the factors represent: 1 – PDM; 2 – information management; 3 – training; 4 – technology; 5 – information delivery
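The internal consistency values reported in Tables IV and V (Cronbach's α of 0.848 and 0.845) follow the standard formula α = k/(k−1) · (1 − Σσᵢ²/σₜ²), where k is the number of areas, σᵢ² the variance of each area and σₜ² the variance of the summed scale. A self-contained sketch of that computation, with a hypothetical score matrix, is given below.

import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)      # variance of each area
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage on 124 respondents and the 16 retained areas; random
# data will give a low alpha, whereas real responses are inter-correlated
rng = np.random.default_rng(0)
demo = rng.integers(4, 8, size=(124, 16)).astype(float)  # Likert 4-7, as in the data
print(round(cronbach_alpha(demo), 3))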

Stage II: measurement models – CFA
The result from EFA provided empirically derived hypotheses as to the number and nature of the factors of BIMM. The hypothesised measurement model was then tested for its goodness-of-fit and validity using first-order CFA. Then the common construct of BIMM underlying the five factors was further examined by second-order CFA.

First-order measurement model
Based on the result from EFA, the model to be tested hypothesised a priori that BIMM was a five-factor structure composed of PDM, information management (Info Management), training, technology and information delivery (Info Delivery), as shown in Figure 2.
Figure 2. Hypothesised model of factorial structure for the BIMM
The model and its components were specified as follows:
. Responses to the measuring instrument of BIMM can be explained by five factors: PDM, Info Management, training, technology and Info Delivery.
. The five factors are inter-correlated.
. Each observed variable loads on only one factor.
. Error terms associated with each observed variable are uncorrelated.

Different goodness-of-fit (GOF) indices measure different facets of a model's ability to represent the data. Based on the recommendations from previous literature, the absolute fit indices of the normed χ², the goodness-of-fit index (GFI) and the root mean square error of approximation (RMSEA) were reported (Tabachnick and Fidell, 2012), as well as the incremental fit index of the comparative fit index (CFI) and the parsimony fit index of the adjusted goodness-of-fit index (AGFI) (Hair et al., 2009). A satisfactory model fit can be achieved by attaining χ²/df < 2.00 (Tabachnick and Fidell, 2006), GFI > 0.85 (Maruish, 2004), RMSEA < 0.07 (Steiger, 2007), CFI > 0.95 (Hair et al., 2009) and AGFI > 0.8 (Maruish, 2004). The fit statistics suggested good support for the model. Specifically, χ²/94 = 1.361, GFI = 0.891, RMSEA = 0.054, CFI = 0.951 and AGFI = 0.842. The standardised output of the first-order measurement model is presented in Figure 3. The parameter estimates for all the factor loadings were significant and in the expected positive direction. Except for the parameter estimate for the area of innovation, all loadings were higher than 0.5, which suggested that the areas were strongly related to their associated factors and indicated construct validity (Hair et al., 2009). The modification indices (MIs) were examined to detect any possible misspecification (Byrne, 2009). Table VI presents the ten largest MIs. However, given the meaninglessness and triviality of these MIs, as well as the adequate fit of the existing model (Byrne, 2009), no additional parameters were included in the model.
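To make the specification concrete, the sketch below encodes the hypothesised five-factor model in lavaan-style syntax and reports the same classes of fit indices. It assumes the semopy package; the observed-variable names are hypothetical stand-ins for the 16 retained areas, and the paper does not state which SEM software the authors used.

import pandas as pd
import semopy

# Five correlated first-order factors; indicator names are hypothetical
MODEL_DESC = """
PDM =~ Standardisation + QualityControl + Role + StrategicPlanning + SeniorLeadership + InfoAccuracy
InfoManagement =~ LifecycleProcess + WorkFlow + GeospatialCapability
Training =~ TrainingProgram + TrainingDeliveryMethod
Technology =~ Applications + Hardware + Innovation
InfoDelivery =~ InfoAssurance + InfoDeliveryMethod
"""

df = pd.read_csv("bimm_16_areas.csv")  # hypothetical file of screened responses
model = semopy.Model(MODEL_DESC)       # latent factors covary freely by default
model.fit(df)

# Fit indices comparable to those reported above (chi2, GFI, AGFI, CFI, RMSEA)
stats = semopy.calc_stats(model)
print(stats[["DoF", "chi2", "GFI", "AGFI", "CFI", "RMSEA"]])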

Second-order measurement model
To facilitate theoretical understanding of the multi-dimensional nature of BIMM, the first-order measurement model was respecified as a second-order measurement model. The difference between the two specifications is that a structure is "imposed on the correlational pattern among the first-order factors" (Byrne, 2009, p. 143). In this case, BIMM was specified as the construct underlying the five first-order factors, as schematically shown in Figure 4.

Figure 3. Standardised output of the first-order model of factorial structure for the BIMM
Table VI. Ten largest modification indices

Path                               M.I.     Par change
Innovation ← lifecycle process     10.964     0.215
Innovation ← Info Mgmt              9.948     0.284
e14 ↔ Info Mgmt                     9.526     0.236
Geospatial ← quality control        7.596     0.239
Innovation ← work flow              7.336     0.229
e1 ↔ e3                             7.107     0.159
e9 ↔ PDM                            5.921     0.123
e4 ↔ e10                            5.574    -0.112
e5 ↔ Info Mgmt                      5.483    -0.171
Innovation ← info accuracy          5.474     0.171

Figure 4. Hypothesised second-order model of factorial structure for the BIMM

The CFA model to be tested postulated a priori that:
. Responses to the instrument of BIMM can be explained by five first-order factors (PDM, Info Mgmt, training, technology and Info Delivery) and one second-order factor (BIMM).
. Each observed variable loads on only one first-order factor.
. Error terms associated with each observed variable are uncorrelated.
. Co-variation among the five first-order factors is fully explained by their regression on the second-order factor (BIMM).

The adequacy of the parameter estimates and the overall model suggested good support for the validity of the BIMM measurement models (Byrne, 2009). Specifically, all the factor loadings were significant and in the expected direction, as indicated by the standardised output in Figure 5. The adequacy of the model as a whole was evidenced by χ²/99 = 1.477, GFI = 0.877, RMSEA = 0.062, CFI = 0.931 and AGFI = 0.830. Both of the above support that the hypothesised model fitted the data quite well. The ten largest MIs are listed in Table VII. In light of the meaninglessness and triviality of the MIs, as well as the good fit of the model, it was concluded that the second-order model shown in Figure 4 was the optimal representation for measuring BIMM for the US BIM-related experts.

The multi-dimensional hypothesis of BIMM was not rejected by the test result of the parameter estimates. All five factors were statistically significant, which further suggested that all factors were important in measuring BIMM. Among the five factors, PDM had the highest standardised second-order factor loading of 0.71, followed by information management at 0.63, technology at 0.576, training at 0.574 and information delivery at 0.545. This indicates that, compared with the factors of people and technology, the factors of process and information were perceived to be more important in measuring BIMM.

Figure 5. Standardised output of the second-order model of factorial structure for the BIMM

Table VII. Ten largest modification indices

Path                                M.I.    Par change
Hardware ← training delivery        9.530     0.230
Innovation ← lifecycle process      9.329     0.195
e13 ↔ F3                            8.602     0.226
e14 ↔ F2                            8.499     0.220
Innovation ← Info Mgmt              7.855     0.251
e11 ↔ e13                           7.504     0.233
e11 ↔ F5                            7.177    -0.136
Applications ← strategic planning   6.880    -0.210
Geospatial ← quality control        6.876     0.226
e1 ↔ e3                             6.531     0.151

Discussion
The key findings of this research were further compared with other independent BIM- and maturity-related research.

Theoretically, the proposed maturity framework was considered comprehensive because the key factors and content of other BIM- and maturity-related research can be categorised under its four dimensions. Based on the nature of the five factors, the five factors of BIMM can be reduced to four dimensions: technology (Factor 4: technology), process (Factor 1: PDM), people (Factor 3: training) and information (Factor 2: information management; Factor 5: information delivery). The four dimensions of this research were validated by their consistency with the key content of global BIM-related research. As listed in Table AII of the Appendix, the four dimensions of this research covered the areas/content/indices of BIM standards/guides/protocols/specifications in Australia, the UK, China, Finland and Singapore (BCA, 2012). The factorial structure of BIMM was further confirmed by its compliance with the key factors of other maturity models, as listed in Table VIII.
Empirically, the research findings were validated by comparing them with those of the SmartMarket Report by McGraw-Hill, which was the "default gold standard" of BIM implementation status in North America (Suermann, 2009). First, all the BIMM factors were found to be important in measuring BIMM. This finding was confirmed given that the four dimensions of BIMM corresponded to the four most important factors for increasing BIM benefits, as indicated in the SmartMarket Report by McGraw-Hill. Specifically, the four factors included "Improved Interoperability between Software Applications", "Improved BIM Software Functionality", "More Clearly-Defined BIM Deliverables between Parties" and "More Owners Asking for BIM" (2012, p. 22), which corresponded to the four dimensions of information, technology, process and people. Second, it was found that the factors related to process and information were more important than the factors of technology and people. This finding was consistent with the findings of the SmartMarket Report, which concluded that the top obstacles to BIM improvement were related to information and process (McGraw-Hill, 2009).
Table VIII. Comparison of BIMM dimensions of this research with other maturity models

Lean enterprise self-assessment tool (MIT and UW, 2001):
Technology – "Enabling Infrastructure": "must support the implementation of Lean Principles, practices and behaviour" (p. 11)
Process – "Life-cycle Processes": refers to "Implement lean practices across life-cycle processes" (p. 8)
People – "Lean Transformation/Leadership": refers to the development of "lean implementation plans" (p. 6)

Construction supply chain maturity model (Vaidyanathan and Howell, 2007):
Technology – "Technology Assessment": refers to "the various tools" used for business process efficiency (p. 175)
Process – "Process Assessment": is "to identify the current as-is business process methodology" (p. 175); "Value assessment": is to "assess the current pain points" of the "as-is business process" (p. 176)
People – "Strategy assessment": "is to determine the current business strategy" with customers and suppliers (p. 176)

NIBS (2007):
Information – "Information Management" (p. 80)

BIM maturity matrix (Succar, 2010, p. 69):
Technology – this field "clusters a group of players who specialise in developing software, hardware, equipment and networking systems [. . .]"
Process – this field "clusters a group of players who procure, design, construct, manufacture, use, manage and maintain structures"
People – Policy: this field "clusters a group of players focused on preparing practitioners, delivering research, distributing benefits, allocating risks and minimising conflicts [. . .]"

Last, the factorial structure of BIMM was considered generalisable due to the consistency of the BIMM measurement models based on the perceptions of USA and non-USA BIM-related professionals. This research is part of a study of BIMM measurement models based on the perception and experience of global BIM-related professionals. Given that samples from different countries might be heterogeneous due to the influence of issues such as culture, industry practice and regulation, infrastructure and BIM adoption status, separate measurement models were developed for the USA and non-USA professionals, respectively (Dib et al., 2012). Similar factors related to information, process, people and technology were recognised. In addition, the factors of information and process ranked higher than the factors related to technology and people for both models. The key difference between the two measurement models lay in the emphasis given to similar factors. For example, for the factors related to people, the USA experts focused on training while the non-USA experts emphasised the regulation of team structure. This may suggest that the USA experts perceived BIM more as a technology, while the non-USA experts treated BIM more as a process. For a more detailed discussion, please refer to Chen (2013).
Conclusion
With the proliferation of BIM adoption in the AEC industry, there is a need for a scientific rating system to benchmark the different levels of BIM implementation (NIBS, 2007). Based on substantial theoretical and empirical justification, a valid and reliable measurement model of BIMM with five factors under four dimensions was proposed at the end of this study. There were several theoretical and empirical implications of this research. The key value of this research is to increase the understanding of the multi-dimensional nature of BIMM through empirical evidence, and to provide practitioners and researchers with insight regarding the particular emphasis on the factors related to process and information besides the factors of technology and people. In addition, the BIMM measurement model can be used by the industry to benchmark BIMM across AEC projects: by the current 71 per cent of USA AEC projects using BIM as a scientific checklist to evaluate their BIM implementation, as well as a guideline for the non-users of the remaining 29 per cent of AEC projects to initiate their BIM efforts. What is more, different projects have different requirements for BIM. BIM can be engineered according to project stakeholders' perceived priority of the BIMM areas to optimise their investment. Case studies can be conducted to examine the efficiency of applying the proposed BIMM model to real-world projects. Finally, the quantification of BIM's impact had been considered impossible due to the multi-dimensional nature of BIM and methodological difficulties. The BIMM measurement model here can serve as an initial point for quantifying the impact of BIM on its intangible benefits, like trust and collaboration, or tangible benefits, such as productivity and construction performance. Specifically, hypothesised relationships between BIMM and its benefits can be established based on their respective measurement matrixes, which can be further tested with data collected from surveys or projects.

The BIMM measurement was considered generalisable from three perspectives. First, the key dimensions and areas of the BIMM measurement models were consistent with other maturity models and BIM-related standards. Second, the four dimensions and their ranking in this research aligned with the four most important factors impacting BIM benefits, as indicated in the SmartMarket Report by McGraw-Hill. Third, similar factors were perceived as important by both the USA and the non-USA BIM-related professionals.
Some limitations of this research need to be acknowledged. One issue is how well the sample represented the population of USA BIM users. There was a wide range of respondents in terms of their experience, education, position and demography. The BIMM measurement model might differ with a different composition of participants with distinct experience and backgrounds. Therefore, the measurement model extracted here might not reflect the perception of the population. A natural extension of this line of research would be to consider differences in the perception of BIMM across professionals with different backgrounds, such as the difference between the group of architects and the group of constructors, or the difference between practitioners with different levels of industry experience. Another limitation is that the findings from this study may not be generalised directly to other specific countries; however, as discussed before, similar factors were identified by both the USA and the non-USA experts. Therefore, the research findings here can be adapted for use in other countries and regions. Future studies of the perception of experts in other countries and of global experts could also be conducted and compared with the findings of this research.
References
ACE (2008), Building Information Modeling: An Introduction and Best Methods Approach, Alliance for Construction Excellence, available at: www.ace4aec.com/pdf/Introduction%20and%20Best%20Methods%20Manual%202008.pdf (accessed 29 July 2012).
AEC (UK) Committee (2012), AEC (UK) BIM Protocol, available at: http://aecuk.files.wordpress.com/2012/09/aecukbimprotocol-v2-0.pdf (accessed 15 August 2013).
AGC (2006), The Contractors’ Guide to BIM, The Associated General Contractors of America, 203
available at: www.ipd-ca.net/images/Integrated%20Project%20Delivery%20Definition.
pdf (accessed 8 August 2011).
AIA (2008), AIA Document E202-2008, The American Institute of Architects, available at: www.
fm.virginia.edu/fpc/ContractAdmin/ProfSvcs/BIMAIASample.pdf (accessed 25 July 2012).
Alshawi, M. (2007), Rethinking IT in Construction and Engineering: Organisational Readiness,
Taylor & Francis, New York, NY.
Ashcraft, H.W. (2008), “Building information modeling: a framework for collaboration”,
Construction Lawyer, Vol. 28 No. 3, pp. 1-14.
BCA (2012), Singapore BIM Guide, Building and Construction Authority, available at: http://
bimsg.files.wordpress.com/2012/05/singapore-bim-guide-version-1.pdf (accessed 5 August
2013).
Berry, W.D. and Feldman, S. (1985), Multiple Regression in Practice, Sage, Newbury Park, CA.
Blanchard, P.N. and Thacker, J. (2006), Effective Training – Systems, Strategies, and Practice,
Prentice-Hall, Upper Saddle River, NJ.
Brown, T.A. (2006), Confirmatory Factor Analysis for Applied Research, 1st ed., The Guilford
Press, New York, NY.
BuildingSmart Finland (2012), Common BIM Requirements, available at: www.en.buildingsmart.
kotisivukone.com/3 (accessed 10 August 2013).
Businessdictionary (2011), Reward System, available at: www.businessdictionary.com/definition/
reward-system.html (accessed 24 December 2011).
Byrne, B.M. (2009), Structural Equation Modeling with AMOS, Taylor & Francis, New York, NY.
Chase, R.B. (2007), Operation Management in Manufacturing and Services, available at:
www.ateneonline.it/chase2e/areastudenti.asp#tn (accessed 25 September 2011).
Chen, Y. (2013), “Measurement models of building information modeling maturity”,
PhD dissertation, Purdue University, West Lafayette, IN, 2 August.
Chen, Y., Dib, H. and Cox, R.F. (2012), “A framework for measuring building information
modeling maturity in construction projects”, paper presented at the 14th International
Conference on Computing in Civil and Building Engineering, Moscow, Russia, 27-29 June.
Chen, L. and Fong, P.S.W. (2012), “Revealing performance heterogeneity through knowledge
management maturity evaluation: a capability-based approach”, Expert Systems with
Applications, Vol. 39, pp. 13523-13539.
CIFE (2012), CIFE VDC Scorecard and BIM Scorecard, Center for Integrated Facility
Engineering, available at: http://cifevdc.weebly.com/index.html (accessed 28 May 2012).
Cooke-Davies, T. (2002), “Project management maturity models – does it make sense to adopt
one?”, Project Manager Today, Vol. 14 No. 5, pp. 16-19.
Cudeck, R. and Browne, M.W. (1983), “Cross-validation of covariance structures”, Multivariate
Behavioral Research, Vol. 18 No. 2, pp. 147-167.
Dib, H., Chen, Y. and Cox, R. (2012), "A framework for measuring building information modeling maturity based on perception of practitioners and academics outside the USA", paper presented at CIB W78 2012 Conference, Beirut, Lebanon, 17-19 October.
DOT (2012), Specifications, Department of Transportation, available at: www.state.nj.us/transportation/eng/specs/ (accessed 24 June 2012).
Eastman, C., Teicholz, P., Sacks, R. and Liston, K. (2011), BIM Handbook: A Guide to Building Information Modeling for Owners, Managers, Designers, Engineers, and Contractors, Wiley, Hoboken, NJ.
Encyclopedia (2011), Real-Time Data, available at: http://encyclopedia.thefreedictionary.com/Real-time+data#cite_note-0 (accessed 28 December 2011).
Gattol, M. (2012), Hardware and Related Stuff, available at: www.markus-gattol.name/ws/
hardware.html (accessed 7 January 2012).
Gilligan, B. and Kunz, J. (2007), VDC Use in 2007: Significant Value, Dramatic Growth, and
Apparent Business Opportunity, Technical Report, Center for Integrated Facility
Engineering, Stanford University, Stanford, CA, 4 December.
Hair, J.F., Black, W.C., Babin, B.J. and Anderson, R.E. (2009), “Multivariate data analysis”,
in Heine, J. (Ed.), Multiple Regression Analysis, Prentice-Hall, Upper Saddle River, NJ,
pp. 155-234.
Hardin, B. (2009), BIM and Construction Management: Proven Tools, Methods, and Workflows,
Wiley, Indianapolis, IN.
HKI (2011), BIM Project Specification, Hong Kong Institute of Building Information Modeling, available at: www.hkibim.org/?page_id=1386 (accessed 10 August 2013).
Hon, C.K.H., Chan, A.P.C. and Yam, M.C.H. (2013), “Determining safety climate factors in the
repair, maintenance, minor alteration, and addition sector of Hong Kong”, Journal of
Construction Engineering and Management, Vol. 139, pp. 519-528.
IEEE (1991), IEEE Standard Computer Dictionary: A Compilation of IEEE Standard Computer
Glossaries – 610, Institute of Electrical and Electronics Engineers (IEEE), New York, NY.
ITGI (2007), COBIT 4.1, IT Governance Institute, available at: www.isaca.org/Knowledge-Center/cobit/Pages/Downloads.aspx (accessed 19 May 2011).
IU (2009), Indiana University BIM Standard, Indiana University, available at: www.indiana.edu/~uao/iubim.html (accessed 15 May 2011).
Ivanov, K. (1972), “Quality-control of information: on the concept of accuracy of information
data-banks and in management information systems”, PhD dissertation, The Royal
Institute of Technology KTH, Stockholm, 11 December.
Jalote, P. (2000), CMM in Practice: Processes for Executing Software Projects at Infosys,
Addison-Wesley Professional, Boston, MA.
Kapur, P.K., Tandon, A. and Kaur, G. (2010), “Multi up-gradation software reliability model”,
The 2nd International Conference on Reliability, Safety and Hazard, Institute of Electrical
and Electronics Engineers (IEEE), Mumbai, India, 14-16 December, pp. 468-474.
Kline, R.B. (2010), Principles and Practice of Structural Equation Modeling, The Guilford Press,
New York, NY.
Lockamy, A. and McCormack, K. (2004), “The development of a supply chain management
process maturity model using the concepts of business process orientation”, Supply Chain
Management: An International Journal, Vol. 9 No. 4, pp. 272-278.
McCormack, K.P. and Johnson, W.C. (2001), Business Process Orientation: Gaining the E-Business
Competitive Advantage, CRC Press LLC, Boca Raton, FL.
McCuen, T., Suermann, P.C. and Krogulecki, M.J. (2012), "Evaluating award-winning BIM projects using the national building information model standard capability maturity model", Journal of Management in Engineering, Vol. 28 No. 2, pp. 224-230.
McGraw-Hill (2008), Building Information Modeling: Transforming Design and Construction to Achieve Greater Industry Productivity, available at: www.dbia.org/NR/rdonlyres/1631EDF1-8040-410D-BE1A-CF96AB1E9F84/0/McGrawHillConstructionBIMSmartMarketReportDecember2008.pdf (accessed 6 June 2011).
McGraw-Hill (2009), SmartMarket Report – The Business Value of BIM: Getting Building
Information Modeling to the Bottom Line, available at: www.bim.construction.com/
research/ (accessed 28 April 2011).
Maruish, M.E. (2004), The Use of Psychological Testing for Treatment Planning and Outcomes
Assessment: Volume 2: Instruments for Children and Adolescents, Lawrence Erlbaum
Associates, Inc., Mahwah, NJ.
NATSPEC (2011), NATSPEC National BIM Guide, available at: http://bim.natspec.org/index.
php/natspec-bim-documents/national-bim-guide (accessed 15 August 2013).
NIBS (2007), United States National Building Information Modeling Standard, National Institute
of Building Science, available at: www.wbdg.org/pdfs/NBIMSv1_p1.pdf (accessed 29 July
2012).
Nightingale, D.J. and Mize, J.H. (2002), “Development of a lean enterprise transformation
maturity model”, Information Knowledge Systems Management, Vol. 3 No. 1, pp. 15-30.
NIST (2011), 2011-2012 Criteria for Performance Excellence, National Institute of Standards and
Technology, available at: www.nist.gov/baldrige/publications/upload/2011_2012_
Business_Nonprofit_Criteria.pdf (accessed 25 July 2012).
O’Brien, R.M. (2007), “A caution regarding rules of thumb for variance inflation factors”, Quality
and Quantity, Vol. 41, pp. 673-690.
Paulk, M.C., Weber, C.V. and Curtis, B. (1995), The Capability Maturity Model: Guidelines for
Improving the Software Process, Addison-Wesley Professional, Reading, MA.
Sarshar, M., Haigh, R., Finnemore, M., Aouad, G., Barrett, P. and Baldry, D. (2000), “SPICE:
a business process diagnostics tool for construction projects”, Engineering, Construction
and Architectural Management, Vol. 7 No. 3, pp. 241-250.
SEI (1994), The Capability Maturity Model: Guidelines for Improving the Software Process,
Software Engineering Institute, Addison-Wesley Longman, Inc., Boston, MA.
SEI (2008), People Capability Maturity Model (P-CMM) Version 2.0, 2nd ed., Software
Engineering Institute, available at: www.sei.cmu.edu/library/abstracts/reports/09tr003.
cfm (accessed 15 May 2011).
Seddigh, N., Pieda, P., Matrawy, A., Nandy, B., Lambadaris, J. and Hatfield, A. (2004), “Current
trends and advances in information assurance metrics”, Proceedings of the 2nd Annual
Conference on Privacy, Security and Trust, Fredericton, NB, pp. 197-205.
Smith, D.K. and Tardif, M. (2009), Building Information Modeling: A Strategic Implementation
Guide, Wiley, Hoboken, NJ.
Statwiki (2012), Exploratory Factor Analysis, available at: http://statwiki.kolobkreations.com/
wiki/Exploratory_Factor_Analysis (accessed 7 January 2012).
Steele, P.K. and Kirkpatrick, S. (2011), Developing Competency Profiles, available at:
www.regislearning.com/html/Publications/Article_steele_kirkpatrick_competency.pdf
(accessed 28 December 2011).
Steiger, J.H. (2007), “Understanding the limitations of global fit assessment in structural equation
modeling”, Personality and Individual Differences, Vol. 42 No. 5, pp. 893-898.
Succar, B. (2010), "Building information modeling maturity matrix", in Underwood, J. and Isikdag, U. (Eds), Handbook of Research on Building Information Modeling and Construction Informatics: Concepts and Technologies, Information Science Publishing, Hershey, PA, pp. 65-103.
Suermann, P.C. (2009), "Evaluating the impact of building information modeling (BIM) on construction", PhD dissertation, University of Florida, Gainesville, FL, May.
Tabachnick, B.G. and Fidell, L.S. (2006), Using Multivariate Statistics, Allyn & Bacon, New York, NY.
Tabachnick, B.G. and Fidell, L.S. (2012), Using Multivariate Statistics, 6th ed., Pearson Education,
Upper Saddle River, NJ.
Tabish, S.Z.S. and Jha, K.N. (2012), “Success traits for a construction project”, Journal of
Construction Engineering and Management, Vol. 138, pp. 1131-1138.
TFD (2011), Graphics, The Free Dictionary, available at: www.thefreedictionary.com/graphics
(accessed 23 December 2011).
(The) US EPA (2007), Guidance for Preparing Standard Operating Procedures (SOPs), The US
Environmental Protection Agency.
Vaidyanathan, K. and Howell, G. (2007), “Construction supply chain maturity model – conceptual
framework”, Proceedings of the International Group for Lean Construction, Lansing, MI,
USA, pp. 170-180.
Wong, A.K.D., Wong, F.K.W. and Nadeem, A. (2011), “Government roles in implementing building
information modelling systems: comparison between Hong Kong and the United States”,
Construction Innovation: Information, Process, Management, Vol. 11 No. 1, pp. 61-76.
Zhou, Q., Fang, D. and Mohamed, S. (2011), “Safety climate improvement: case study in a Chinese
construction company”, Journal of Construction Engineering and Management, Vol. 137,
pp. 86-95.

Further reading
FMI and Construction Management Association of America (CMAA) (2007), The Eighth Annual
Survey of Owners, available at: http://images.autodesk.com/flashassets/company/BIM/
downloads/FMI_CMAA_report.pdf (accessed 28 June 2011).
IBM (2011), Workflow Process, available at: http://publib.boulder.ibm.com/infocenter/cmgmt/
v8r5m0/index.jsp?topic=%2Fcom.ibm.administeringcm.doc%2Fmwf00008.htm
(accessed 8 December 2011).

About the authors


Yunfeng Chen is a graduate student in the Department of Building Construction Management,
with a research focus on BIM, maturity and process models, project management, collaboration
and trust. Yunfeng Chen is the corresponding author and can be contacted at: yfchen428@gmail.com
Dr Hazar Dib is an Assistant Professor holding a joint appointment in the Department of
Computer Graphics Technology and the Department of Building Construction Management,
with a research focus on BIM, object-oriented modelling (OOM), construction delivery,
visualization and sustainability.
Dr Robert F. Cox is the Associate Dean for Globalization and Engagement in the College of
Technology and the Interim Department Head and a Professor in the Department of Building
Construction Management, with a research focus on construction management, partnering,
trust and technique application.

Appendix

Table AI. BIMM areas and their corresponding sources and descriptions

BIMM areas | Source for BIMM areas | Descriptions/definitions
1a. Software applications | Software “applications” (Succar, 2010) | “Applications are the automated user systems and manual procedures that process the information” (ITGI, 2007, p. 12)
3. Hardware equipment | Hardware “equipment” (Succar) | BIM HE refers to the physical artifacts of BIM technology (Gattol, 2012, in process)
4. Hardware upgrade | Hardware “mobility” (Succar) | HU refers to the “replacement” of hardware with a newer version to improve its characteristics (Kapur et al., 2010)
6. Information assurance | “Security and access control” (Succar) | IA is the practice to facilitate the transfer of information between parties in an accurate and secure fashion (Seddigh et al., 2004)
7. Process and technology innovation | “Innovation and renewal” (Succar) | PTI refers to process and technology improvements and innovations that would measurably improve a project’s software processes (SEI, 1994)
8. Strategic planning | “Strategic attributes” (Succar) | Strategic planning refers to how a project defines its strategies and action plans (NIST, 2011)
9. Senior leadership | “Organisational attributes” (Succar) | Senior leadership describes how a common vision is communicated and encouraged by senior leaders (NIST, 2011)
19. Reward system | “Rewards” (Succar) | Reward system refers to “procedures and standards” associated with benefits allocation to related parties (Businessdictionary, 2011)
20. Risk management | “Risks” (Succar) | “Risk management is a systematic process to manage risk by identifying, analysing, and controlling the events that may cause undesirable outcome” (ITGI, 2007, p. 68)
21. Standard operating process (SOP) | “Guidelines” (Succar) | SOP is a detailed “instruction” about all “activities” of a process (The US EPA, 2007)
22. Documentation and modeling standards | “Standards and classifications” (Succar) | Documentation and modeling standards refer to a set of rules for preparing and evaluating documents and models for technical projects
23. Quality control | “Benchmarks” (Succar) | Quality control is a process to monitor and manage the quality of “all factors” during production (Chase, 2007)
24. Specification | “Service specification”/“product specification” (Succar) | Specifications are “requirements to be satisfied by a product or service” (DOT, 2012)
25. Competency profile | “Experience”, “knowledge”, “skills”, “dynamics” (Succar) | Competency profile refers to a set of knowledge, skills and abilities (KSAs) that a position requires (Steele and Kirkpatrick, 2011)
26. Training program | “Training programmes [sic]” (Succar) | Training program refers to a set of related elements that focus on addressing a project’s training needs (SEI, 2008)
27. Training delivery method | “Research efforts, educational programmes [sic]”/“deliverables” (Succar) | Training delivery method refers to the method chosen to train participants in the use of BIM (Blanchard and Thacker, 2006)
2. Interoperability | “Interoperability/IFC support” (NIBS, 2007) | “Interoperability” is the ability of two or more systems to exchange information that can be used (IEEE, 1991)
5. Information delivery method | “Delivery method” (NIBS) | Information delivery method refers to a system or approach used by a project to bring information to users (NIBS, 2007)
10. Data richness | “Data richness” (NIBS) | “Data richness” refers to the completeness and value of data or information within a system (NIBS, 2007)
11. Real-time data | “Timeliness/response” (NIBS) | “Real-time data denotes information that is delivered” or updated “immediately after collection” or change (Encyclopedia, 2011)
12. Information accuracy | “Information accuracy” (NIBS) | IAc defines the closeness of the information received to the truth (Ivanov, 1972)
13. Graphics | “Graphical information” (NIBS) | “Graphics” are “visual” presentations on a computer screen or paper to inform or illustrate (TFD, 2011)
14. Geospatial capability | “Spatial capability” (NIBS) | Geospatial capability refers to the capability of a system to capture, store, analyse, manage, and present data with reference to geographic location (NIBS, 2007)
15. Lifecycle processes | “Life-cycle views” (NIBS) | Lifecycle processes are the processes responsible for a facility from conception through operation and maintenance (Nightingale and Mize, 2002)
17. Change management | “Change management” (NIBS) | “Change management” identifies, evaluates and implements improvement to the projects’ defined BIM processes on a continuous basis (ITGI, 2007)
18. Role | “Role” (NIBS) | “Role” refers to a unit of defined responsibilities that may be assumed by one or more parties (SEI, 1994)

Notes: a. The areas were numbered corresponding to the order of the items in the questionnaire; for the detail of the questionnaire, please refer to the appendix of the article written by Dib et al. (2012); the areas under the columns “BIMM areas” and “Source for BIMM areas” were reproduced directly from Table I of the article written by Chen et al. (2012)
Table AII. Comparison of BIMM areas of global BIM-related standards/guides/protocols/specifications with content and dimensions of this research

Study | Technology | Process | People | Information
NATSPEC National BIM Guide (NATSPEC, 2011, pp. iv-v) | “Technology Platform and Software” | “Collaboration Procedures”; “BIM Management Plan (BMP)”; “Requirements for Using BIM” | “BIM Roles and Responsibilities” | “Model Sharing”; “3D Models, Formats, and Model Structures”; “File Storage and Security”; “Requirements for 2D Drawings”; “Modelling Requirements”
AEC (UK) BIM Protocol (AEC (UK) Committee, 2012) | “Resources” (p. 43); “Interoperability” (p. 19) | “Project BIM Execution Plan”: “Project Execution Plan” (p. 12) and “Project BIM Meetings” (p. 13) | “Project BIM Execution Plan”: “Roles and Responsibilities” (p. 10) | “Collaboration BIM Working” (p. 14); “Data Segregation” (p. 21); “Modelling Methodology” (p. 25); “Folder Structure and Naming Conventions” (p. 32); “Presentation Styles” (p. 39)
Hong Kong BIM Project Specification (HKI, 2011, p. 2) | “Clash Analysis Process”; “Hardware Specifications”; “Software Specifications” | “BIM Project Objective”; “Building Information Model Specification”; “BIM Methodologies and Processes”; “Project Deliverables from BIM Process” | “BIM Management and Staff Resources” | “Model Data and Level of Detail”
Common BIM Requirements (BuildingSmart Finland, 2012) | “Software” (p. 7); “Modeling Tools” (p. 9) | “BIM Specification” (p. 10); “Publishing of Models” (p. 11) | “Role of the BIM Coordinator” (p. 11) | “BIM Accuracy” (p. 8); “The Buildings, Floor Levels and Divisions” (p. 9); “Naming and Archiving of the BIM” (p. 10); “Release of the BIM” (p. 7); “Coordinators and Units” (p. 8); “Working Models” (p. 11); “Quality Assurance of BIMs” (p. 12)
Singapore BIM Guide (BCA, 2012, p. vii) | | “BIM Modeling and Collaboration Procedures”: “Individual Discipline Modeling”, “Cross-Disciplinary Model Coordination” | “BIM Specifications”: “BIM Objective and Responsibility Matrix”, “Compensation Expectations” | “BIM Specifications”: “BIM Deliverables”, “Level of Detail and Project Stages in the Singapore BIM Guide”; “BIM Modeling and Collaboration Procedures”: “Model and Documentation Production”
