ARTICLE

Measuring BIM performance: Five metrics

Bilal Succar*, Willy Sher and Anthony Williams
School of Architecture and Built Environment, University of Newcastle, Callaghan Campus, NSW 2308, Australia

Abstract

The term Building Information Modelling (BIM) refers to an expansive knowledge domain within the design, construction and operation (DCO) industry. The voluminous possibilities attributed to BIM represent an array of challenges that can be met through a systematic research and delivery framework spawning a set of performance assessment and improvement metrics. This article identifies five complementary components specifically developed to enable such assessment: (i) BIM capability stages representing transformational milestones along the implementation continuum; (ii) BIM maturity levels representing the quality, predictability and variability within BIM stages; (iii) BIM competencies representing incremental progressions towards and improvements within BIM stages; (iv) Organizational Scales representing the diversity of markets, disciplines and company sizes; and (v) Granularity Levels enabling highly targeted yet flexible performance analyses ranging from informal self-assessment to high-detail, formal organizational audits. This article explores these complementary components and positions them as a systematic method to understand BIM performance and to enable its assessment and improvement. A flowchart of the contents of this article is provided.

Keywords – Building Information Modelling; capability and maturity models; performance assessment and improvement

A BRIEF INTRODUCTION TO BUILDING INFORMATION MODELLING (BIM)

BIM is a term that is used by different authors in many different ways (Figure 1). The nuances between their definitions highlight the rapid growth the area has experienced, as well as the potential for confusion to arise when ill-defined terminology is used to communicate specific meanings. In the context of this article, BIM refers to a set of interacting policies, processes and technologies (illustrated in Figure 2) that generate a 'methodology to manage the essential building design and project data in digital format throughout the building's life-cycle' (Penttilä, 2006). It is important to identify the knowledge structures, internal dynamics and implementation requirements of BIM if confusion and duplication of effort are to be avoided.

SOME INDICATORS OF THE PROLIFERATION OF BIM

There are many signs that the use of BIM tools and processes is reaching a tipping point in some markets (Keller, Gerjets, Scheiter, & Garsoffky, 2006; McGraw-Hill, 2009). For example, in the USA an increasing number of large institutional clients now require object-based three-dimensional (3D) models to be provided as a part of tender submissions (Ollerenshaw, Aidman, & Kidd, 1997). Furthermore, the UK Cabinet Office has recently published a construction strategy article that requires the submission of a 'fully collaborative 3D BIM (with all project and asset information, documentation and data being electronic) as a minimum by 2016' (BIS, 2011; UKCO, 2011, p. 14). Other signs include the abundance of BIM-specific software tools, books, new media tools and reports (Eppler & Platts, 2009).
*Corresponding author. E-mail: [email protected]

ARCHITECTURAL ENGINEERING AND DESIGN MANAGEMENT, 2012, VOLUME 8, 120–142
http://dx.doi.org/10.1080/17452007.2012.659506
© 2012 Taylor & Francis. ISSN: 1745-2007 (print), 1752-7589 (online). www.tandfonline.com/taem

FIGURE 1 Flowchart of the contents of this article

ISSUES ARISING FROM THE PROLIFERATION OF BIM

Notwithstanding the much-touted benefits of BIM as a means of increasing productivity, there are currently few metrics that measure such improvements. Furthermore, little guidance is available for organizations wishing to generate new or enhance their existing BIM deliverables. Those wishing to adopt BIM or identify and/or prioritize their requirements are thus left to their own devices. The implementation of any new technology is fraught with challenges and BIM is no exception. In addition, those implementing BIM frequently expect to be able to realize significant benefits and productivity gains while they are still inexperienced users. Successful implementation of these systems requires an appreciation of how BIM resources (including hardware and software, as well as the technical and management skills of staff) need to evolve in harmony with each other. The multiple and varied understandings that practitioners have of BIM further compound the difficulties they experience. When the unforeseen happens, the risks, costs and difficulties associated with implementing BIM increase. In such circumstances compromises are likely to be made, leading, in turn, to users' expectations not being met.

FIGURE 2 The interlocking fields of BIM activity

THE NEED FOR BIM PERFORMANCE METRICS

BIM use needs to be assessable if the productivity improvements that result from its implementation are to be made apparent. Without such metrics, teams and organizations are unable to consistently measure their own successes and/or failures. Performance metrics enable teams and organizations to assess their own competencies in using BIM and, potentially, to benchmark their progress against that of other practitioners. Furthermore, robust sets of BIM metrics lay the foundations for formal certification systems, which could be used by those procuring construction projects to pre-select BIM service providers.

DEVELOPING BIM METRICS AND BENCHMARKS

Although it is important to develop metrics and benchmarks for BIM performance assessment, it is equally important that these metrics are accurate and able to be adapted to different industry sectors and organizations. Considerable insight can be gained from the performance measurement tools developed for other industries, but it would be foolhardy to rely on any tool which is not designed for the specific requirements of the task in question. Those required to measure key BIM deliverables/requirements across the construction supply chain are no exception.

This article describes a set of metrics purposefully developed to measure the specifics of BIM performance. To increase their reliability, adoptability and usability for different stakeholders, the first-named author identified the following performance criteria. The metrics should be:

- Accurate: Well-defined and able to measure performance at high levels of precision.
- Applicable: Able to be utilized by all stakeholders across all phases of a project's lifecycle.
- Attainable: Achievable if defined actions are undertaken.
- Consistent: Yield the same results when conducted by different assessors.
- Cumulative: Set as logical progressions; deliverables from one act as prerequisites for another.
- Flexible: Able to be performed across markets, Organizational Scales and their subdivisions.
- Informative: Provide 'feedback for improvement' and 'guidance for next steps' (Nightingale & Mize, 2002, p. 19).
- Neutral: Do not prejudice proprietary, non-proprietary, closed, open, free or commercial solutions or schemata.
- Specific: Serve the specific requirements of the construction industry.
- Universal: Apply equally across markets and geographies.
- Usable: Intuitive and able to be easily employed to assess BIM performance.

This article describes the development of a set of BIM performance metrics based on these guiding principles. It introduces a set of complementary knowledge components that enable BIM performance assessment and facilitate its improvement.

RESEARCH DESIGN

The investigations described in this article are part of a larger PhD study which addresses the question of how to represent BIM knowledge structures and provide models that facilitate the implementation of BIM in academic and industrial settings. It is grounded in a set of paradigms, theories, concepts and experiences which combine to form the view of the BIM domain reported here.

CONCEPTUAL BACKGROUND

According to Maxwell (2005), the conceptual background underpinning a study such as this is typically based on several sources, including previous research and existing theories, the researcher's own experiential knowledge and thought experiments. Various theories, including systems theory (Ackoff, 1971; Chun, Sohn, Arling, & Granados, 2008), systems thinking (Chun et al., 2008), diffusion of innovation theory (Fox & Hietanen, 2007; Mutai, 2009; Rogers, 1995), technology acceptance models (Davis, 1989; Venkatesh & Davis, 2000) and complexity theory (Froese, 2010; Homer-Dixon, 2001), assisted in analysing the BIM domain and enriched the study's conceptual background. Constraints identified in these theories led to the development of a new theoretical framework based on an inductive approach '[more suitable for researchers who are more concerned about] the correspondence of their findings to the real world than their coherence with existing theories or laws' (Meredith, Raturi, Amoako-Gyampah, & Kaplan, 1989, p. 307).

METHODOLOGY AND VALIDATION

The five components of BIM performance measurement are some of the deliverables of the BIM framework developed after assessing numerous publicly available international guidelines (Succar, 2009). The framework itself is composed of a number of high-level concepts that interact to generate a set of guides and tools necessary to (i) facilitate BIM implementations; (ii) conduct BIM performance assessments; and (iii) generate multi-tiered educational curricula. The theoretical underpinnings of the BIM framework have been generated through a process of inductive inference (Michalski, 1987), conceptual clustering (Michalski & Stepp, 1987) and reflective learning (Van der Heijden & Eden, 1998; Walker, Bourne, & Shelley, 2008). Framework components were then represented visually through a series of 'knowledge models' to reduce topic complexity (Tergan, 2003) and facilitate knowledge transfer to others (Eppler & Burkhard, 2005).
Many of the BIM framework's components – fields, stages, lenses, steps, competencies and several visual knowledge models – have been subjected to a process of validation through a series of international focus groups employing a mixed-model approach (Tashakkori & Teddlie, 1998). The results from these focus groups and their impact on the development of the five components of BIM performance measurement will be published separately.

THE FIVE COMPONENTS OF BIM PERFORMANCE MEASUREMENT

The first-named author identified five BIM framework components as those required to enable accurate and consistent BIM performance measurement (Succar, 2010b). These include BIM capability stages, BIM maturity levels, BIM competency sets, Organizational Scales and Granularity Levels. The following sections provide brief introductions to each component. They are followed by a step-by-step workflow which allows BIM capability and maturity assessments to be conducted.

BIM CAPABILITY STAGES

BIM capability is defined here as the basic ability to perform a task or deliver a BIM service/product. BIM capability stages (or BIM stages) define the minimum BIM requirements – the major milestones that need to be reached by teams or organizations as they implement BIM technologies and concepts. Three BIM stages separate 'pre-BIM', a fixed starting point representing industry status before BIM implementation, from 'post-BIM', a variable end-point representing the continually evolving goal of employing virtually integrated design, construction and operation (viDCO) tools and concepts. (The term viDCO is used in preference to integrated project delivery (IPD) as representing the ultimate goal of implementing BIM (AIA, 2007) to prevent any confusion with the term's evolving contractual connotations within the United States.) The stages are:

- BIM stage 1: object-based modelling;
- BIM stage 2: model-based collaboration;
- BIM stage 3: network-based integration.

BIM stages are defined by their minimum requirements. For example, to be considered as having achieved BIM capability stage 1, an organization needs to have deployed an object-based modelling software tool similar to ArchiCAD, Revit, Tekla or Vico. Similarly, for BIM capability stage 2, an organization needs to be engaged in a multidisciplinary 'model-based' collaborative project. To be considered at BIM capability stage 3, an organization needs to be using a network-based solution which links to external databases and shares object-based models with at least two other disciplines – a solution similar to a model server or BIMSaaS solution (BIMserver, 2011; Onuma, 2011; Wilkinson, 2008).

Each of these three capability stages may be further subdivided into competency steps. What differentiates stages from steps is that stages are transformational or radical changes, while steps are incremental ones (Henderson & Clark, 1990; Taylor & Levitt, 2005). The collection of steps involved in working towards or within a BIM stage (i.e. across the continuum from pre-BIM to post-BIM) is driven by the different prerequisites for, challenges within and deliverables of each BIM stage.
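The stage definitions above are, in effect, a cumulative checklist: each stage presupposes the minimum requirements of the one before it. The sketch below (Python, with illustrative requirement names that are not part of the framework) shows how stage attainment can be read as a threshold test over those minimum requirements.

```python
from enum import IntEnum

class BIMStage(IntEnum):
    """BIM capability stages as transformational milestones (pre-BIM = 0)."""
    PRE_BIM = 0
    OBJECT_BASED_MODELLING = 1     # stage 1: an object-based modelling tool is deployed
    MODEL_BASED_COLLABORATION = 2  # stage 2: multidisciplinary model-based collaboration
    NETWORK_BASED_INTEGRATION = 3  # stage 3: networked model sharing with 2+ disciplines

def capability_stage(uses_modelling_tool: bool,
                     collaborates_on_models: bool,
                     shares_models_over_network: bool) -> BIMStage:
    """Return the highest stage whose minimum requirements are all met.

    Stages are cumulative: each stage presupposes the previous one.
    """
    if not uses_modelling_tool:
        return BIMStage.PRE_BIM
    if not collaborates_on_models:
        return BIMStage.OBJECT_BASED_MODELLING
    if not shares_models_over_network:
        return BIMStage.MODEL_BASED_COLLABORATION
    return BIMStage.NETWORK_BASED_INTEGRATION

# Example: an organization modelling in 3D and exchanging models with a
# partner discipline, but with no networked model server, sits at stage 2.
assert capability_stage(True, True, False) is BIMStage.MODEL_BASED_COLLABORATION
```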
In addition to their type (the competency set they belong to; refer to Section BIM competency sets), BIM steps can also be identified according to their location on the continuum shown in Figure 3:

- A steps: from pre-BIM status leading to BIM stage 1;
- B steps: from BIM stage 1 leading towards BIM stage 2;
- C steps: from BIM stage 2 leading towards BIM stage 3;
- D steps: from BIM stage 3 leading towards post-BIM.

FIGURE 3 Step sets leading to or separating BIM stages – v1.1

BIM MATURITY LEVELS

The term 'BIM maturity' refers to the quality, repeatability and degree of excellence within a BIM capability. Whereas 'capability' denotes a minimum ability (refer to Section BIM capability stages), 'maturity' denotes the extent of that ability in performing a task or delivering a BIM service/product. BIM maturity's benchmarks are performance improvement milestones (or levels) that teams and organizations aspire to or work towards. In general, the progression from lower to higher levels of maturity indicates (i) improved control resulting from fewer variations between performance targets and actual results; (ii) enhanced predictability and forecasting in reaching cost, time and performance objectives; and (iii) greater effectiveness in reaching defined goals and setting new, more ambitious ones (Lockamy III & McCormack, 2004; McCormack, Ladeira, & de Oliveira, 2008).

The concept of BIM maturity has been adopted from the Software Engineering Institute's (SEI) capability maturity model (CMM) (SEI, 2008a), a process improvement framework initially intended as a tool to evaluate the ability of government contractors to deliver software projects. CMM originated in the field of quality management (Crosby, 1979) and was later developed for the benefit of the US Department of Defence (Hutchinson & Finnemore, 1999). Its successor, the more comprehensive capability maturity model integration (CMMI) (SEI, 2006a, 2006b, 2008c), continues to be developed and extended by the SEI at Carnegie Mellon University. Several CMM variants exist for other industries (Succar, 2010a) but they are all, in essence, specialized frameworks that assist stakeholders to improve their capabilities (Jaco, 2004) and benefit from process improvements. Example benefits include increased productivity and return on investment as well as reduced costs and post-delivery defects (Hutchinson & Finnemore, 1999).

Maturity models are typically composed of multiple maturity levels: process improvement 'building blocks' or 'components' (Paulk, Weber, Garcia, Chrissis, & Bush, 1993). When the requirements of each level are satisfied, implementers can then build on established components to attempt 'higher' maturity. Although CMMs are not without their detractors (e.g. Bach, 1994; Jones, 1994; Weinberg, 1993), research conducted in other industries has already identified a correlation between improved process maturity and business performance (Lockamy III & McCormack, 2004).

The 'original' software industry CMM, however, is not applicable to the construction industry. It does not address supply chain issues, and its maturity levels do not account for the different phases of the lifecycle of a construction project (Sarshar et al., 2000). Although other efforts derived from CMM focus on the construction industry (refer to Table 1), there is no comprehensive maturity model/index that can be applied to BIM, its implementation stages, players, deliverables or its effect on project lifecycle phases.
The CMMs listed in Table 1 are similar in structure and objectives but differ in conceptual depth, industrial focus, terminology and target audience. A common theme is how CMMs employ simple experience-based classifications and benchmarks to facilitate continuous improvement within organizations. In analysing their suitability for developing a BIM-specific maturity index, most are broad in approach and can collectively form a basis for covering a range of BIM processes, technologies and policies. However, none easily accommodates the size of organizations being monitored. Also, from a terminology standpoint, there is insufficient differentiation between the notion of capability (an ability to perform a task) and that of maturity (the degrees of excellence in performing a task). This differentiation is critical when catering for staged BIM implementation as it responds to the disruptive and expansive nature of BIM. To address the aforementioned shortcomings, the BIM maturity index (BIMMI) has been developed by analysing and then integrating these and other maturity models used across different industries.

TABLE 1 Maturity models influencing the BIM maturity index

BIM proficiency matrix – The Indiana University Architect's Office. The BIM proficiency matrix is 'used to assess the proficiency of a respondent's skill at working in a BIM environment'. The matrix is 'adaptable to project needs' and intends to communicate 'owner intent regarding BIM objectives' (IU, 2009a, pp. 15 and 16). It is a static, multi-worksheet MS Excel workbook (IU, 2009b) which includes eight categories to be assessed. Upon assessment, a score ranging from one to four points is assigned against each category. Points for each category are then tallied and the total BIM maturity score is calculated. The matrix identifies five 'BIM standards' which a project can achieve, should achieve or has already achieved depending on when the matrix is deployed. The five proficiency levels (or BIM standards) are: 'working towards BIM' (the lowest standard), 'certified BIM', 'silver', 'gold' and 'ideal' (the highest BIM maturity standard). Sample representation: 'simplified matrix', an Excel worksheet from the BIM proficiency matrix (IU, 2009b).

BIM QuickScan – TNO Built Environment and Geosciences. The BIM QuickScan tool aims to 'serve as a standard BIM benchmarking instrument in the Netherlands'. The scan is intended to be performed 'in a limited time of maximum one day' (Sebastian & Van Berlo, 2010, pp. 255 and 258). The tool is organized around four chapters: organization and management; mentality and culture; information structure and information flow; and tools and applications. 'Each chapter contains a number of KPIs in the form of a multiple-choice questionnaire... With each KPI, there are a number of possible answers. For each answer, a score is assigned. Each KPI also carries a certain weighting factor. The sum of all the partial scores after considering the weighting factors represents the total score of BIM performance of an organization' (Sebastian & Van Berlo, 2010, pp. 258 and 259). KPIs are assessed against a percentile score while 'chapters', representing a collation of KPIs, are assessed against a five-level system (0 to 4). Sample representation: score representation (by category) from the sample BIM QuickScan report (TNO, 2010).

COBIT, control objects for information and related technology – Information Systems Audit and Control Association (ISACA) and the IT Governance Institute (ITGI). The main objective of COBIT is to 'enable the development of clear policy and good practice for IT control throughout organizations' (Lainhart, 2000, p. 22). The COBIT maturity model is 'an IT governance tool used to measure how well developed the management processes are with respect to internal controls. The maturity model allows an organization to grade itself from non-existent (0) to optimized (5)' (Pederiva, 2003, p. 1). COBIT includes six maturity levels (non-existent, initial/ad hoc, repeatable but intuitive, defined process, managed and measurable, and optimized), four domains and 34 control objectives (Lainhart, 2000). Note: there is some alignment between ITIL (OGC, 2009) and COBIT with respect to IT governance within organizations (Sahibudin, Sharifi, & Ayat, 2008) of value to BIM implementation efforts.

CMMI, capability maturity model integration – Software Engineering Institute/Carnegie Mellon. CMMI is a process improvement approach that helps integrate traditionally separate organizational functions, set process improvement goals and priorities, provide guidance for quality processes, and provide a point of reference for appraising current processes (SEI, 2006b, 2006c, 2008a, 2008b, 2008c). CMMI has five maturity levels (for staged representation; six capability levels for continuous representation), 16 core process areas (22 for CMMI-DEV and 24 for CMMI-SVC) and one to four goals for each process area. The five maturity levels are: initial, managed, defined, quantitatively managed and optimizing. Sample representation: NASA, Software Engineering Process Group, http://bit.ly/CMMI-NASA.

CSCMM, construction supply chain maturity model. 'Construction supply chain management (CSCM) refers to the management of information, flow, and money in the development of a construction project' (Vaidyanathan & Howell, 2007, p. 170). CSCMM has four maturity stages: ad hoc, defined, managed and controlled (Vaidyanathan & Howell, 2007).

iBIM – integrated Building Information Modelling. The iBIM maturity model – introduced in Bew, Underwood, Wix, and Storer (2008) – has been devised 'to ensure clear articulation of the standards and guidance notes, their relationship to each other and how they can be applied to projects and contracts in industry' (BIS, 2011, p. 40). The iBIM model identifies specific capability targets (not performance milestones) for the UK construction industry covering technology, standards, guides, classifications and delivery (total number of topics not defined). Targets for each topic are organized under one or more loosely defined maturity levels (0–3) (BIS, 2011).

I-CMM, interactive capability maturity model – National Institute for Building Sciences (NIBS) Facility Information Council (FIC). The I-CMM is closely coupled with the NBIMS effort (version 1, part 1) and establishes 'a tool to determine the level of maturity of an individual BIM as measured against a set of weighted criteria agreed to be desirable in a Building Information Model' (Suermann, Issa, & McCuen, 2008, p. 2; NIST, 2007; NIBS, 2007). The I-CMM has 11 'areas of interest' measured against 10 maturity levels (Suermann et al., 2008).

Knowledge retention maturity levels. Arif et al. (2009) introduced four levels of knowledge retention maturity. Knowledge management is an integral part of BIM capability and subsequent maturity. The model incorporates these levels: (i) knowledge is shared between employees; (ii) shared knowledge is documented (transferred from tacit to explicit); (iii) documented knowledge is stored; and (iv) stored knowledge is accessible and easily retrievable (Arif, Egbu, Alom, & Khalfan, 2009).

LESAT, lean enterprise self-assessment tool – Lean Aerospace Initiative (LAI) at the Massachusetts Institute of Technology (MIT). LESAT is focused on 'assessing the degree of maturity of an enterprise in its use of "lean" principles and practices to achieve the best value for the enterprise and its stakeholders' (Nightingale & Mize, 2002, p. 17). LESAT has 54 lean practices organized within three assessment sections (lean transformation/leadership, life cycle processes, and enabling infrastructure) and five maturity levels: some awareness/sporadic, general awareness/informal, systemic approach, ongoing refinement and exceptional/innovative (Nightingale & Mize, 2002).

P3M3, portfolio, programme and project management maturity model – Office of Government Commerce. The P3M3 provides 'a framework with which organizations can assess their current performance and put in place improvement plans with measurable outcomes based on industry best practice' (OGC, 2008, p. 8). The P3M3 has five maturity levels: awareness, repeatable, defined, managed and optimized (OGC, 2008).

P-CMM, people capability maturity model v2 – Software Engineering Institute/Carnegie Mellon. P-CMM is an 'organizational change model' and a 'roadmap for implementing workforce practices that continuously improve the capability of an organization's workforce' (SEI, 2008d, pp. 3 and 15). P-CMM has five maturity levels: initial, managed, defined, predictable and optimizing (SEI, 2008d).

(PM)2, project management process maturity model. The (PM)2 model 'determines and positions an organization's relative project management level with other organizations'. It also aims to integrate PM 'practices, processes, and maturity models to improve PM effectiveness in the organization' (Kwak & Ibbs, 2002, p. 150). (PM)2 has five maturity levels: initial, planned, managed at project level, managed at corporate level and continuous learning (Kwak & Ibbs, 2002).

SPICE, standardized process improvement for construction enterprises – Research Centre for the Built and Human Environment, The University of Salford. SPICE is a project which developed a framework for continuous process improvement for the construction industry. SPICE is an 'evolutionary step-wise model utilizing experience from other sectors, such as manufacturing and IT' (Hutchinson & Finnemore, 1999, p. 576; Sarshar et al., 2000). SPICE has five stages: initial/chaotic, planned & tracked, well defined, quantitatively controlled and continuously improving (Hutchinson & Finnemore, 1999).

Supply chain management process maturity model and business process orientation (BPO) maturity model. The model conceptualizes the relation between process maturity and supply chain operations based on the supply-chain operations reference model (Stephens, 2001). The model's maturity levels describe the 'progression of activities toward effective SCM and process maturity. Each level contains characteristics associated with process maturity such as predictability, capability, control, effectiveness and efficiency' (Lockamy III & McCormack, 2004, p. 275; McCormack, 2001). The five maturity levels are: ad hoc, defined, linked, integrated and extended (Lockamy III & McCormack, 2004).

Other maturity models – or variations on listed maturity models – include those on software process improvement (Hardgrave & Armstrong, 2005), IS/ICT management capability (Jaco, 2004), interoperability (Widergren, Levinson, Mater, & Drummond, 2010), project management (Crawford, 2006), competency (Gillies & Howard, 2003) and financial management (Doss, Chen, & Holland, 2008).

The BIMMI has been customized to reflect the specifics of BIM capability, implementation requirements, performance targets and quality management. It has five distinct levels: (a) initial/ad hoc, (b) defined, (c) managed, (d) integrated and (e) optimized (Figure 4). Level names were chosen to reflect the terminology used in many maturity models, to be easily understandable by DCO stakeholders and to reflect increasing BIM maturity from ad hoc to continuous improvement (Table 2).

FIGURE 4 Building Information Modelling maturity levels at BIM stage 1

TABLE 2 A non-exhaustive list of terminology used by CMMs to denote maturity levels, including those used by the BIM maturity index (levels 0 to 5, or a to e)

- BIM maturity index: (a) initial/ad hoc, (b) defined, (c) managed, (d) integrated, (e) optimized
- COBIT: (0) non-existent, initial/ad hoc, repeatable but intuitive, defined process, managed & measurable, optimized
- CMMI (staged representation): initial, managed, defined, quantitatively managed, optimizing
- CMMI (continuous representation): (0) incomplete, performed, managed, defined, quantitatively managed, optimizing
- CSCMM: ad hoc, defined, managed, controlled (four levels only)
- LESAT: some awareness/sporadic, general awareness/informal, systemic approach, ongoing refinement, exceptional/innovative
- P-CMM: initial, managed, defined, predictable, optimizing
- P3M3: awareness, repeatable, defined, managed, optimized
- (PM)2: initial, planned, managed at project level, managed at corporate level, continuous learning
- SPICE: initial/chaotic, planned & tracked, well defined, quantitatively controlled, continuously improving
- Supply chain management process maturity model: ad hoc, defined, linked, integrated, extended
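Several of the models in Table 1, notably the IU proficiency matrix and the BIM QuickScan, reduce an assessment to a single score by tallying weighted per-criterion results. The following minimal sketch illustrates that weighted-sum mechanic only; the KPI names, weights and scores are hypothetical and do not reproduce either tool's published scheme.

```python
def weighted_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Overall score = sum(KPI score x KPI weight) / sum(weights).

    Normalizing by the total weight keeps the result on the same
    scale as the individual KPI scores.
    """
    total_weight = sum(weights[k] for k in scores)
    return sum(scores[k] * weights[k] for k in scores) / total_weight

# Hypothetical KPIs scored as percentages, loosely echoing the
# QuickScan's four chapters; the weights are invented for illustration.
kpi_scores = {"organization_and_management": 60.0,
              "mentality_and_culture": 75.0,
              "information_structure": 40.0,
              "tools_and_applications": 80.0}
kpi_weights = {"organization_and_management": 2.0,
               "mentality_and_culture": 1.0,
               "information_structure": 2.0,
               "tools_and_applications": 1.0}

print(f"Overall score: {weighted_score(kpi_scores, kpi_weights):.1f}%")  # 59.2%
```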
BIM COMPETENCY SETS

A BIM competency set is a hierarchical collection of individual competencies identified for the purposes of implementing and assessing BIM. In this context, the term competency reflects a generic set of abilities suitable for implementing as well as assessing BIM capability and/or maturity. Figure 5 illustrates how the BIM framework generates BIM competency sets out of multiple fields, stages and lenses (Succar, 2009).

FIGURE 5 Structure of BIM competency sets v1.0

BIM competencies are a direct reflection of BIM requirements and deliverables and can be grouped into three sets, namely technology, process and policy:

- Technology sets in software, hardware and data/networks. For example, the availability of a BIM tool allows the migration from drafting-based to object-based workflows (a requirement of BIM stage 1).
- Process sets in resources, activities/workflows, products/services and leadership/management. For example, collaboration processes and database-sharing skills are necessary to allow model-based collaboration (BIM stage 2).
- Policy sets in benchmarks/controls, contracts/agreements and guidance/supervision. For example, alliance-based or risk-sharing contractual agreements are prerequisites for network-based integration (BIM stage 3).

Figure 6 provides a partial mind-map of BIM competency sets shown at Granularity Level 2 (for an explanation of Granularity Levels, please refer to Section BIM Granularity Levels).

FIGURE 6 BIM competency sets v3.0 – shown at Granularity Level 2

BIM ORGANIZATIONAL SCALES

To allow BIM performance assessments to respect the diversity of markets, disciplines and company sizes, an Organizational Scale (OScale) has been developed. The scale can be used to customize assessment efforts and is depicted in Table 3.

BIM GRANULARITY LEVELS

Competency sets include a large number of individual competencies grouped under numerous headings (shown in Figure 6). To enhance BIM capability and maturity assessments and to increase their flexibility, a granularity 'filter' with four Granularity Levels (GLevels) has been developed. Progression from lower to higher levels of granularity indicates an increase in (i) assessment breadth, (ii) scoring detail, (iii) formality and (iv) assessor specialization. Using higher Granularity Levels (GLevel 3 or 4) exposes more detailed competency areas than lower Granularity Levels (GLevel 1 or 2). This variability enables the preparation of several BIM performance measurement tools, ranging from low-detail, informal and self-administered assessments to high-detail, formal and specialist-led appraisals. Table 4 provides more information about the four Granularity Levels.
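The granularity 'filter' can be pictured as pruning a competency tree at a chosen depth: the deeper the cut, the more detailed the competency areas exposed. A minimal sketch, assuming a hypothetical fragment of the technology competency hierarchy (the real hierarchy is the one mapped in Figures 6 and 7):

```python
from dataclasses import dataclass, field

@dataclass
class Competency:
    """A node in the hierarchical competency tree."""
    name: str
    children: list["Competency"] = field(default_factory=list)

def areas_at_glevel(node: Competency, glevel: int, depth: int = 1) -> list[str]:
    """List the competency areas exposed at a given Granularity Level.

    GLevel 1 exposes only top-level areas; higher GLevels descend
    further into the tree, exposing progressively more detailed areas.
    """
    if depth == glevel or not node.children:
        return [node.name]
    areas: list[str] = []
    for child in node.children:
        areas.extend(areas_at_glevel(child, glevel, depth + 1))
    return areas

# Hypothetical fragment of the technology competency set:
data = Competency("data/networks", [
    Competency("data types", [Competency("structured data"),
                              Competency("unstructured data")]),
    Competency("data structures"),
])
print(areas_at_glevel(data, 1))  # ['data/networks']
print(areas_at_glevel(data, 3))  # ['structured data', 'unstructured data', 'data structures']
```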
Granularity Levels increase or decrease the number of competency areas used for performance assessment. For example, the mind-map provided in Figure 6 reveals 10 competency areas at GLevel 1 and 41 competency areas at GLevel 2. Also, at GLevels 3 and 4, the number of competency areas available for performance assessment increases dramatically, as shown in Figure 7. The partial mind-map shown in Figure 7 reveals many additional competency areas under GLevel 3, such as data types and data structures. At GLevel 4, the map reveals even more detailed competency areas, including structured and unstructured data, which in turn branch into computable and non-computable components (Fallon & Palmer, 2007; Kong et al., 2005; Mathes, 2004).

FIGURE 7 Technology competency areas at Granularity Level 4 – partial mind-map v3.0

APPLYING THE FIVE ASSESSMENT COMPONENTS

The aforementioned five complementary BIM framework components (capability stages, maturity levels, competency sets, Organizational Scales and Granularity Levels) allow performance assessments to be conducted involving combinations of these components. The guiding principles discussed in Section Developing BIM metrics and benchmarks all apply. To manage all possible configurations, a simple assessment and reporting workflow has been developed (Figure 8). The workflow shown in Figure 8 identifies the five steps needed to conduct a BIM performance assessment. Starting with an extensive pool of generic BIM competencies – applicable across DCO disciplines and organizational sizes – assessors can first filter out non-applicable competency sets, conduct a series of assessments based on the competencies remaining and then generate appropriate assessment reports.

FIGURE 8 BIM capability and maturity assessment and reporting workflow diagram – v2.0
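As a rough illustration of that five-step workflow, the sketch below strings the steps together as plain functions. The competency names, scoring scale and report format are placeholders, not part of the framework or of Figure 8.

```python
def assess_bim_performance(all_competencies, applicable, score, report):
    """Five-step workflow sketch: (1) start from the generic competency
    pool, (2) filter out competency sets that are not applicable to the
    chosen organizational scale and granularity, (3) assess each
    remaining competency, (4) collate the scores and (5) generate the
    assessment report."""
    relevant = [c for c in all_competencies if applicable(c)]  # steps 1-2
    results = {c: score(c) for c in relevant}                  # step 3
    overall = sum(results.values()) / len(results)             # step 4
    return report(results, overall)                            # step 5

# Hypothetical usage for a low-detail self-assessment:
competencies = ["software", "hardware", "data/networks", "contracts"]
summary = assess_bim_performance(
    competencies,
    applicable=lambda c: c != "contracts",  # filtered out for this scale
    score=lambda c: {"software": 3, "hardware": 2, "data/networks": 1}[c],
    report=lambda results, overall: f"{len(results)} areas, mean score {overall:.1f}",
)
print(summary)  # 3 areas, mean score 2.0
```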
TABLE 3 Organizational Scales

MACRO (markets and industries):

- Market (Macro M; symbol M): markets are the 'world of commercial activity where goods and services are bought and sold' (http://bit.ly/pjB3c).
- Defined market (Meso M; symbol Md): defined markets can be geographical, geopolitical or resultant from multi-party agreements similar to NAFTA or ASEAN.
- Sub-market (Micro M; symbol Ms): sub-markets can be local or regional.
- Industry (Macro I; symbol I): industries are the organized action of making goods and services for sale. Industries can traverse markets and may be service-, product- or project-based. The AEC industry is mostly project-based (http://bit.ly/ielY3).
- Sector (Meso I; symbol Is): a sector is a 'distinct subset of a market, society, industry, or economy whose components share similar characteristics' (http://bit.ly/15UkZD).
- Discipline (Micro I; symbol Id): disciplines are industry sectors, 'branches of knowledge, systems of rules of conduct or methods of practice' (http://bit.ly/7jT82).
- Specialty (symbol Isp): a specialty is a focus area of knowledge, expertise, production or service within a sub-discipline.

MESO (projects and their teams):

- Project team (symbol P): project teams are temporary groupings of organizations with the aim of fulfilling predefined objectives of a project – a planned endeavour, usually with a specific goal and accomplished in several steps or stages (http://bit.ly/dqMYg).

MICRO (organizations, their units, groups and members):

- Organization (Macro O; symbol O): an organization is a 'social arrangement which pursues collective goals, which controls its own performance, and which has a boundary separating it from its environment' (http://bit.ly/v7p9N).
- Organizational unit (Meso O; symbol Ou): departments and units are specialized divisions of an organization. These can be co-located or distributed geographically.
- Organizational group or team (symbol Og): organizational groups consist of individual human resources assigned to perform an activity or deliver a set of assigned objectives. Groups (also referred to as organizational teams) can be physically co-located or formed across geographical or departmental lines.
- Organizational member (Micro O; symbol Om): organizational members can be part of multiple organizational groups.

TABLE 4 BIM competency Granularity Levels v2.1

- GLevel 1, Discovery: a low-detail assessment used for basic and semi-formal discovery of BIM capability and maturity. Discovery assessments yield a basic numerical score. OScale applicability: all scales. Assessment by: self. Report type: discovery notes. Guide name: BIMC&M discovery guide.
- GLevel 2, Evaluation: a more detailed assessment of BIM capability and maturity. Evaluation assessments yield a detailed numerical score. OScale applicability: all scales. Assessment by: self and peer. Report type: evaluation sheets. Guide name: BIMC&M evaluation guide.
- GLevel 3, Certification: a highly detailed appraisal of those competency areas applicable across disciplines, markets and sectors. Certification appraisal is used for structured (staged) capability and maturity and yields a formal, named maturity level. OScale applicability: 8 and 9. Assessment by: external consultant. Report type: certificate. Guide name: BIMC&M certification guide.
- GLevel 4, Auditing: the most comprehensive appraisal type. In addition to competencies covered under certification, auditing appraises detailed competency areas including those specific to a market, discipline or a sector. Audits are highly customizable, suitable for non-structured (continuous) capability and maturity, and yield a named maturity level plus a numerical maturity score for each competency area audited. OScale applicability: 8, 9, 10 and 11. Assessment by: self, peer and external consultant. Report type: audit report. Guide name: BIMC&M auditing guide.
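Read as configuration data, Table 4 could drive the kind of assessment tooling mentioned in the concluding section. An illustrative, assumption-laden encoding of its rows (field names are mine, not the framework's):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GranularityLevel:
    """One row of Table 4: how an assessment at this GLevel is run and reported."""
    number: int
    name: str
    assessed_by: str
    report_type: str
    yields_named_maturity_level: bool  # only certification and auditing do

GLEVELS = [
    GranularityLevel(1, "Discovery", "self", "discovery notes", False),
    GranularityLevel(2, "Evaluation", "self and peer", "evaluation sheets", False),
    GranularityLevel(3, "Certification", "external consultant", "certificate", True),
    GranularityLevel(4, "Auditing", "self, peer and external consultant", "audit report", True),
]

# Example: select the levels able to support a formal certification claim.
formal = [g.name for g in GLEVELS if g.yields_named_maturity_level]
print(formal)  # ['Certification', 'Auditing']
```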
A FINAL NOTE

The five BIM framework components, briefly discussed in this article, provide a range of opportunities for DCO stakeholders to measure and improve their BIM performance. The components complement each other and enable highly targeted yet flexible performance analyses to be conducted. These range from informal self-assessments to highly detailed and formal organizational audits. Such a system of assessment can be used to standardize BIM implementation and assessment efforts, enable a structured approach to BIM education and training, and establish a solid base for a formal BIM certification process.

After scrutiny of a significant part of the BIM framework through peer-reviewed publications and a series of international focus groups, the five components and other related assessment metrics are currently being extended and field tested. Sample online tools (focusing on selected disciplines, at different granularities) are currently being formulated. All these form part of an ongoing effort to promote the establishment of an independent BIM certification body responsible for assessing and accrediting individuals, organizations and collaborative project teams. Subject to additional field testing and tool calibration, the five components may be well placed to consistently assess, and by extension improve, BIM performance.

ACKNOWLEDGEMENTS

This article draws on Bilal Succar's PhD research at the University of Newcastle, School of Architecture and Built Environment (Australia). Bilal Succar wishes to acknowledge his supervisors Willy Sher, Guillermo Aranda-Mena and Anthony Williams for their continuous support.

REFERENCES

Ackoff, R.L. (1971). Towards a system of systems concepts. Management Science, 17(11), 661–671.
AIA. (2007). Integrated project delivery: A guide. AIA California Council.
Arif, M., Egbu, C., Alom, O., & Khalfan, M.M.A. (2009). Measuring knowledge retention: A case study of a construction consultancy in the UAE. Engineering, Construction and Architectural Management, 16(1), 92–108.
Bach, J. (1994). The immaturity of the CMM. American Programmer, 7(9), 13–18.
Bew, M., Underwood, J., Wix, J., & Storer, G. (2008). Going BIM in a commercial world. Paper presented at EWork and EBusiness in Architecture, Engineering and Construction: European Conferences on Product and Process Modeling (ECCPM 2008), Sophia Antipolis, France.
BIMserver. (2011, 20 October). Open source building information modelserver. Retrieved from http://bimserver.org/
BIS. (2011). A report for the Government Construction Client Group, Building Information Modelling (BIM) working party strategy. Department for Business Innovation & Skills (BIS). Retrieved from http://www.cita.ie/images/assets/uk%20bim%20strategy%20(summary).pdf
Chun, M., Sohn, K., Arling, P., & Granados, N.F. (2008). Systems theory and knowledge management systems: The case of Pratt-Whitney Rocketdyne. Paper presented at the Proceedings of the 41st Hawaii International Conference on System Sciences, Hawaii.
Crawford, J.K. (2006). The project management maturity model. Information Systems Management, 23(4), 50–58.
Crosby, P.B. (1979). Quality is free: The art of making quality certain. New York: New American Library.
Davis, F.D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340.
Doss, D.A., Chen, I.C.L., & Holland, L.D. (2008). A proposed variation of the capability maturity model framework among financial management settings. Paper presented at the Allied Academies International Conference, Tunica.
Eppler, M., & Burkhard, R.A. (2005). Knowledge visualization. In D.G. Schwartz (Ed.), Encyclopedia of knowledge management (pp. 551–560). Covent Garden, London: Idea Group Reference.
Eppler, M.J., & Platts, K.W. (2009). Visual strategizing: The systematic use of visualization in the strategic-planning process. Long Range Planning, 42, 42–74.
Fallon, K.K., & Palmer, M.E. (2007). General buildings information handover guide: Principles, methodology and case studies. Washington, DC: US Department of Commerce.
Fox, S., & Hietanen, J. (2007). Interorganizational use of building information models: Potential for automational, informational and transformational effects. Construction Management and Economics, 25(3), 289–296.
Froese, T.M. (2010). The impact of emerging information technology on project management for construction. Automation in Construction, 19(5), 531–538.
Gillies, A., & Howard, J. (2003). Managing change in process and people: Combining a maturity model with a competency-based approach. Total Quality Management & Business Excellence, 14(7), 779–787.
Hardgrave, B.C., & Armstrong, D.J. (2005). Software process improvement: It's a journey, not a destination. Communications of the ACM, 48(11), 93–96.
Henderson, R.M., & Clark, K.B. (1990). Architectural innovation: The reconfiguration of existing product technologies and the failure of established firms. Administrative Science Quarterly, 35(1), 9–30.
Homer-Dixon, T. (2001). The ingenuity gap. Canada: Vintage.
Hutchinson, A., & Finnemore, M. (1999). Standardized process improvement for construction enterprises. Total Quality Management, 10, 576–583.
IU. (2009a). BIM design & construction requirements, follow-up seminar (PowerPoint presentation). The Indiana University Architect's Office. Retrieved from http://www.indiana.edu/~uao/IU%20BIM%20Rollout%20Presentation%209-10-2009.pdf
IU. (2009b). IU BIM proficiency matrix (multi-tab Excel workbook). The Indiana University Architect's Office. Retrieved from http://www.indiana.edu/~uao/IU%20BIM%20Proficiency%20Matrix.xls
Jaco, R. (2004). Developing an IS/ICT management capability maturity framework. Paper presented at the Proceedings of the 2004 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on IT Research in Developing Countries, Stellenbosch, Western Cape, South Africa.
Jones, C. (1994). Assessment and control of software risks. New Jersey: Prentice-Hall.
Keller, T., Gerjets, P., Scheiter, K., & Garsoffky, B. (2006). Information visualizations for knowledge acquisition: The impact of dimensionality and color coding. Computers in Human Behavior, 22(1), 43–65.
Kong, S.C.W., Li, H., Liang, Y., Hung, T., Anumba, C., & Chen, Z. (2005). Web services enhanced interoperable construction products catalogue. Automation in Construction, 14(3), 343–352.
Kwak, Y.H., & Ibbs, W.C. (2002). Project management process maturity (PM)2 model. ASCE Journal of Management in Engineering, 18(3), 150–155.
Lainhart IV, J.W. (2000). COBIT: A methodology for managing and controlling information and information technology risks and vulnerabilities. Journal of Information Systems, 14(s-1), 21–25.
Lockamy III, A., & McCormack, K. (2004). The development of a supply chain management process maturity model using the concepts of business process orientation. Supply Chain Management: An International Journal, 9(4), 272–278.
Mathes, A. (2004). Folksonomies – Cooperative classification and communication through shared metadata. Paper presented at Computer Mediated Communication, LIS590CMC (doctoral seminar), Graduate School of Library and Information Science. Retrieved from http://www.adammathes.com/academic/computer-mediatedcommunication/folksonomies.html
Maxwell, J.A. (2005). Qualitative research design: An interactive approach. Thousand Oaks, CA: Sage Publications.
McCormack, K. (2001). Supply chain maturity assessment: A roadmap for building the extended supply chain. Supply Chain Practice, 3, 4–21.
McCormack, K., Ladeira, M.B., & de Oliveira, M.P.V. (2008). Supply chain maturity and performance in Brazil. Supply Chain Management: An International Journal, 13(4), 272–282.
McGraw-Hill. (2009). The business value of BIM: Getting Building Information Modeling to the bottom line. McGraw-Hill Construction Analytics. Retrieved from http://construction.com/
Meredith, J.R., Raturi, A., Amoako-Gyampah, K., & Kaplan, B. (1989). Alternative research paradigms in operations. Journal of Operations Management, 8(4), 297–326.
Michalski, R.S. (1987). Concept learning. In S.S. Shapiro (Ed.), Encyclopedia of artificial intelligence (Vol. 1, pp. 185–194). New York: Wiley.
Michalski, R.S., & Stepp, R.E. (1987). Clustering. In S.S. Shapiro (Ed.), Encyclopedia of artificial intelligence (Vol. 1, pp. 103–111). New York: Wiley.
Mutai, A. (2009). Factors influencing the use of Building Information Modeling (BIM) within leading construction firms in the United States of America (Unpublished doctoral dissertation). Indiana State University, Terre Haute.
NIBS. (2007). BIM capability maturity model. National Institute for Building Sciences (NIBS) Facility Information Council (FIC). Retrieved October 11, 2008, from www.buildingsmartalliance.org/client/assets/files/bsa/BIM_CMM_v1.9.xls
Nightingale, D.J., & Mize, J.H. (2002). Development of a lean enterprise transformation maturity model. Information Knowledge Systems Management, 3(1), 15–30.
NIST. (2007). National Building Information Modeling Standard – Version 1.0 – Part 1: Overview, principles and methodologies. Washington, DC: US Department of Commerce.
OGC. (2008). Portfolio, programme, and project management maturity model (P3M3). England: Office of Government Commerce.
OGC. (2009, 13 February). Information Technology Infrastructure Library (ITIL) – Office of Government Commerce. Retrieved from http://www.itil-officialsite.com/home/home.asp
Ollerenshaw, A., Aidman, E., & Kidd, G. (1997). Is an illustration always worth ten thousand words? Effects of prior knowledge, learning style and multimedia illustrations on text comprehension. International Journal of Instructional Media, 24(3), 227–238.
Onuma. (2011, 20 October). Onuma model server. Retrieved from http://onuma.com/products/BimDataApi.php
Paulk, M.C., Weber, C.V., Garcia, S.M., Chrissis, M.B., & Bush, M. (1993). Key practices of the capability maturity model, version 1.1 (Technical report). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.
Pederiva, A. (2003). The COBIT maturity model in a vendor evaluation case. Information Systems Control Journal, 3, 26–29.
Penttilä, H. (2006). Describing the changes in architectural information technology to understand design complexity and free-form architectural expression. ITcon, 11 (Special Issue: The Effects of CAD on Building Form and Design Quality), 395–408.
Rogers, E.M. (1995). Diffusion of innovation. New York: Free Press.
Sahibudin, S., Sharifi, M., & Ayat, M. (2008). Combining ITIL, COBIT and ISO/IEC 27002 in order to design a comprehensive IT framework in organizations. Paper presented at Modeling & Simulation, AICMS 08, Second Asia International Conference, Kuala Lumpur, Malaysia.
Sarshar, M., Haigh, R., Finnemore, M., Aouad, G., Barrett, P., Baldry, D., & Sexton, M. (2000). SPICE: A business process diagnostics tool for construction projects. Engineering Construction & Architectural Management, 7(3), 241–250.
Sebastian, R., & Van Berlo, L. (2010). Tool for benchmarking BIM performance of design, engineering and construction firms in the Netherlands. Architectural Engineering and Design Management, 6 (Special Issue: Integrated Design and Delivery Solutions), 254–263.
SEI. (2006a). Capability Maturity Model Integration for Development (CMMI-DEV), improving processes for better products. Pittsburgh, PA: Software Engineering Institute/Carnegie Mellon.
SEI. (2006b). Capability Maturity Model Integration Standard (CMMI) appraisal method for process improvement (SCAMPI) A, version 1.2 – method definition document. Pittsburgh, PA: Software Engineering Institute/Carnegie Mellon.
SEI. (2006c). CMMI for development, improving processes for better products. Pittsburgh, PA: Software Engineering Institute/Carnegie Mellon.
SEI. (2008a, 11 October). Capability maturity model integration. Software Engineering Institute/Carnegie Mellon. Retrieved from http://www.sei.cmu.edu/cmmi/index.html
SEI. (2008b). Capability Maturity Model Integration for Services (CMMI-SVC), partner and piloting draft, v0.9c. Pittsburgh, PA: Software Engineering Institute/Carnegie Mellon.
SEI. (2008c, 24 December). CMMI for services. Retrieved from http://www.sei.cmu.edu/cmmi/models/CMMI-Services-status.html
SEI. (2008d, 11 October). People capability maturity model – version 2. Software Engineering Institute/Carnegie Mellon. Retrieved from http://www.sei.cmu.edu/cmm-p/version2/index.html
Stephens, S. (2001). Supply chain operations reference model version 5.0: A new tool to improve supply chain efficiency and achieve best practice. Information Systems Frontiers, 3(4), 471–476.
Succar, B. (2009). Building information modelling framework: A research and delivery foundation for industry stakeholders. Automation in Construction, 18(3), 357–375.
Succar, B. (2010a). Building information modelling maturity matrix. In J. Underwood & U. Isikdag (Eds.), Handbook of research on building information modelling and construction informatics: Concepts and technologies (pp. 65–103). Information Science Reference, IGI Publishing. doi:10.4018/978-1-60566-928-1.ch004
Succar, B. (2010b). The five components of BIM performance measurement. Paper presented at the CIB World Congress.
Suermann, P.C., Issa, R.R.A., & McCuen, T.L. (2008, 16–18 October). Validation of the U.S. National Building Information Modeling Standard Interactive Capability Maturity Model. Paper presented at the 12th International Conference on Computing in Civil and Building Engineering, Beijing, China.
Tashakkori, A., & Teddlie, C. (1998). Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage.
Taylor, J., & Levitt, R.E. (2005). Inter-organizational knowledge flow and innovation diffusion in project-based industries. Paper presented at the 38th International Conference on System Sciences, Hawaii, USA.
Tergan, S.O. (2003). Knowledge with computer-based mapping tools. Paper presented at the ED-Media 2003 World Conference on Educational Multimedia, Hypermedia & Telecommunication, University of Honolulu, Honolulu, HI.
TNO. (2010). BIM QuickScan – A TNO initiative (sample QuickScan report). Retrieved from http://www.bimladder.nl/wp-content/uploads/2010/01/voorbeeld-quickscan-pdf.pdf
UKCO. (2011). Government construction strategy. London: United Kingdom Cabinet Office.
Vaidyanathan, K., & Howell, G. (2007). Construction supply chain maturity model – Conceptual framework. Paper presented at the International Group for Lean Construction (IGLC-15), Michigan, USA.
Van der Heijden, K., & Eden, C. (1998). The theory and praxis of reflective learning in strategy making. In C. Eden & J.-C. Spender (Eds.), Managerial and organizational cognition: Theory, methods and research (pp. 58–75). London: Sage.
Venkatesh, V., & Davis, F.D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46(2), 186–204.
Walker, D.H.T., Bourne, L.M., & Shelley, A. (2008). Influence, stakeholder mapping and visualization. Construction Management and Economics, 26(6), 645–658.
Weinberg, G.M. (1993). Quality software management (Vol. 2): First-order measurement. New York: Dorset House Publishing.
Widergren, S., Levinson, A., Mater, J., & Drummond, R. (2010, 25–29 July). Smart grid interoperability maturity model. Paper presented at the Power and Energy Society General Meeting, 2010 IEEE, Minnesota, USA.
Wilkinson, P. (2008, 12 July). SaaS-based BIM. Extranet evolution – Construction collaboration technologies. Retrieved from http://www.extranetevolution.com/extranet_evolution/2008/04/saas-based-bim.html